Same ACT Test. Different Scoring. What Families Need to Know.

Tina Wiles · Dec 19, 2025 · 7 min read

TL;DR

• The October 2025 paper and digital ACT used the same questions

• They did not use the same scoring conversions

• Section scores can shift by 1–2 points based on format alone

• The composite score may stay the same

• Format choice and format-specific prep matter more than ACT has communicated

Editor’s note: This analysis is based on a comparison of official ACT score reports and test forms from the October 2025 administration.


You may have heard that one version of the ACT was “harder” than the other.

What most families haven’t heard is this:

The paper and digital versions of the ACT used the same questions, but they were not scored identically.

That distinction matters more than people realize, especially for students whose scores sit near important admissions or scholarship cutoffs.

In October 2025, students who took the paper ACT (Form J08) and students who took the digital ACT (Form A1H) answered the exact same questions, word for word, across English, Math, Reading, and Science.

However, the two versions used different raw-to-scaled score conversion tables.

This doesn’t mean ACT changed the difficulty of the test or tried to disadvantage students. It does mean that the testing format influenced how raw scores were converted into scaled scores, something families were never clearly told to expect when choosing between paper and digital testing.
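Conceptually, each form carries its own raw-to-scaled conversion table. The sketch below uses entirely made-up table values (these are not ACT's actual conversions) to show the mechanism: an identical raw score can land on different scaled scores depending on which form's table is applied.

```python
# Hypothetical raw-to-scaled conversion tables -- the numbers are
# invented for illustration and are NOT ACT's real tables. Each form
# is equated separately, so it gets its own table.

# raw correct -> scaled score (made-up values)
paper_reading   = {33: 30, 34: 31, 35: 32}
digital_reading = {33: 29, 34: 30, 35: 31}

raw = 34  # identical performance on identical questions
print(paper_reading[raw], digital_reading[raw])  # 31 30
```

The same 34 correct answers convert to a 31 on one table and a 30 on the other, which is exactly the kind of format-dependent shift described above.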

A Real Student Example: Same Performance, Different Scoring

Here’s a simplified version of a real student’s results.

Paper ACT (J08)

  • English: 30

  • Math: 31

  • Reading: 30

  • Science: 32

  • Composite: 30

Now take the exact same performance and apply the digital scoring scale.

Digital ACT (A1H scoring applied)

  • English: 30

  • Math: 30

  • Reading: 29

  • Science: 30

  • Composite: still 30

Nothing changed academically. Nothing changed in effort. Nothing changed in ability.

But three section scores dropped, and one dropped by two points.

If you’re thinking, “I’d feel cheated if that were my student,” you’re not alone.

Where ACT’s Messaging Falls Short (and Why Families Were Caught Off Guard)

ACT has publicly stated that paper and digital versions of the ACT measure the same skills and that scores from different formats are comparable. In ACT’s language, this means colleges can interpret a score consistently, regardless of how the test was taken.

What ACT has not clearly communicated to families is how that comparability is achieved.

ACT does not publicly explain that:

  • raw scores may be converted differently depending on test format

  • section scores may shift by one or even two points between paper and digital

  • students choosing a format should expect different scoring behavior, even with identical questions

In other words, ACT’s statements about comparability focus on how scores are used, not on how students experience the scoring process.

That distinction matters.

Families reasonably assume that “the same test” means the same questions and the same scoring mechanics. October 2025 showed that this assumption isn’t always valid, particularly for score ranges associated with admissions and scholarships.

Why This Feels Unfair to Students

From a student’s perspective, this situation is tough to swallow.

They didn’t choose different content. They didn’t prepare differently. They didn’t underperform.

Yet their section scores changed simply because the test was delivered in a different format.

Even when the composite score remains the same, section score drops can affect:

  • confidence

  • superscore strategies

  • scholarship thresholds

  • how students interpret their own ability

When students feel like they chose the “wrong” version of the same test, that’s not a failure of effort — it’s a failure of transparency.

What This Means for You (and What You Can Do About It)

If you’re reading this and feeling frustrated, you’re not wrong. But the goal here isn’t to dwell on what already happened — it’s to help families make better, more informed decisions going forward.

Here’s what October 2025 taught us.

1️⃣ Format choice matters more than ACT has communicated

Paper and digital ACTs may use the same questions, but they are not interchangeable experiences for every student.

Some students:

  • Read more accurately on paper

  • Pace better without scrolling

  • Rely heavily on annotation

Others:

  • Move faster digitally

  • Benefit from easier answer changes

  • Struggle less with bubbling errors

Neither format is “better.” The right format is the one that aligns with how your student thinks, reads, and manages time.

2️⃣ Section scores deserve context — not panic

A lower section score on a digital test does not automatically mean:

  • your student regressed

  • preparation failed

  • they “did worse than expected”

In many cases, it reflects:

  • scoring compression

  • format-based equating

  • how close the student was to a score cutoff

This is especially true in Reading and Science, where a single question can move a score by one or two points.
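"Scoring compression" can be sketched with a made-up conversion fragment (again, not ACT's real table): near the top of the scale, some scaled scores have no corresponding raw score, so a single missed question can cost two scaled points.

```python
# Hypothetical Science conversion fragment (invented values) showing
# compression near the top of the scale.
science = {40: 36, 39: 34, 38: 33, 37: 32}  # raw correct -> scaled

# Missing one question drops the scaled score from 36 to 34.
print(science[40] - science[39])  # 2
```

This is why a one-question difference in Reading or Science can look like a much bigger change on the score report than it really is.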

Before reacting to section scores, ask:

  • Was the test paper or digital?

  • Is this section historically sensitive to format?

  • Is the composite telling a different story?

3️⃣ If your student plans to retest, be intentional about format

If a student is considering a retake, this is the most important takeaway:

Do not treat paper vs. digital as a neutral choice.

Instead, ask:

  • Which format has this student practiced most?

  • Which format reduces fatigue and anxiety?

  • Which format allows them to show what they actually know?

Retesting in a better-fit format can be as important as additional content review.

4️⃣ Preparation should match the testing medium

One of the biggest mistakes families make is preparing in one format and testing in another.

Effective prep should include:

  • full-length practice in the same format

  • pacing strategies specific to paper or digital

  • navigation strategies (especially for Reading and Science)

  • stress-management tools tailored to the medium

Format fluency is now part of test readiness.

5️⃣ The composite matters, but sections still tell a story

ACT prioritizes composite score comparability, and colleges largely do too. But section scores still:

  • influence superscoring strategies

  • affect confidence

  • shape a student’s narrative about their abilities

Understanding how format influences sections helps families:

  • interpret results accurately

  • avoid unnecessary self-blame

  • make smarter next-step decisions

The Bottom Line

October 2025 showed us something important:

Standardized tests don’t just measure knowledge. They measure performance under specific conditions.

The ACT isn’t a broken test, and students didn’t fail or choose “wrong.” What matters now is recognizing that format is part of the testing equation.

When families understand how paper and digital testing can affect performance and scoring, they can make choices that better reflect a student’s strengths.

Transparency leads to better decisions — and better outcomes.

We’ll continue sharing updates as ACT evolves.

As testing formats change, so do the strategies students need to succeed. Staying informed is the first step toward fair preparation and confident performance.

Tina Wiles is an educator and test prep specialist with over 20 years of experience working with ACT and SAT assessments. As the founder of My2tor and ACTSATPrepHQ, she helps students and families understand not just what to study, but how testing format, scoring, and mindset influence outcomes. Her work focuses on transparency, strategy, and helping students perform at their best under real testing conditions.

ACT · ACT Scoring · Standardized Testing