22 March 2016

An analysis of problems with PSAT scores, courtesy of Compass Education

Apparently I’m not the only one who has noticed something very odd about PSAT score reports. California-based Compass Education has produced a report analyzing some of the inconsistencies in this year’s scores.

The report raises more questions than it answers, but the findings themselves are very interesting. For anyone who has the time and the inclination, it’s well worth reading.

Some of the highlights include:

  • Test-takers are compared to students who didn’t even take the test and may never take the test.
  • In calculating percentiles, the College Board relied on an undisclosed sampling method when it could have relied on scores from students who actually took the exam.
  • 3% of students scored in the 99th percentile.
  • In some parts of the scale, percentile ranks rose by as much as 10 points between 2014 and 2015.
  • More sophomores than juniors obtained top scores.
  • Reading/writing benchmarks for both sophomores and juniors have been lowered by over 100 points; at the same time, because the wrong-answer penalty was eliminated, a student could approach the benchmark by guessing randomly on every question.
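The "3% in the 99th percentile" finding is striking because, under the usual empirical definition of percentile rank (the percent of test-takers scoring strictly below you), no more than about 1% of actual test-takers can land in the 99th percentile. A minimal sketch illustrates this, using randomly generated hypothetical scores rather than any real College Board data or method:

```python
import random
from bisect import bisect_left

# Hypothetical scores on the new 320-1520 PSAT scale; purely illustrative,
# not drawn from College Board data.
random.seed(0)
scores = [random.randint(320, 1520) for _ in range(100_000)]
ranked = sorted(scores)

def percentile_rank(score, ranked_scores):
    """Percent of test-takers scoring strictly below this score."""
    below = bisect_left(ranked_scores, score)
    return 100 * below / len(ranked_scores)

# Count test-takers whose percentile rank is at least 99.
in_99th = sum(1 for s in scores if percentile_rank(s, ranked) >= 99)
share = 100 * in_99th / len(scores)
print(f"{share:.2f}% of this sample sits in the 99th percentile")
```

When percentiles are computed from the test-takers themselves, the share at or above the 99th percentile is capped at 1% (ties at the cutoff can only push it lower). A figure like 3% is only possible if percentiles are benchmarked against some other population, such as the undisclosed "national representative sample" of students who never sat for the exam.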