So now we come to the end (well… sort of)

Here I was, all set for the SAT to take its final bow when, in a remarkable twist, it was announced that hundreds of testing centers would be closed and the January test postponed until Feb. 20th, thanks to the blizzard about to descend on the East Coast.

Given that it was 60 degrees on Christmas Day in New York City and that this is the first real snowfall of the year, I can’t help but find this to be a bizarrely coincidental turn of events. It would seem that the SAT is not about to go quietly.

That notwithstanding, tomorrow is still the last official SAT test date, and thus I feel obligated to post a few words in tribute to an exam that’s had a disproportionately large impact on my life over these last few years. (Full disclosure: I’m also posting this now because I’ve gone to the trouble of writing this post, and if I wait another month, I might get caught up in something and forget to post it.)

I’ll do my best not to get all mushy and sentimental. 

From time to time, various students used to ask me, somewhat tentatively, whether I loved the SAT. It was a reasonable question. After all, who would spend quite so much time tutoring and writing about a test they didn’t really, really like?

I can’t say, however, that I ever loved the SAT in a conventional sense. The test was something I happened to be good at more or less naturally (well, the verbal portion at least), and tutoring it was something I just happened to fall into. I didn’t start out with any particular agenda or viewpoint about the test; it was simply a necessary hurdle to be dealt with on the path to college, and as I saw it, my job was to make that hurdle as straightforward and painless as possible. To be sure, there were aspects of the test that were genuinely interesting to discuss, and don’t even get me started on the let’s-use-Harry-Potter-examples-to-define-vocabulary-fests, but as I always told my students, “You don’t have to like it — you just have to take it.”

What I will say, though, is something I’ve heard from many tutors as well as from many students (and their parents), namely that after spending a certain amount of time grappling with the SAT, picking it apart and understanding its strengths as well as its shortcomings, you develop a sort of grudging respect for the test. For a lot of students, the SAT is the first truly challenging academic obstacle they’ve faced — the first test they couldn’t ace just by reading the SparkNotes version or programming their calculator with a bunch of formulas. For the students I tutored long-term, there was almost always a moment when it finally sank in: Oh. This test is actually difficult. I’m going to have to really work if I want to improve. And usually they rose to the challenge.

But the interesting part is that what started out as no more than a nuisance, another hoop to jump through on the way to college, could sometimes turn into a real educational experience — one that left them noticeably more comfortable reading college-level material, whether or not they got all the way to where they wanted to go. And when they did improve, sometimes to levels beyond what their parents had thought them capable of, their sense of accomplishment was enormous. They had fought for those scores. Perhaps I lack imagination, but I just don’t see students having those types of experiences quite as often with the new test. 

That’s a best-case scenario, of course; I think the worst-case scenarios have been sufficiently rehashed elsewhere to make it unnecessary for me to go into all that here. But regardless of what you happen to think of the SAT, there’s a lot to be said for having the experience of wrestling with something just high enough above your level to be genuinely challenging but just close enough to be within reach. 

This test has also led me down roads I never could have foreseen. While I’ve been primarily interested in the SAT’s role as a cultural flashpoint, in the way it sits right at the crux of a whole host of social and educational issues, it’s also taught me more than I ever could have imagined about what constitutes effective teaching, about how the reading process works, and about the gap between high school and college learning. And I’ve met a lot of (mostly) great people because of it, many of whom have become not only colleagues but also friends. I never thought I’d say this, but I owe the SAT a lot. It wasn’t a perfect test, but considered within the narrow confines of what it could realistically be expected to demonstrate, it did its job pretty well.

So on that note, I’m going to say something that might sound odd: to those of you taking this last test, consider yourselves lucky. Consider yourselves lucky to have been given the opportunity to take a test that holds you to an actual standard; that gives you a snapshot of the type of vocabulary and reading that genuinely reflect what you’ll encounter in college; that isn’t designed to pander to your ego by twisting the numbers until they’re all but meaningless. 

And if you’ve been granted a reprieve for tomorrow, enjoy the snow day and catch up on your sleep. 

What is ETS’ role in the new SAT?

Update #2 (1/27/16): Based on the LinkedIn job notification I received yesterday, it seems that ETS will be responsible for overseeing essay grading on the new SAT. That’s actually a move away from Pearson, which has been grading the essays since 2005.  Not sure what to think of this. Maybe that’s the bone the College Board threw to ETS to compensate for having taken the actual test-writing away. Or maybe they’re just trying to distance themselves from Pearson. 

Update: Hardly had I published this post when I discovered recent information indicating that ETS is still playing a consulting role, along with other organizations/individuals, in the creation of the new SAT. I hope to clarify in further posts. Even so, the information below raises a number of significant questions. 

Original post: 

Thanks to Akil Bello over at Bell Curves for finally getting an answer:

[Screenshot: Twitter exchange with Aaron Lemon-Strauss, January 19, 2016]

(In case the image is too small for you to read, the College Board’s Aaron Lemon-Strauss states that “with rSAT we manage all writing/form construction in-house. use some contractors for scale, but it’s all managed here now.” You can also view the original Twitter conversation here.) 

Now, some questions:

What is the nature of the College Board’s contract with ETS? 

Who exactly is writing the actual test questions?

Who are these contractors “used for scale,” and what are their qualifications? What percentage counts as “some”?

What effect will this have on the validity of the redesigned exam? (As I learned from Stanford’s Jim Milgram, one of the original Common Core validation committee members, many of the College Board’s most experienced psychometricians have been replaced.) 

Are the education officials who have mandated the redesigned SAT in Connecticut, Michigan, Colorado, Illinois, and New York City aware that the test is no longer being written by ETS?

Why has this not been reported in the media? I cannot recall a single article, in any outlet, about the rollout of the new test that even alluded to this issue. ETS has been involved in writing the SAT since the 1940s. It is almost impossible to overstate what a radical change this is. 

For what it’s worth (how the College Board stole the state-testing market from the ACT)

For those of you who haven’t been following the College Board’s recent exploits, the company is in the process of staging a massive, national attempt to recapture market share from the ACT. Traditionally, a number of states, primarily in the Midwest and South, have required the ACT for graduation. Over the past several months, however, several states known for their longstanding relationships with the ACT have abruptly – and unexpectedly – announced that they will be dropping the ACT and mandating the redesigned SAT. The following commentary was sent to me by a West Coast educator who has been closely following these developments.  

For What It’s Worth

On December 4, 2015, a 15-member evaluation committee met in Denver, Colorado, to begin the process of awarding a five-year state testing contract to either ACT, Inc. or the College Board. After meeting three more times (December 10, 11, and 18), the evaluation committee awarded the Colorado contract to the College Board on December 21, 2015. The committee’s meetings were not open to the public, and the names of the committee members were not known until about two weeks later.

Once the committee’s decision became public, parents complained that it placed an unfair burden on juniors who had been preparing for the ACT. Over 150 school officials responded by sending a protest letter to Interim Education Commissioner Elliott Asp. The letter emphasized the problem faced by juniors and also noted that Colorado would be abandoning a test for which they had 15 years of data for a new test with no data.

Sleight of hand: an illustration of PSAT score inflation

A couple of posts back, I wrote about a recent Washington Post article in which a tutor named Ned Johnson pointed out that the College Board might be giving students an exaggeratedly rosy picture of their performance on the PSAT by creating two score percentiles: a “user” percentile based on the group of students who actually took the test; and a “national” percentile based on how the student would rank if every 11th (or 10th) grader in the United States took the test — a percentile almost guaranteed to be higher than the user percentile.
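To see why the “national” percentile almost always comes out higher, it helps to remember that students who sign up for the test skew higher-achieving than the full national cohort of 11th graders, so the same score ranks better against the broader group. Here’s a minimal sketch of that effect; the distributions and numbers are purely illustrative assumptions, not the College Board’s actual norming data:

```python
from statistics import NormalDist

# Hypothetical score distributions (illustrative only): students who opt
# to take the test tend to score higher than the full national cohort
# of 11th graders would.
actual_takers = NormalDist(mu=1050, sigma=180)   # "user" comparison group
all_juniors = NormalDist(mu=950, sigma=200)      # modeled national cohort

score = 1100
user_pct = 100 * actual_takers.cdf(score)     # percentile among actual test-takers
national_pct = 100 * all_juniors.cdf(score)   # percentile among the modeled cohort

# The same score ranks higher against the broader, lower-scoring cohort.
print(round(user_pct), round(national_pct))   # roughly 61 vs. 77
```

Under these made-up assumptions, the gap is about 16 percentile points for the same score — which is exactly why it matters which of the two numbers a score report chooses to display.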

When I read Johnson’s analysis, I assumed that both percentiles would be listed on the score report. But actually, there’s an additional layer of distortion not mentioned in the article. 

I stumbled on it quite by accident. I’d seen a PDF-form PSAT score report, and although I only recalled seeing one set of percentiles listed, I assumed that the other set must be on the report somewhere and that I simply hadn’t noticed them.

A few days ago, however, a longtime reader of this blog was kind enough to offer me access to her son’s PSAT so that I could see the actual test. Since it hasn’t been released in booklet form, the easiest way to give me access was simply to let me log in to her son’s account (it’s amazing what strangers trust me with!).

When I logged in, I did in fact see the two sets of percentiles, with the national, higher percentile of course listed first.  But then I noticed the “download report” button, and something occurred to me. The earlier PDF report I’d seen absolutely did not present the two sets of percentiles as clearly as the online report did — of that I was positive.

So I downloaded a report, and sure enough, only the national percentiles were listed. The user percentile — the ranking based on the group of students who actually took the test — was completely absent. I looked over every inch of that report, as well as the earlier report I’d seen, and I could not find the user percentile anywhere.

Unfortunately (well, fortunately for him, unfortunately for me), the student in question had scored extremely well, so the discrepancy between the two percentiles was barely noticeable. For a student with a score 200 points lower, the gap would be more pronounced. Nevertheless, I’m posting the two images here (with permission) to illustrate the difference in how the percentiles are reported on the different reports.

[Screenshots: the online score report and the downloaded PDF report, January 16, 2016]

Somehow I didn’t think the College Board would be quite so brazen in its attempt to mislead students, but apparently I underestimated how dirty they’re willing to play. Giving two percentiles is one thing, but omitting the lower one entirely from the report format that most people will actually pay attention to is really a new low. 

I’ve been hearing tutors comment that they’ve never seen so many students obtain reading scores in the 99th percentile, which apparently extends all the way down to 680/760 for the national percentile, and 700/760 for the user percentile. Well…that’s what happens when a curve is designed to inflate scores. But hey, if it makes students and their parents happy, and boosts market share, that’s all that counts, right? Shareholders must be appeased. 

Incidentally, the “college readiness” benchmark for 11th grade reading is now set at 390. 390. I confess: I tried to figure out what that corresponds to on the old test, but looking at the concordance chart gave me such a headache that I gave up. (If anyone wants to explain it to me, you’re welcome to do so.) At any rate, it’s still shockingly low — the benchmark on the old test was 550 — as well as a whopping 110 points lower than the math benchmark. There’s also an “approaching readiness” category, which further extends the wiggle room.

A few months back, before any of this had been released, I wrote that the College Board would create a curve to support the desired narrative. If the primary goal was to pave the way for a further set of reforms, then scores would fall; if the primary goal was to recapture market share, then scores would rise. I guess it’s clear now which way they decided to go. 

If you still haven’t received your PSAT scores…

Then apparently you’re not alone. 

I was under the impression that all PSAT scores had finally been released on 1/7, at least until I happened to check Mercedes Schneider’s blog. Apparently, some students are still unable to access their scores (or at least they were as of 1/9).

According to one Pennsylvania parent Schneider cites:  

As of today, more than 24 hours after the scores were supposedly released yesterday, we are still unable to see them. I have been advised by a school counselor that this is also happening to [several other] top ranked [Pennsylvania] high schools. There may be many others here and around the country, because when parents/students call the College Board to complain, the PSAT help line (866-433-7728) is so profoundly overloaded that a recorded message comes on advising to try to call again tomorrow or after going through a series of prompts hangs up or puts people into an endless cue. I just reached an agent after holding for nearly an hour. The agent looked into my teen’s account and confirmed that no October 2015 scores have been released and that access codes for …many high schools have not been delivered. She offered to “escalate” my concern to another department, which she said would respond in 5-7 business days, which is outrageous.

If this is widespread, College Board is in deep crisis. If it is just [our area in our state], then our students are being placed at a disadvantage compared to other students in terms of preparation for the March SAT. In any case, College Board has demanded that it hold the key to the futures of millions of children, but it is increasingly showing that it is unworthy of such a great trust and responsibility.

You can read the entire post here.