A note to the College Board

Dear College Board:

I understand that you are very busy helping students prepare for college and career readiness success in the 21st century; however, as I was perusing (excuse me, looking at) the section of your website devoted to describing the essay contest run jointly by your organization and The Atlantic magazine, I couldn’t help but notice a sentence that read as follows:

“To be successful at analytical writing, students must support your arguments with evidence found in the text and clearly convey information to the reader.”

As the writers of your website copy presumably know, the correct use of parallel structure and pronoun agreement is an important component of analytical writing — the exact type of writing that employees use authentically in their actual careers. 

Moreover, given that your organization is responsible for testing over 1.5 million students on these exact concepts annually, I assume that the appearance of this type of faulty construction is simply the result of an oversight rather than any sort of indication that College Board writers lack the skills and knowledge necessary for success in the 21st century — that is, the skills and knowledge that matter most.

As you update your website to reflect the upcoming changes to the SAT, however, do try to remember that carefully editing your work is also an important skill for college and career readiness. After all, you wouldn’t want to set a poor example. 

Best,
Erica

A possible explanation for The Atlantic’s article about the new SAT

Ever since I encountered Emmanuel Felton’s article “How the Common Core is Transforming the SAT” a couple of days ago and wrote my ensuing diatribe, I’ve been trying to figure out just why The Atlantic in particular would publish information so blatantly false. Sure, there have been plenty of articles regurgitating the standard hype about the new test, in pretty much every major media outlet, but this one crossed a line.

To be perfectly fair, Felton doesn’t actually state flat-out that analogies are still included on the test, but with lines such as “On the reading side, gone are analogies like ‘equanimity is to harried’ as ‘moderation is to dissolute,’” the implication is so strong that it’s pretty much impossible for casual readers not to draw that conclusion.

Then, halfway through my run this morning, I had a “duh” moment. I had somehow forgotten that The Atlantic had partnered with the College Board to run an annual “analytical writing” contest for high school students.

In fact, James Bennett, the president and editor-in-chief of The Atlantic, even appears in this College Board video on analytical writing for the new APUSH exam. That exam is Coleman’s baby.

Coincidence? I think not.

Mystery solved.

Why is the Atlantic publishing false information about the SAT?

From Emmanuel Felton’s Atlantic article, “How the New SAT is Taking Cues from Common Core”:

While other standardized tests have also been criticized for rewarding the students who’ve mastered the idiosyncrasies of the test over those who have the best command of the underlying substance, the SAT—with its arcane analogy questions and somewhat counterintuitive scoring practices—often received special scorn.

And this:

On the reading side, gone are analogies like “equanimity is to harried” as “moderation is to dissolute”…Eliminating “SAT words” isn’t the only change to the new reading and writing section, which will require a lot more reading…The passages themselves are changing, as The College Board tries to have them represent a range of topics from across the disciplines of social studies, science, and history. 

Emmanuel Felton is entitled to his own opinion about the SAT; he is not entitled to his own facts.

The SAT eliminated analogy questions in 2005 — that was 10 full years ago, in case you didn’t care to do the math. Yet his article very directly implies that these questions are still part of the exam.

Felton also does not acknowledge that the SAT already includes passages drawn from fiction, social science, science, and history, on every single test. The fact that the passages are not explicitly labeled as such, as they are on the ACT, does not mean that they are drawn randomly.

These are exceedingly basic facts, which presumably could have been checked with five seconds of internet research and a quick glance through the Official Guide.

Does the Atlantic not employ fact-checkers? Or does it simply not care about facts?

Furthermore, the small print at the bottom of Felton’s article indicates that it was written “in collaboration with the Hechinger Report.” On its website, The Hechinger Report describes itself as “… an independent nonprofit, nonpartisan organization based at Teachers College, Columbia University. We rely on support from foundations and individual donors to carry out our work.” (Unsurprisingly, the Gates Foundation is listed among those donors.)

Why on earth is a publication produced by an Ivy League university allowing this type of blatant misinformation to be disseminated?

If you are going to take potshots at the SAT in a major national magazine, fine; people have been doing that for decades. At the very least, though, those criticisms should be anchored in some sort of reality.

Even by the very questionable standards of general reporting about the new SAT, this is sloppy, lazy work.

Update #2: who is writing the new SAT?

A month or so ago, when I first became aware of the questions surrounding ETS’s involvement (or lack thereof) in the new SAT, I wrote to several people who had been vocal in criticizing Common Core and the slapdash manner in which it was thrown together by Coleman et al. One of the people I contacted was Jim Milgram, whose response I cited in an earlier post; the other was his colleague Sandra Stotsky, the other member of the validation committee who refused to sign off on the Standards.

Unfortunately, neither of them was able to offer any insight into the authorship of the SAT; however, Sandra did suggest that I write to Valerie Strauss at the Washington Post and alert her to the College Board’s deliberate and persistent evasiveness regarding that question. (The Post’s Education section, unlike that of the Los Angeles Times, hasn’t been bought out by one of the billionaires funding the reform movement… at least not yet.) Valerie promptly responded to let me know that she found the issue “fascinating” and would do some investigation of her own.

So stay tuned. If enough people start asking questions, perhaps the powers that be at the College Board will finally be forced into providing some answers — no doubt heaping on scads of reformster gibberish in an attempt to evade the issue at every step. (Transparency? What transparency?) But at least that would be a step in the right direction.

Update: who is writing the new SAT? (don’t get too excited)

After listening to me natter on about who was writing the new SAT, a tutor friend of mine decided to take matters into her own hands and email the College Board.

Just as I had predicted, the response (posted below) consisted of a non-answer packed full of every favorite reformer platitude imaginable (College and career readiness! Measuring skills and knowledge! Evidence for validity! Ummm… it hasn’t been administered yet. Exactly how does one go about acquiring evidence for the validity of something before it’s occurred?).

This was my friend’s original inquiry: 

Hi,

I’m just wondering if the new redesigned SAT and PSAT were written/designed by ETS.

Thanks!

And this is the response she received from the College Board:

Subject
Inquiry for SAT

Discussion Thread
Response Via Email (Vivian Agent ID 192285) 10/22/2015 02:27 PM
Thank you for contacting the College Board.

We have received your email in reference to the redesigned SAT and PSAT exams. We will be more than happy to assist you and provide you with some information.

To establish a strong foundation of evidence for validity, the new test design is based on a growing body of national and international research on the skills and knowledge needed for college and career readiness and success. Great care goes into developing and evaluating every question that appears on the SAT and PSAT exams. College Board test development committees, made up of experienced educators and subject-matter experts, advice [sic] on the test specifications and the types of questions that are asked. Before appearing in a test form that will count toward a student’s score, every potential SAT and PSAT question is:

• Reviewed by external subject-matter experts, such as math or English educators, to make sure it reflects the knowledge and skills that are part of a rigorous high school curriculum.
• Subjected to an independent fairness review process.
• Pretested on a diverse sample of students under live testing conditions for analysis by subgroups.

Meticulous care goes into developing and evaluating each test for fairness. Classroom teachers, higher education faculty who teach freshman courses, test developers, and other trained content experts write the test questions for the SAT and PSAT exams. Test developers, trained content experts, and members of subject-based development committees write the test questions for the SAT Subject Tests.

Test development committees, made up of racially/ethnically diverse high school and college educators from across the country, review each test item and test form before it is administered. To ensure that the SAT, SAT Subject Tests, and PSAT are valid measures of the skills and knowledge specified for the tests, as well as fair to all students, the SAT Program maintains rigorous standards for administering and scoring the tests.

Careful and thorough procedures are involved in creating the test. Educators monitor the test development practices and policies and scrupulously review each new question to ensure its utility and fairness. Each test question is pretested before use in an actual SAT, SAT Subject Test, and PSAT exams. Not until this rigorous process is completed are newly developed questions finally used in the administrations.

For further information or assistance, please feel free to call us at 1 (866) 756-7346 (Domestic), 001 (212) 713-7789 (International), Monday through Friday, from 8:00 a.m. to 9:00 p.m. (Eastern Time) or visit us at www.collegeboard.org.

Thank You,

Vivian
Agent ID #192285
The College Board Service Center

The only part that took me even slightly aback was the length; I guess it’s hard to say nothing concisely. At any rate, it’s a masterpiece of obfuscation — “obfuscation” being a highly relevant word that means “deliberately making something unclear in order to avoid awkward or unpleasant facts.”

So while it’s reassuring to know that the writing committees are made up of “racially and ethnically diverse high school and college educators” (does that include current classroom teachers, or just administrators?), it would also be nice to have a straight answer regarding whether ETS is still involved, and if so, in what capacity.

As Larry Krieger put it:

Why not just be succinct and say: “Yes, the ETS is responsible for the new SATs.” Or, “No a new team at Pierson [sic] is responsible for the new SATs.”

Reading between the lines, I would assume the answer is no but that the College Board has imposed some sort of prohibition against admitting as much directly — presumably because they don’t want to stir things up, and because that type of admission could lead to awkward questions about things like validity. Proclaiming that the new test is “relevant” has gotten the College Board pretty far, but then again the same thing happened with Common Core before people actually understood anything about it. The other tests linked to Common Core have already come under plenty of fire; the last thing the College Board needs is to be linked to that kind of controversy. 

It could just be me, but something doesn’t seem right here. 

 

Who is writing the new SAT?

A while back, in the course of my discussion about how to choose between the current SAT, the new SAT, and the ACT, I mentioned in passing that the SAT would no longer be written by ETS. Larry Krieger (of Direct Hits and APUSH Crash Course fame) posted a comment expressing his surprise and asking what my source was for that information.

I responded somewhat sheepishly to Larry that I didn’t actually remember — I had been given the information so long ago that I no longer recalled who had told it to me. Until Larry asked, I had simply assumed it was common knowledge, or at least somewhat common knowledge.

As I pointed out in my previous post, the tests released thus far seem sloppier and less consistent than what I’ve come to expect from ETS (more about that another time). Even when correct answers were justified, they seemed to lack the precision I’ve come to associate with the SAT. Granted, that could be because the details of the test are still being worked out, but these questions simply didn’t have an ETS feel. Having spent countless hours analyzing SAT questions in order to mimic them as effectively as possible, I think my instincts are pretty reliable.

Larry’s response to me, however, was as follows:

I can tell you that the CB has a long standing contract with the ETS. In fact they have a person who earns a six figure salary whose job is to monitor the contract. Given the lack of an authoritative source I believe it is likely that the ETS is writing the new test. I do agree that the questions are below the usual ETS standards. 

I was willing to concede that I — or, rather, my source — had been mistaken, but now my curiosity was piqued. I didn’t want to be responsible for disseminating misinformation, but something about those questions seemed “off.”

I did some googling, which turned up absolutely nothing.

I also tried calling the College Board, where, after several surprised silences, various representatives quickly passed me off to other representatives, who eventually left me right back at the original menu. I considered making further attempts; however, after realizing that any response I did manage to elicit would inevitably consist of a non-answer involving edu-babble about “best practices,” “evidence-based standards,” “21st century skills,” and “preparation for college and career readiness,” I decided I was better off pursuing other avenues.

I got back in touch with the blog-reader who had worked for ETS several decades ago and had written to me to express her surprise at the clearly lowered standards. She mentioned that she’d heard David Coleman had “cleaned house” and had fired many experienced ETS writers, and that some of the sloppiness could be attributed to that.

When I thought about it, though, that didn’t make sense. My understanding has always been that the College Board and ETS are separate entities; ETS has traditionally contracted to write the SAT. Why would the head of the College Board have the discretion to fire ETS employees?

I posted on the LinkedIn SAT prep teachers forum, where one tutor (and former ETS writer) with ETS contacts reported that she couldn’t get a straight answer out of anyone affiliated with that organization.

While trying to find more information online, I stumbled across “The Revenge of K-12,” by Richard Phelps and R. James Milgram, which confirmed that the “house cleaning” did in fact occur. For those of you who haven’t been following my blog, Jim Milgram is an emeritus professor of math at Stanford and one of the two members of the Common Core validation committee who refused to sign off on the Standards; he has since co-authored a number of papers presenting in-depth critiques of the process by which they were created and implemented.

As Milgram and Phelps write:

Prior to Coleman’s arrival, competent and experienced testing experts suffused the College Board’s staff. But, rather than rely on them, Coleman appointed Cyndie Schmeiser, previously president of rival ACT’s education division, as College Board’s Director of Assessments. Schmeiser brought along her own non-psychometric advisors to supervise the College Board’s psychometric staff. While an executive at ACT, Schmeiser aided Coleman’s early standards-production effort from 2008–2010 by loaning him full-time ACT standards writers. (It should be no surprise, then, that many of the “college readiness” measures and conventions for CCS-aligned tests sound exactly like ACT’s.)

I was aware that Coleman had brought in a number of people from the ACT, but prior to reading the article, I had not realized that Coleman had brought in new, less qualified people to advise the psychometricians themselves. (This is hardly a surprise, though – as Coleman has indicated in the past, he’s not particularly interested in whether his hires are qualified.)

I emailed Jim Milgram, who told me that unfortunately he had no more information regarding who was writing the actual test than what I had turned up.

Then, a couple of days later, I happened to find myself at the house of a friend whose son is a junior. My friend wasn’t sure whether I’d seen this year’s practice PSAT booklet, so she made a point of showing it to me. I’d already seen the test, but as I looked over all the fine print, I realized I should check it for references to ETS. I couldn’t find any, which meant nothing in itself, but then something else occurred to me – perhaps I could compare this year’s booklet to previous years’ booklets and see whether those booklets contained references to ETS.

Sure enough, at the bottom of the back page, previous years’ PSAT booklets contained a standard disclaimer stating that the views presented in the passages were not intended to reflect those of the College Board, the National Merit Scholarship Corporation, or ETS.

This year’s booklet did not mention ETS.

[Screenshots: the fine print from the PSAT booklets]

Furthermore, references to the SAT on ETS’ website link back only to the current version of the test; aside from one mention of an ETS testing code, I could find no reference to ETS on any of the material intended for post-January 2016 use.

I realize that this in no way represents conclusive proof, but it does suggest that ETS is playing a less prominent role in the writing of the new test than it did in the old.

If it is in fact true that ETS is no longer writing the SAT, it would mark the end of a nearly 70-year relationship and create an even more radical break with the current exam than what has already been publicized.

SAT questions have always been written by a motley group – professional test-writers, teachers, even students – but they have also undergone an extensive, rigorous field-testing process closely supervised by people qualified to supervise it.

So if ETS is no longer responsible for writing the test (or assembling groups to write the test) and overseeing the test-writing process, then who is?

A group of test-writers handpicked/led by David Coleman? (We know how well his attempt at national standards-writing has been received.)

Former ACT writers and remaining College Board employees deemed sufficiently loyal to the Coleman regime?

Khan Academy employees?

Or, dare I say it… might Pearson be somehow involved? (I really did think that was a stretch until I saw Dipti Desai’s graphic; I emailed her to ask what information she had about the connection, but I still haven’t heard back.)

Based on Milgram and Phelps’s report, it would certainly seem that regardless of who is writing the actual questions, the people supervising the process have far less expertise than those who did so in the past.

Furthermore, with the experimental section gone, there will no longer be a way for new questions to be tested out nationally on an actual group of test-takers. That absence of an experimental section is, I imagine, a significant part of the reason ACT scales can be so unpredictable; the questions just aren’t vetted as rigorously.

Say what you want about the SAT, but it is nothing if not consistent. It’s actually quite remarkable to watch a student take a test administered in 2007 and one administered in 2014 and get exactly the same score. In the absence of an experimental section, it’s hard to see how that kind of consistency will be retained.

In reality, though, the new test is not really the SAT at all. Call it a modified (ripped-off) ACT, a Common Core capstone exam, or just a grab for lost market share. The name is simply being retained because altering it would risk calling attention to the extremity of the changes and create too much potential for backlash.

Likewise, the return to the 1600 scoring scale is a carefully calculated distraction, designed to make adults think that the test will be closer to what it was when they were in high school and thus not bother to investigate further.

Somehow, though, I wouldn’t be surprised if the same problems that have plagued other Common Core-aligned tests (PARCC, SBAC) start cropping up with the new SAT. Witness, for example, this discussion on College Confidential about determining cutoffs for National Merit. I don’t recall so many pages of a thread ever being devoted to something so basic in the past. Elegance and transparency… right. And I’m guessing that this is only a warm-up for what’s to come.