Using evidence is not a formal skill
In my previous post, I examined the ways in which most so-called “supporting evidence” questions on the new SAT are not really about “evidence” at all, but are actually literal comprehension questions in disguise.
So to pick up where I left off, why exactly is the College Board reworking what are primarily literal comprehension questions in such an unnecessarily complicated way?
I think there are a couple of (interrelated) reasons.
One is to create an easily quantifiable way of tracking a particular “critical thinking” skill. According to the big data model of the world, things that cannot be tagged, and thus analyzed quantitatively, do not exist. (I’m tagged, therefore I am.) According to this view, the type of open-ended analytical essays that actually require students to formulate their own theses and analyze source material are less indicative of the ability to use evidence than are multiple-choice tests. Anything holistic is suspect.
The second reason – the one I want to focus on here – is to give the illusion of sophistication and “rigor.”
Let’s start with the fact that the new SAT is essentially a Common Core capstone test, and that high school ELA Common Core Standards consist pretty much exclusively of formal skills, e.g. identifying main ideas, summarizing, comparing and contrasting; specific content knowledge is virtually absent. As Bob Shepherd puts it, “Imagine a test of biology that left out almost all world knowledge about biology and covered only biology ‘skills’ like—I don’t know—slide-staining ability.”
At the same time, though, one of the main selling points of Common Core has been that it promotes “critical thinking” and leads to the development of “higher-order thinking skills.”
The problem is that genuine “higher-order thinking” requires actual knowledge of a subject; it’s not something that can be done in a vacuum. But even though the new SAT is being touted as a “curriculum-based test,” it can’t explicitly require any sort of pre-existing factual knowledge – at least not on the verbal side. Indeed, the College Board is very clear about insisting that no particular knowledge of (mere rote) facts is needed to do well on the test. So there we have a paradox.
To give the impression of increased rigor, then, the only solution was to create an exam that tested simple skills in inordinately convoluted ways – ways that are largely detached from how people actually read and write, and that completely miss the point of how those skills are applied in the real world.
That is, not coincidentally, exactly the same criticism that is consistently directed at Common Core as a whole, as well as at all the tests associated with it (remember comedian Louis C.K.’s rant about trying to help his daughter with her homework?).
In practice, “using evidence” is not an abstract formal skill but a context-dependent one that arises out of specific knowledge of a subject. What the new SAT is testing is something subtly but significantly different: whether a given piece of information is consistent with a given claim.
But, you say, isn’t that the very definition of evidence? Well…sort of. But in the real world (or at least that branch of it not dominated by people completely uninterested in factual truth), “using evidence” isn’t simply a matter of identifying what texts say, i.e. comprehension, but rather of using information, often from a variety of sources, to support an original argument. That information must not only be consistent with the claim it is used to support, but must also be accurate.
To use evidence effectively, it is necessary to know what sources to consult and how to locate them; to be aware of the context in which those sources were produced; and to be capable of judging the validity of the information they present — all things that require a significant amount of factual knowledge.
Evidence that is consistent with a claim can also be suspect in any number of ways. It can be partially true, it can be distorted, it can be underreported, it can be exaggerated, it can be outright falsified… and so on. But there is absolutely no way to determine any of these things in the absence of contextual/background knowledge of the subject at hand.
Crucially, there is also no way to leap from practicing the formal skill of “using evidence,” as the College Board defines it, to using evidence in the real world, or at least in the way that college professors and employers will expect students/employees to use it. If you don’t know a lot about a subject, your ability to analyze – or even to fully comprehend – arguments concerning it will be limited, regardless of how much time you have spent labeling main ideas and supporting details. That is why even the most motivated students can hit the 700 wall in SAT Critical Reading, sometimes while scoring 800s in Math and Writing; there are a sufficient number of holes in their general knowledge that there’s always something they misunderstand. There is no short-term way to get around that weakness, no matter how many “main point” or “primary purpose” questions they do.
This (mis)conception of “evidence” as a strictly formal skill leads to a parody of what real-world academic inquiry actually consists of. A 16-year-old might impress her teacher by throwing around words like “discourse,” but that does not mean that her analytical abilities are in any way comparable to those of a 50-year-old tenured historian with a Ph.D., a list of peer-reviewed articles, and a couple of books under her belt — not to mention a rock-solid understanding of the chronology, major players, and running debates in her particular area of specialization, as well as the ability to sit still, take notes, and listen to her colleagues speak for long stretches at a time. Yet the College Board is effectively insisting that by superficially mimicking certain aspects of the work that actual scholars do, teenagers can leapfrog over years of hard work and magically acquire adult skills. (Ever watched a high school sophomore try to complete an exercise in “historical thinking” about the Spanish conquest of the New World when she isn’t quite sure who the Amerindians were? I have, and it’s not pretty.)
In his “Common-Sense Approach to Common Core Math” series, Barry Garelick makes this point as well:
[Students] are taught to reproduce explanations that make it appear they possess understanding—and more importantly, to make such demonstrations on the standardized tests that require them to do so. And while “drill and kill” has been held in disdain by math reformers, students are essentially “drilling understanding.”
Repeatedly going back to the text to answer “evidence” questions serves exactly the same purpose: it gives the appearance that students are performing a sophisticated skill when in fact they’re doing nothing of the sort. The underlying issue, namely that students might not actually understand what they read because of deficiencies in vocabulary and background knowledge, is conveniently sidestepped.
I suspect that the College Board’s “skills and knowledge” slogan was created in an attempt to head off this criticism. By cannily associating – indeed, conflating – those two things, the College Board implies that it knows just what this whole education thing is really about, and that the new SAT reflects…well, all that good stuff.
Let us recall, though, that Common Core standards essentially had to consist of a series of empty formal skills slapped together and pushed through as quickly as possible in order to circumvent close investigation or pushback. Factual knowledge was an afterthought. It was never dealt with because it was politically inconvenient and could easily have led to the sort of controversy that would have deterred governors from signing on to the standards.
As a result, proponents of Common Core are left to assert that the knowledge element will somehow just take care of itself. Exactly how that is supposed to happen is never explained, but rest assured, it just will. That is how you get nonsensical articles like Natalie Wexler’s New York Times piece, “How Common Core Can Help in the Battle of Skills vs. Knowledge”. As it turns out, Wexler chairs the board of trustees at an organization called Writing Revolution… an organization that David Coleman just happens to sit on the board of. That’s quite a coincidence, is it not?