The College Board has announced that beginning January 1, 2017, students who receive accommodations in school will automatically receive equivalent accommodations for all College Board exams (PSAT, SAT, SAT II, AP).
According to the Washington Post:
Early this year, as more states began to adopt the SAT or the ACT as a required test for high school students to take, the Justice Department’s Civil Rights Division began to look into complaints that the testing organizations were too stingy with accommodations to eligible students, Education Week reported.
In a new statement, David Coleman, president and chief executive of the College Board, said: “Educators, students, and families have asked us to simplify our process, and we’ve listened. The school staff knows their students best, and we want to cut down on the time and paperwork needed to submit a testing accommodations request.”
While this will obviously eliminate a major headache for many families, it probably isn’t overly cynical to assume that the College Board is making this move out of self-interest as well.
Which of course brings us to the ACT.
Now, the ACT is notoriously stingy about accommodations. I have had multiple students who were turned down after the initial request. Some appealed successfully; others did not. Even the successful appeals sometimes took months, and the extra time was only granted after a student had taken the ACT multiple times.
In contrast, most of my students who required extra time were able to obtain it for the SAT easily, the first time the request was made.
So while acquiring accommodations will undoubtedly become easier, it is questionable just how much of a broad-scale difference the change will make. It could very well make a huge difference, but I’d be hesitant to assume as much.
Given that the College Board is determined to wrest every inch of market share possible back from the ACT, it seems reasonable to assume that this announcement is in part a ploy to induce students who are on the fence between the SAT and the ACT, and who require accommodations, to opt for the former.
So the question, now, is whether the ACT will come on board and relax restrictions on extra time as well.
In addition, the effect of the change on student equity is questionable as well. In my (anecdotal, not statistically backed up) experience, the students who receive accommodations in school tend to be those with the savviest, most persistent parents. Not coincidentally, those parents tend to be well-off and well-educated.
Some of their children have genuine learning disabilities — and I in no way intend to minimize the struggles of anyone in that category. For them, the College Board’s new policy will undoubtedly be a boon.
That said, there is another category of students who do not truly have learning disabilities, but who have been enabled (by technology, by ineffective pedagogy, by an incoherent curriculum, by parents, by tutors, and even by therapists) to the point where their ability to complete work on their own, under standard conditions, is severely compromised.
In some cases, mild to moderate difficulties that nevertheless fall within the range of normal are overblown and pathologized by well-meaning adults, with the result that the line between a learning disability and the belief in a learning disability becomes blurred.
It’s not that the student doesn’t genuinely struggle. It’s that the student, given a different set of pedagogical approaches and adult attitudes, would not struggle in the same way, or even at all.
I have witnessed this phenomenon many times, and it disturbs me more than I can say. Not only will the College Board’s new policy do nothing to deter it, but it will most likely encourage it further.
Conversely, the children of less-educated parents, or those who lack knowledge of how the system functions as well as how to work it effectively (and the time to work it), are less likely to receive accommodations in the first place. As a result, they are no more likely to receive accommodations under the new system than they were under the old.
As a tutor, I observed a striking phenomenon: despite the pressure to boost students' confidence levels, the amount of confidence my students exhibited often bore an inverse relationship to their knowledge.
My highest scorers were moderately confident but also very aware of their weaknesses, whereas my persistently low scorers tended to overestimate their abilities, sometimes dramatically so. (True story: the only student who ever told me he was going to answer every question right on the SAT was scoring in the high 300s-400s.)
As for students who started off lower and raised their scores significantly, they almost always experienced a watershed moment in which they realized that the test was actually hard and that they were going to have to put more in to get the results they wanted. As their knowledge increased and they were able to more effectively self-assess – that is, to more accurately recognize what they didn’t know – their confidence was shaken. But notably, their performance continued to improve.
It turns out that all this is actually an established phenomenon known as the Dunning-Kruger effect; and as I’ve come to realize, it applies to teaching as well. Regardless of how well novice teachers know their subject, they don’t know what they don’t know about teaching.
Novice teachers, for example, do not know what stumbling blocks their students will likely encounter. This is a particularly acute problem for people who are naturally good at a subject. Having never struggled themselves, they often do not realize how much knowledge they take for granted. Indeed, the Dunning-Kruger effect also accounts for the tendency of knowledgeable people to assume that tasks they find easy are also easy for other people.
Because most new teachers cannot anticipate where the difficulties will most probably arise, they cannot take steps to address those potential pitfalls as they teach. As a result, they may inadvertently confuse their students, or end up having to spend time backtracking to clear things up. In some cases, they might not even become aware of the misunderstandings until much later, if at all, because they consistently overestimate the amount of knowledge their students possess. If something seems evident to them, why wouldn’t it seem obvious to their students as well?
I’ve come to hesitate about using the word “efficiency” because I think it has a somewhat dangerous connotation in today’s educational climate – behaviorism and canned, scripted lessons, trained pigeons, and the like – but I nevertheless think there is something to be said for it.
Some basic things can genuinely be taught quickly and easily, and when that is the case, there is absolutely no reason to waste students’ time and energy overcomplicating them. The most experienced (good) teachers know how to get the point across by making things seem simple and intuitive; they’re secure enough that they don’t need to show off by making concepts seem more sophisticated than they actually are.
However, when teachers are pressured to turn everything into a “high level critical thinking skill” – and to continually demonstrate to administrators that they are doing so – the result is that simple, straightforward concepts are presented tortuously, leaving students confused about the basics and unable to apply more genuinely sophisticated ideas in anything resembling a competent manner. (See: Common Core.)
I think these issues also hearken back to the false dichotomy between “rote learning” and “critical thinking.” As I’ve written about before, I think it’s more apt to think of these concepts as part of a spectrum. Despite all the rhetoric, I suspect that there are exceedingly few – if any – classrooms anywhere in the United States where students are simply required to memorize names and facts and formulas and dates without any consideration of their larger context.
The real question is not whether concepts are investigated in any depth, but rather what quality of depth they are investigated in, whether that type of depth is appropriate, and how effectively new information is linked to the rest of the curriculum.
Ideally, teachers should understand not only how what they are teaching builds on what students have done before, but also how it builds a further foundation for what students will be doing a year or two down the line. But in order to accomplish that, teachers must have a solid understanding of their subject as a whole, not just their own little piece of it.
In addition, “high-level critical thinking” is not always the best goal initially; sometimes shallow thinking has to occur first. But it really depends. So much of teaching new information and concepts involves negotiating and re-negotiating just how much depth is appropriate for a student, or group of students, at a given time. What’s true today might not be the case a month or six months down the line.
I was unaware of how much time I spent walking that line myself until I started training tutors and inevitably found myself confronted with the question of how to know the amount of depth to go into, and when.
How do you tell when to give a student the “hard” version of a rule as opposed to the easier “trick” that will get them the right answer 80% of the time?
How do you know when it's time for a student who's learned the easier version to make the jump to the harder version? How do you know when going into depth is more likely to cause more problems than it solves?
When I thought about it, I realized that so much of what had become intuitive to me was the result of having worked with dozens of students; of having observed patterns in their thinking; of having learned which questions to ask in order to accurately gauge their level of understanding; and of having seen which types of students responded best to which types of approaches. As a result, there was really no way for me to lay it all out in a set of rules.
And that, I realized, is part of what makes good teaching so challenging. It’s the constant monitoring of whether what you’re saying is really getting across, and knowing how to adjust your approach if things aren’t working. Those are things that come only with experience. Indeed, they are things most teachers do not really even start to think about until what they’re doing doesn’t work.
The crux of the issue is that teaching is something that happens between people. It does not matter how many education courses one has taken or how much developmental or pedagogical theory one has studied. It does not even matter how well one knows one’s subject.
One of the most important parts of learning to teach involves developing the ability to perceive the distance between oneself and others, and learning how to bridge that gap. This demands the ability to stop taking one’s own knowledge as a norm or point of reference, and to try to adopt the perspective of someone who knows much less.
Teaching is not just a matter of explaining xyz, but also of recognizing what parts of x are likely to require clarification to a particular group of students, or what parts of y students may be missing some of the foundation for – and of learning to work those issues into the lessons themselves so that the misunderstandings don’t even have a chance to occur. That is what I mean by “efficiency.”
I confess that I was a terrible know-it-all about some things when I started tutoring. I had my strategies, and since they worked best for me (and were pretty much all I knew), I tried to foist them on everyone I tutored. Sometimes it worked spectacularly well, and other times it, well, didn’t.
After working with enough students, however, I started to loosen up. I realized that I needed to meet people where they actually were instead of where I thought they should be. Some relatively high-scoring kids, for example, had a terrible time with "big picture" reading questions on the SAT. They simply could not consistently identify main ideas, usually because they lacked sufficient context to make the leap from the literal words to what the passages were actually saying.
The more I learned about how what is called "reading" works, the less doctrinaire I became. Once I really clued into the fact that a lot of reading problems are actually knowledge problems, I stopped trying to insist that kids use strategies that were too sophisticated for them at that point. Understanding that sometimes there was no way to translate formal skills into concrete knowledge was in a way liberating for me. If students had already improved so dramatically by reading passages in sections and diligently marking line references, who was I to insist that they throw that strategy away and approach the passages in a manner better suited to adult readers? That really wasn't fair to them. Instead, I started building on what they had, in ways that worked for them.
But it took me years to get to that point. Years. And some of that time was after I had written an entire book about reading!
To be clear, I should point out that I am not implying every veteran teacher is superior to every novice teacher – I think most people remember at least one teacher who had taught for decades and still managed to be an absolute disaster.
I am, however, suggesting that between the best veteran teachers and the best novice teachers, the former will pretty much always outshine the latter, hands down. This goes for tutors, classroom teachers, and pretty much anyone else responsible for teaching anything to anyone.
As is common knowledge by now, however, classroom teachers are currently leaving their profession in droves. Despite an occasional halfhearted gesture such as merit pay (whose effectiveness has been thoroughly debunked), most of the discussions about education now center on how to “build” a better teacher – as if great teachers could simply be churned out according to a formula.
One of the biggest problems (among many) with this line of thinking is that it completely overlooks the role of experience itself in making good teachers. There is absolutely no way to speed up the professional maturation process. What you end up with is a group of overconfident twenty-something ed school grads who can spout buzzwords like there’s no tomorrow but are utterly incapable of imagining just what it is they don’t yet know. And if there’s no one left to school them in those things – if the novices are the ones in charge – then the result is a very sorry state of affairs indeed.
While looking for models for the little sendup of progressive education that I posted recently, I came across a New York Times op-ed piece entitled “What Babies Know About Physics and Foreign Languages.” As the title and tag line (“Our kids don’t need to be taught in order to learn”) suggest, the piece is a pitch-perfect paean to the progressive ethos, touting the benefits of allowing preschoolers to learn “naturally,” through imitation.
While preschoolers can of course acquire many important skills this way, my immediate response to the article was to wonder how quickly the author Alison Gopnik's assertions about the benefits of natural learning for four year-olds would be misappropriated as an endorsement for treating higher levels of education this way.
As it turned out, I got my answer pretty quickly.
It came in the form of an Inside Higher Ed article entitled “Playing, Learning, and the Teaching Problem,” by Barbara Fister, a college librarian in Minnesota, and it pretty much epitomizes the phenomenon I was attempting to satirize.
Given that the article represents an outstanding specimen of its genre, I thought it might be interesting, not to mention instructive, to take a closer look at how the piece functions – that is, to consider the stock features and rhetorical moves it employs.
What makes Fister’s piece such a prime exemplar is the way in which it takes a real issue of concern, namely the difficulty that college freshmen have using and citing sources, and proceeds to draw a series of nonsensical and potentially destructive conclusions.
The article is also of particular interest to me because it gets at the heart of one of my major areas of interest: the high school-college reading and writing gap.
Note: if you’re studying for the SAT essay or planning to take AP Comp, you might want to pay attention to this. In particular, notice my use of concessions such as to be fair…, it is true that…, and this is a valid point. Although I take a clear stance against Fister, I also consider what she gets right. This is a deliberate strategy designed to produce a nuanced analysis, one that shows I’ve considered the issue from multiple angles. I’m not “just saying stuff.”
So, that said, let’s start by considering Fister’s opening:
Allison Gopnik, a psychology professor at UC Berkeley, had an interesting piece in last Sunday’s New York Times that has been sticking to my brain like a burr. She argues that our current obsession with preparing children for success gets in the way of their learning. She describes several recent experiments with small children, who are naturally curious and determined to figure things out – and remarkably good at it. They learn about the world around them by observation, imitation, and play, not by being taught. In fact, she argues, if they are taught, they will imitate accurately, but being told how to do something takes away the opportunity to figure things out. When small children observe and imitate, they are testing the physical world around them and coming up with their own understanding of how things work. Explicit instruction short-circuits that process.
The first thing to notice here is that Fister uses Gopnik in order to establish credibility. Gopnik is a well-known cognitive psychologist who has published extensively about the way young children learn. As Fister immediately goes out of her way to point out, Gopnik is a professor at UC Berkeley (an elite institution) writing in The New York Times (an elite publication). The first sentence thus functions as an appeal to authority, providing weight for the argument that follows: Gopnik is an expert in a scientific field, ergo her ideas – and, by extension, the author’s – have the backing of scientific fact.
The next thing to notice is that the following sentences outline one of the central tropes of progressive education writing, namely the “learning should be natural and teaching destroys it” argument (“She argues that our current obsession with preparing children for success gets in the way of their learning…”.). This is an idea that can be traced through a long line of educational theorists, starting with Rousseau in the eighteenth century and proceeding up through Emerson and Dewey, and ultimately ending up as accepted doctrine in pretty much every American school of education.
Now, it is possible, and even probable, that Fister believes she is being daring and innovative by making this connection – that she is pushing back against a rigid and hidebound system that sells students short by constraining their natural love of learning. But from the perspective of someone who has repeatedly encountered this exact argument, expressed in almost exactly the same words, it comes off as clichéd and almost painfully naive.
Also of note here is the fact that Fister is writing in Inside Higher Ed – this is a publication geared toward members of the academic community. As most reasonable people would agree, the pedagogical needs of four year-olds are quite different from those of 18 year-olds. While preschoolers are acquiring basic physical, social, and emotional management skills necessary to function independently in everyday life (skills that are indeed more important at that point than filling out worksheets), college freshmen are accruing more abstract, specialized, intellectual skills necessary to function in the public and professional adult sphere. Most of these skills are not “natural” by any means.
These two situations are not really comparable, yet as Fister moves from Gopnik’s research to her own story, she implicitly conflates them:
Every fall, as new students arrive on campus, I struggle with what to do to help them feel comfortable exploring ideas in the library and beyond. In that first semester, they tend to be stressed and pressed for time. They don’t have models in front of them for how scholars do research, but they’re often asked to find scholarly sources to use in an argument as they are introduced to academic writing. They are intensely curious about what the teacher wants, if not about the topic they’re researching, and often focus on getting that boring task done as efficiently as possible. It’s not just that there’s no time for creativity, or that they think creativity is a violation of the rule that you have to quote other people in this kind of writing. It’s simply too big of a risk.
Although Fister describes the typical freshman’s plight in great detail, she offers no compelling pedagogical justification for equating the needs of 18 year-olds with those of preschoolers, beyond the superficial fact that both are intensely curious and not entirely sure of what the adult world expects of them.
More tellingly, she casts the problem in terms of “creativity,” the primary lens through which education is understood within the progressive universe. It’s a cue suggesting that her argument is about to veer off course. And here we have the next step down the road to perdition, so to speak. Having identified some reasonable problems (students lack models for how to do research, they’re stressed out and lack time), she then edges onto less stable territory (being creative is too risky). The slide from one thought to the other is so subtle that the leap of logic is almost unnoticeable.
When you think about the problem carefully, though, it does not really make sense to frame it as one of creativity. If students lack models for writing that integrates multiple sources, then the solution should involve professors or TAs providing them with models and walking them through how those models work; and if students are stressed out and lack time, then they need help managing their time. It’s possible that some of these students must juggle work as well as family obligations, but it’s also possible that some of them are spending hours drinking beer and playing video games (these are college freshmen, after all). In any case, creativity is not really the issue.
And what about the claim that students “don’t have models [for research] in front of them?” Perhaps professors are not taking the time to explain their research in class, but the idea that university students lack access to models of scholarship is, quite frankly, absurd. Set foot on pretty much any campus, and you’ll find bulletin boards overflowing with announcements about lectures, conferences, and colloquia. Models of genuine scholarship abound. If undergraduates are not witnessing these core activities of academia for themselves, it is most certainly not for lack of access or opportunity.
What makes this type of discourse so slippery, however, is that Fister’s description then takes a solid turn back into reality:
When [freshmen] come to the library for an hour or two to do some guided poking around, they want to know the rules. How many sources do I need? Will these ones do? They’re not particularly interested in how those sources got there. In fact, they’re often not interested in reading them. After all, it’s right there in the rubric: points for citing articles accurately; nothing about reading them. When they patch together quotes lifted out of context, it’s hardly surprising. Since they haven’t seen how scholarship actually happens, and have never seen this way of mapping out the evolution of ideas, they have nothing yet to imitate or puzzle out. The rules are all they have.
This description is in most ways perfectly fair and accurate. College freshmen do in fact frequently arrive on campus not knowing how citations work in college, and not understanding how to integrate all those required points of view into their own writing. High schools rarely prepare students for this type of assignment, and even at very good high schools, the level of work and the expectations simply are not the same as they are in college.
The professors who give these types of assignments to freshmen may also be unaware of just how underprepared their students are – just how innocent they are about how the “game” of academia is played. That too is unsurprising: when people spend their entire professional lives performing a skill at a high level, it is easy for them to forget how mysterious that skill can seem to novices. Things that they would consider self-evident (academic writing integrates multiple points of view, some of which support the author’s argument and some of which contradict it) are often nothing of the sort. Misunderstandings are thus inevitable.
At the same time, though, one has to give students some credit. A rubric may indeed state that students only need “cite” articles, without mentioning that students must also read those articles, but any student – even a genuinely confused one – who simply goes through an article pulling out quotes knows at some level that he or she is gaming the system. There is naiveté, and then there is deliberate misunderstanding. Does Fister seriously think 18 year-olds lack the ability to make the distinction?
Again, the problem is not a lack of “creativity.” It is unlikely that students have been required to engage, in writing, with a range of ideas, including ones they do not agree with, and in any case, it is exceedingly unlikely that they possess the rhetorical tools to do so. It is also an issue of students having learned, in high school, that they will be rewarded for doing the bare minimum spelled out on the rubric.
The second issue points in turn to several larger problems, namely that high school teachers are increasingly required by their administrators to track and report as much data as possible; detailed rubrics are necessary to monitor discrete skills. In addition, teachers are not infrequently faced with contentious students and parents who are ready to complain about poor grades. In that context, a rubric is a defense mechanism that can be used to justify taking off points in a particular area. And the pressure to hand out good grades based on nothing more than adherence to the rubric comes from all sides.
It is hardly surprising that students who are graded this way starting in elementary school naturally grow accustomed to having all aspects of their assignments spelled out for them. As these students arrive in college, schools are forced to adapt to their needs. So when Fister attributes students’ slapdash approach to using sources to professors’ use of rubrics, she is overlooking the fact that professors have likely adopted this particular tool because freshmen would literally have no idea of how to complete assignments otherwise.
It is also likely that professors have learned, through experience, that if they do not spell out the specifics of an assignment, they will receive a stack of papers almost entirely devoid of any sources, quotations, or outside points of view. In this regard, Fister mistakes a result for a cause. Making requirements clear is a way of ensuring that students, who after all are novices at this kind of work, meet a basic minimum standard. It is not in the least incompatible with creative thought.
To be fair, the continued spelling out of specific criteria might reinforce a less-than-motivated student’s passivity; however, it does not prevent a student who is genuinely interested in writing a good paper from doing a stellar job. Students’ approach to the assignment is inevitably colored by their interest level in the subject, their academic backgrounds, what goes on in class, and dozens of external factors over which professors and especially librarians have no control.
The assumption that specific directives are by themselves responsible for stamping out creativity is, however, another one of the central tropes of progressive education. The problem is that concrete, realistic alternatives are almost never proposed in their place, only hazy what-ifs.
Sure enough, Fister again proceeds from a reasonable assumption…
But if you’re hoping students will practice writing formal academic writing using sources in the way academics do so that when they finally have a chance to do actual research they’ll know how to package it, you’re doing them no favors.
…to far more dangerous territory.
Showing new students how to find sources and cite them might actually interfere with the kind of learning we want, the kind babies do when they are doing what comes naturally – figure out through imitation and play.
In other words, when a skill is being taught ineffectively, the solution is not to improve the way it is taught, but rather to stop teaching it entirely in the hopes that students will naturally acquire it on their own.
This is not just a matter of throwing the baby out with the bathwater; it’s dumping both of them in the middle of the Pacific.
Yes, it is obviously unfair to expect students who don’t really know what academic research is all about to suddenly wake up one morning and use sources the same way a 40 year-old Ph.D. would. But what on earth would teaching college students through “imitation and play” actually look like? Would groups of students gather around their professors in their offices as they sit reading academic journals, or as they run statistical models on their computers? Would they dress up in suits and stage mock conferences during class-time to present their “research?” Would they play video games teaching them to navigate departmental politics?
Joking aside, I’d really like to know just what Fister means by this.
Again, the implication here is one of the guiding assumptions of progressive education: that having fun automatically translates into mastery. It’s true that students who are enjoying themselves are more likely to pay attention in class, but moving from high school writing to college writing is genuinely challenging for many students. It requires considerable sustained practice, some of which can be very tedious, and it also requires students to think in ways that can be new and uncomfortable. (I tutored one college student who had a habit of dismissing any argument she disagreed with as “propaganda.”) Students who are accustomed to always having things made “fun” are unlikely to develop the ability to persevere when things aren’t. The focus on managing students’ emotions, a staple of progressive rhetoric, is ultimately a distraction. There is no formula that reliably produces self-motivation, or “grit.”
Besides, aren’t the articles that students are “often not interested in reading” the very models that Fister claims students lack access to? Should professors have to spell out for their students that it is actually necessary to read sources, not just cite them? Should they actually have to sit and demonstrate what reading an article looks like? And are professors truly not talking about their research in class? College professors – even adjuncts – virtually always teach classes in their areas of specialization.
Professors are also pushing back against 18 years of exposure to popular culture that depicts professors as clueless eggheads with British accents and bowties who are barely capable of tying their own shoes. Students may be excited to be in college, and they may be eager to obtain a college degree; however, the vast majority of them are not in school to become professional academics but rather to earn a credential that will make them eligible for higher paying jobs. The kind of research they are being asked to do is often tangential to their reasons for being in college, and there is only so much even the most brilliant and inspiring professor can do to counteract such an entrenched utilitarian mindset.
Furthermore, while copying experts is certainly an important aspect of learning a skill, it is absurd to imagine that novices can become experts through mimicry alone – particularly when those skills are not “natural” but rather formal and often highly technical. Adopting the behaviors of an expert does not, alas, turn a novice into one. If anything, students who are naïve in the ways of the academy need more explicit teaching about how academic research and writing work, not less. Giving students some models to look at and then just turning them loose is a horrible idea.
According to another well-known cognitive scientist, Daniel Willingham (who studies things like reading and critical thinking in older students), it is necessary to understand about 98% of the vocabulary in a given piece of reading in order to comprehend it accurately. Since most college freshmen are not familiar with the specialized terminology used in many academic disciplines, and since academic articles tend to be dense and written for other academics, it is exceedingly likely that even the most advanced college freshman will have some gaps in their understanding – to say nothing of the average student, coming from an average high school, who may or may not have read an entire book sometime in the past four years. Even if students are motivated to read independently, they may completely misunderstand what they are reading.
Students who are struggling to literally comprehend even a single page of jargon-laden academic prose will not, therefore, automatically “catch” what they need to, and be miraculously inspired to view the library as a place where people can “discover and have fun while they are building their own understanding.” On the contrary, many of them will get bored and frustrated and, if they don’t have anyone to ask for help, give up.
At the other extreme, students may latch onto the jargon and use it to cover a lack of substantive thought. Even if they adopt the pose of academics, their understanding remains superficial.
Fister also laments that “[t]he way we search now isn’t through connections, the way scholarly conversations work. We have been doing everything we can to flatten those conversations into a Google-like search box that takes terms in and returns a list of things to choose from, trying to make it easier and more familiar.” This is a valid complaint, but the problem is that in order to join those scholarly conversations, it is necessary to spend time learning what has been said, who the major players are, what the major trends are, where the major debates and controversies lie, and so on. Acquiring this knowledge requires a certain amount of patience as well as humility. That can involve a bit of a paradigm shift for students who have for years been praised for proclaiming their own, often shakily supported, ideas. Many of them will not take well to such a change.
Fister is also on shaky ground when pooh-poohing the tendency to view the library as a place for learning to follow “obscure rules that, if broken, carry harsh penalties.” (The denigration of detailed, formal processes as “obscure” is another classic move in the progressive playbook.) While researching a topic in which one is truly invested can be an intellectually thrilling experience, there are rules, very strict ones in fact, and the real-world consequences for breaking them can be harsh. Aside from the fact that real academic publications expect real professors to adhere strictly to particular citation styles, there is the not insignificant issue of plagiarism.
A student who does not understand the importance of citing sources, or who fails to attribute his or her ideas properly – even inadvertently – can end up on academic probation or worse. It is crucial that entering college students be made aware of the stakes. Universities take matters of intellectual property very, very seriously, and leaving freshmen to their own devices when it comes to matters of citation would be nothing short of disastrous. A librarian of all people should understand this.
Ironically, one of the people Fister cites has actually compiled a rather extraordinary list of things that beginning college students need to be taught about just what it is that academics do, and what it means to be a member of an academic community. (If you click the link, ignore all the cultural studies babble and scroll about halfway down the page.) The list is remarkable for its specificity, as well as for the fact that it takes absolutely nothing for granted. It provides exactly the framework that many college freshmen are lacking. I would argue that this, along with a copy of They Say/I Say, is what beginning college students need.
I realize that I’ve spent a lot of time belaboring some of these points, and that I’m doing so with a level of scrutiny that Fister’s piece might not seem to merit; however, given that the solutions it extols have already wreaked so much havoc on K-12 education, and are creeping into higher ed as well, I think it’s important to consider the implications with a dose of hardheaded realism.
It is all too easy for an outsider to get swept up in the warm fuzziness of it all and not realize that some of these proposals are unworkable and even harmful. And when that outsider is a billionaire donor who has no personal experience with education, save his own, and is convinced that everyone would be successful if all those fussy old professors just stopped all that gosh-darn boring teaching stuff and let students discover the joy of learning, then this type of rhetoric can have very significant effects. It’s bad enough at the high school level, but college students are typically paying thousands of dollars, and may even be incurring significant debt. No matter how innovative the architecture or how smart the technology, it is just plain unfair to leave college students to their own devices (pun intended) and expect them to figure out all the hard stuff themselves.
Just imagine if people talked about sports the same way they talk about education…
In the nineteenth century, when modern sports were invented, athletics served as an extension of the factories in which many of their players worked, reinforcing hierarchies and training athletes to be obedient and “play by the rules.” Today’s sports leagues are heirs to that model. Unsurprisingly, for many athletes, playing a sport has become a source of stress rather than one of joy.
Nothing could be more natural than the desire to run and play, but this inborn tendency is all too frequently destroyed by a system that emphasizes rote drilling of individual skills at the expense of more authentic forms of participation.
A new, more progressive model is clearly required, one that harnesses players’ innate love of games and movement, and that places players rather than sports at the center of the athletic process.
Instead of forcing players to abide by a narrow set of rules, rendering athletes passive and stifling their natural creativity, athletic programs should abandon the traditional one-size-fits-all approach and strive to develop the whole player.
Thus, coaches should act as facilitators, dividing teams into smaller groups so that players can learn from one another and avoiding heavy-handed tactics such as directly instructing players in how to stand, kick, or dribble. And rather than repeatedly drilling low-level skills such as throwing and catching, a surefire way to stifle players’ natural love of games, coaches should create opportunities for players to develop higher-order performance skills. For example, a league could stage a mock Olympics, with each team dressing up in the uniform worn by the athletes of a specific country. Groups of players could research different aspects of their adopted teams and create posters presenting what they have learned.
Just as importantly, coaches should avoid treating teams as a single entity, or talking to players in a harsh or critical manner. Rather, they should adapt their coaching to athletes’ unique playing styles and seek to inspire each team member as an individual. When players commit penalties or other violations, coaches should not impose punishments such as “time outs” but should seek to understand the motivating forces behind players’ behavior.
Today, every baseball team across the entire country is forced to abide by a single set of regulations, as is every basketball team, soccer team, lacrosse team, and so on. What a boring way to play! Wouldn’t it be wonderful if players were instead encouraged to take an active role in constructing their own games?
If every team were responsible for inventing its own rules, for example, different teams could learn from one another, and players’ ability to innovate, think critically, and solve problems creatively would be vastly improved. Football players could learn from basketball players, and field hockey players could learn from sprinters, erasing artificial divisions between sports and facilitating players’ ability to communicate with other types of athletes. And rather than competing against one another, teams could instead collaborate with one another in order to achieve common goals.
Furthermore, the ceaseless ranking of players and teams, as well as the awarding of medals and trophies, creates perverse incentives that are frequently damaging to players’ self-esteem. This model of athletics is not only often developmentally inappropriate, but it normalizes competition, substituting rewards for intrinsic motivation.
If players were no longer ranked or measured according to standardized criteria set out by bodies such as the NBA, the NFL, and the United States Tennis Association, they would be free to develop their true athletic potential. Instead of passively relying on external metrics such as passes, kicks, and goals for validation, they would be inspired to take ownership of their personal athletic development.
Becoming an athlete involves so much more than rigidly adhering to a group of rules laid out by experts who often do not understand the relevance of sports to players’ lives. When players are encouraged to explore their unique passions and acquire a deep sense of themselves as athletes, everyone benefits. It is high time for athletics to be brought into the twenty-first century.
In my previous post, I outlined some of the ways in which the progressive methodologies that pervade much of the American system inadvertently fuel a reliance on the private tutoring industry.
On its surface, the tutoring model would seem to be the holy grail of progressive education. Teachers are encouraged to “personalize” their approach to fit students’ unique learning styles, “empowering” them to “find their passions” and “take ownership of the learning process.” But this perspective is based on both a simplification and a misunderstanding of how teaching and learning actually work.
Oftentimes, tutoring is assumed to be effective simply because it epitomizes personalized learning. But although personalization is a component of what makes tutoring effective, it is far from the only element – nor, I would argue, is it the most important element.
Likewise, the importance of soft factors such as personality “fit” and the ability to inspire is somewhat overblown. Obviously, those factors do count for something. I had students I adored, whom I always looked forward to seeing, and whose families I have remained friendly with for nearly a decade now. I even had one student who genuinely fell in love with English, ended up as editor of his college newspaper, and is now a professional journalist! (To be fair, he was a star in English class before I showed up.)
But the reality is that I also taught students of whom I was not particularly fond; tutoring them was, to be frank, a job. It is unrealistic to expect that any tutor, like any other human being, will get along with every other person with whom they work. The point, though, is that provided those students did their work and showed up diligently, they still improved very significantly.
Conversely, some of the students whom I got along with wonderfully, and who could rhapsodize wide-eyed about their love of learning, never quite seemed to make the kind of improvement they wanted. Almost invariably, these students attended the most progressive schools. Somewhere along the way, they had clearly absorbed the belief that being excited about learning was synonymous with actually learning.
These students were often very enthusiastic, and we had a wonderful time together, but they were somehow unable to put in the necessary practice on their own. I always got the sense that they had never done the kind of work that real improvement would have required – that they literally had no concept of it. Sometimes, they even switched tests in the hopes that their scores would magically rise without their having to put in too much work. Needless to say, that approach did not pay off.
Interestingly, I have several colleagues who regularly find themselves in the position of being “second round” tutors – tutors who are called in after a student has failed to make sufficient progress with another tutor, or even multiple tutors. Like me, they are often stunned at the types of basic information their students’ previous tutors failed to impart, or at least to impart in a way that students were able to absorb. If personalization were truly the issue, these types of scenarios would not occur with such alarming regularity.
I suspect that many, if not most, of these former tutors are well-meaning, but techniques that are ineffective in the classroom are just as ineffective in one-on-one situations. An adult who lets a student flail around for 15 minutes trying to “discover” a concept that could be easily taught in three is doing a major disservice. I’ve witnessed this kind of teaching, and it’s almost painful to observe. (I have to restrain myself from grabbing tutors by the shoulders, crying, “Just teach it to them already!”) One is left with the impression that these tutors have been so thoroughly indoctrinated with the importance of indirectly “guiding” students that they cannot really see what is happening in front of them.
For their part, students do not generally get overtly upset because they want to please their tutors, and tutors can consequently pat themselves on the back for helping students take control of their own learning. But the result is that basics are made out to be inordinately complicated and confusing, preventing students from ever really getting a handle on the subject or progressing to more advanced activities.
Sometimes this state of affairs goes on for months before it becomes apparent that something just isn’t working. Finally, parents start hunting around for yet another tutor, one who can really get the job done. At that point, they’re eager to have someone knowledgeable and competent, with a demonstrated track record, tell their teenager (and possibly them as well) exactly what to do.
I’ve had several recent discussions with fellow “second-round” colleagues about just what it is that makes the most effective tutors so effective, and the overwhelming consensus is always that the best tutors possess a very particular type of efficiency. Not only do they know their subjects phenomenally well and present them in such a way that students can both retain the material and apply it when it counts, but they can also anticipate the problems a student is likely to have and tailor sessions so as to cut off those problems before they even have a chance to occur. As a result, they can sometimes accomplish in only a few sessions what another tutor might not be able to accomplish in months.
Not coincidentally, this type of targeted tutoring is highly traditional in many ways – even if it does contain what are usually thought of as progressive elements. It is student-centered insofar as it is targeted to the student’s particular needs; however, its primary aim is not to develop unique gifts or creativity (although the student may sometimes discover a new gift as a result) but rather to transmit information in as clear, coherent, and systematic a manner as possible, and to ferret out points of weakness so that they can be directly addressed.
Although this type of tutoring must be a conversation in which the student is an involved participant, it is a conversation in which the tutor is unapologetic about knowing more than the student does and is fully willing to embrace responsibility for that fact. It also involves very considerable amounts of repetition.
In that regard, it is the polar opposite of pretty much everything current wisdom about education holds dear.
But because this type of tutoring is so personalized, and often so engaging, no one really notices its more traditional features. (Indeed, “traditional” and “boring” are so thoroughly conflated in the popular imagination that any teaching that is not boring is automatically assumed not to be traditional.) Besides, when college admission is on the line, educational theories are the last thing anyone worries about. And the inescapable fact is that whatever someone happens to think about it, this type of teaching works.
Thus, tutoring largely escapes the kind of criticism that, in another context, would be heaped on the type of pedagogy it employs.
To come at this from another angle, I think it’s fair to say that the lack of regulation is simultaneously the best and the worst aspect of the tutoring industry. Anyone can throw an ad up on Craigslist and advertise their services, and there are a lot of hacks out there. On the other hand, the lack of oversight means that private tutors are not compelled to march in lockstep with pedagogical fads. They remain free to use techniques more common in 1986, or even 1966, without any fear of pushback. Pragmatism is free to trump ideology.
People who wonder why bright college grads don’t want to go into teaching should look no further than the tutoring industry because there are certainly plenty of them there. If schools don’t offer sufficient autonomy – in my experience, successful tutors tend to be somewhat quirky as well as fiercely independent – the private sector certainly allows these individuals free rein, not to mention potentially far higher compensation.
The supply side only exacerbates the issue.
As more students come through a progressive-inf(l)ected system, college included, fewer and fewer graduates have experience with the most effective type of direct instruction. And people can’t normally teach in a format they haven’t experienced themselves. On this subject, a quick anecdote: A colleague, a decorated AP teacher, told me the story of a group of younger teachers sent in to observe him teach. They had heard he was “traditional” and were astonished to discover that he did not simply talk at his students for the entire class but actually allowed them to ask questions. And this was in a highly ranked district in one of the most educated states in the country.
Furthermore, the promotion of STEM and “practical” (read: business) degrees has also led to an ever-declining number of students achieving advanced competencies in the humanities. Despite the popular rhetoric about useless English majors working at Starbucks, the reality is that only 6.1% of all college students received degrees in all areas of the humanities combined in 2014.
In addition, humanities departments at many schools are notorious for their lack of rigor as well as their grade inflation. It is usually safe to assume that even English majors have never had to diagram sentences.
The people who do acquire serious skills in the humanities tend to come out of a small group of elite schools and be fairly privileged to begin with. Within that group, the number of people who can also teach well is quite small indeed. An even smaller group actually wants to teach. And don’t even get me started on “soft” factors like reliability.
Now, basic economic theory of course states that decreased supply of an item or skill will cause prices for that item or skill to rise. And nowhere is this clearer than in the private market for test-prep tutoring, where the ability to effectively teach a certain set of skills usually deemed “irrelevant” is actually in very high demand.
As a tutor, I spent a good deal of time covering material I had been directly taught for free, in public school – material that was very clearly foreign to my students. To put it mildly, it always seemed to me that there was something not quite right about that.
Essentially, the direct instruction of crucial skills that used to be – and that should still be – standard fare in classrooms across the country has now become something accessible to a much smaller fraction of students.
Techniques that would be viewed with distaste when associated with less privileged students have been transformed into a coveted marker of status. I know of one Manhattan tutoring firm, famous for its exorbitant rates, whose tutors reportedly dictate notes while students write them word-for-word, by hand.
I do think that this situation is in large part the result of misplaced good intentions. But in seeking to avoid one extreme, it is possible to go too far in the other direction. Pedagogical strategies that are appropriate for preschoolers are far less suited to high schoolers; and to return to one of my favorite themes, what makes students happy in the short term is not necessarily what will serve them best in the long run.
Looking back on my own high school experience, some of the teachers from whom I learned the most were not the inspirational ones, but rather the merely competent and unremarkable ones who, in their own steady, dull way, taught me what it meant to acquire a rock-solid foundation in a subject. For a girl whose parents could not help her with her homework (not until I was out of college did I realize that some of my classmates had probably received that kind of support), and who was only dimly aware of the concept of professional tutoring (which would have been unaffordable anyway), that was not a small thing.
That foundation took me very far in college, and I literally would not be where I am without it. I suspect that those types of teachers are in much shorter supply today. I am sorry for that, and for all the students whose educations will be shortchanged because of allegiance to a theoretical ideal.
Unfortunately, as education schools increasingly promote the teacher-as-facilitator model, other approaches are largely reduced to a caricature. And as teachers come under increasing administrative pressure to employ progressive pedagogies, teachers who don’t fit the mold are unlikely to remain in the classroom for decades the way their predecessors did. That is a shame for the education system – but for the tutoring industry, it is a boon.
To begin this post, two anecdotes.
The first one comes from Shamus Khan’s Privilege: The Making of an Adolescent Elite at St. Paul’s School. In the book, Khan recounts the following story about a graduate of the uber-elite St. Paul’s school in New Hampshire:
“I don’t actually know much,” an alumnus told me after he finished his freshman year at Harvard. “I mean, well, I don’t know how to put it. When I’m in classes all these kids next to me know a lot more than I do. Like about what actually happened in the Civil War. Or what France did in World War II. I don’t know any of that stuff. But I know something they don’t. It’s not facts or anything. It’s how to think. That’s what I learned in humanities.”
“What do you mean, ‘how to think’?” I asked.
“I mean, I learned how to think bigger. Like, everyone else at Harvard knew about the Civil War. I didn’t. But I knew how to make sense of what they knew about the Civil War and apply it. So they knew a lot about particular things. I knew how to think about everything.” (44)
The second anecdote is mine.
A couple of months ago, I had lunch with a colleague who works at an elite tutoring company in Manhattan. We started chatting about what makes the best tutors so effective, and at one point in the conversation, my colleague mentioned somewhat sardonically that the more clients paid, the more they wanted to be yelled at – a statement to which I immediately and knowingly nodded my assent. I was very familiar with the phenomenon, but I’d never heard it expressed in quite such stark terms, and the comment stuck with me.
I find these two anecdotes particularly telling because they perfectly capture two sides of the same coin. On one hand, schools – particularly elite private schools – tout the wonders of progressive education, the sort of schooling that goes beyond “mere facts” and “rote memorization” and teaches students to think; and on the other hand, those same students who spend their school hours gathered around Harkness tables (or even just desks dragged into “clusters”) expounding on The Great Gatsby or I Have a Dream while a teacher walks around the room facilitating, not infrequently spend their after-school hours at Kumon or at the side of highly compensated professional tutors who provide the sort of direct instruction that is largely missing in the classroom.
Discussions of equity and test prep are virtually always predicated on the assumption that wealthy students do well on standardized tests because they are able to afford classes/tutors who can teach them specialized “tricks.” As I’ve written about before, the definition of a “trick” is an awfully slippery one. I suspect part of the reason the whole idea of tricks is so appealing is that no one can – or wants to – imagine that well-off students could possibly need help with the basics. Yet that is all too often the case.
Part of the reason that top schools can get away with eschewing fundamentals is of course that many of their students tend to have highly educated parents, and/or are exceptionally motivated academically. The University of Chicago Lab School, which was founded as a laboratory for John Dewey, the patron saint of American progressive education, now has a student body composed of more than 50 percent University affiliates. (As a side note, Dewey spent only two years teaching high school and one teaching elementary school before deciding he wasn’t cut out for the classroom.)
Students in that category tend to either naturally absorb, or be taught by their parents, a good deal of basic and not-so-basic knowledge that other students must obtain in school. As a result, they can come through twelve years of group work and project-based learning relatively unscathed, and schools can continue to securely promote the philosophy that the classroom is a place for exploration and the development of each child’s unique qualities.
The problem, however, comes when students who have failed to naturally “catch” some of the basics, and who have been asked to work at a faux-high level well beyond what their actual skills allow, are suddenly placed in a high-stakes situation that requires mastery of certain fundamental skills, and find that they cannot perform at the level expected of them.
Likewise, when high schools attempt to be “rigorous” by assigning college-level reading and writing assignments without explicitly giving students the tools to complete them, the result is typically procrastination, frustration, panic, parent-child screaming matches, and all-nighters (with the last two usually occurring simultaneously).
The inescapable fact is that if schools are not spending enough time ensuring that students master the basics, and students want to – or need to – master those basics, someone still needs to teach them.
And that is where the tutoring industry steps in.
I know this for a fact because I was the person called in to help teach the basics, be it subject-verb agreement or how to write an introduction – over and over and over and over and over again. I witnessed the screaming matches (at midnight!) and the panic and the meltdowns. All the things teachers don’t get to see.
So largely thanks to people like my former self, schools can remain blissfully shielded from the shortcomings of ed-school theory. In fact, everything teachers and administrators observe is a testament to its efficacy. If students struggle, the problem must be with them. Perhaps it’s undiagnosed ADD, or an executive functioning problem – something that can conveniently be made someone else’s problem, and not infrequently medicated away. The basic pedagogical orthodoxies are never questioned.
Provided they can get the help they need, though, the students who are obviously struggling are in some ways lucky. As for the kids who seem to be coasting through just fine… Well, that’s how you end up with rising Harvard sophomores who don’t know what, like, actually happened during the Civil War. (But hey, it’s Harvard, so grade inflation.)
Let me be clear, though: this is not just an elite phenomenon, although it is often most exaggerated in that environment. Given all the standard rhetoric about “drill and kill,” “rote learning,” and the droning lecture of boring teachers (who could of course be gotten rid of if only those pesky tenure regulations were overturned), I think that the extent to which the American education system has been shaped by progressive principles is difficult to grasp for someone who has never spent time in a classroom in a genuinely traditional system.
Read any report on education in virtually any American publication, though, and you’ll see an almost compulsive insistence on the idea that the primary goal of education is to develop that elusive quality known as “creativity.” (What exactly does that mean, by the way? Finger-painting? Designing video games with loud noises and cool graphics? Creating an app for pre-cooked, organic meal delivery?)
To cite a typical example, consider this quote from an article about the importance of handwriting, which recently appeared on the PBS website:
If teachers required students to take their own notes or (on top of that) requested that they handwrite them, students could perform better on tests—and they might even feel empowered to be more creative throughout the learning process, too.
Even this most traditional of skills must be presented in such a way that confirms the accepted worldview. It is not enough to learn skills because they are important and aid the learning process – they must be spun as promoting creativity, no matter how tangential the relationship. The edu-speak is a fallback, signaling that there is no other acceptable way to think about education.
As E.D. Hirsch has explained, for nearly a century now, the shortcomings of the education system have been attributed to “rote learning,” with more progressive pedagogies inevitably proposed as the solution. When the latter methodologies prove to be ineffective, the result has inevitably been to double down and insist either that they are not being implemented properly, or that they are not being implemented at all.
But the less direct instruction students receive in class, the more they need to seek it elsewhere. The need for it doesn’t disappear – it simply moves out of the public domain and into the private.
And if parents cannot provide it, then many of them can at least pay for it.