I recently came across an Atlantic article by the early-childhood educator Erika Christakis, in which she discusses a concept she terms “adultification”—that is, the attribution of adult traits, behaviors, and ways of thinking to children. On its surface, the article—which focuses on active shooter drills in elementary schools, of all things—seems very far removed from things like test prep and college admissions; as I read through the piece, however, I couldn’t help but notice a link. I think Christakis nails this phenomenon in a way I haven’t seen elsewhere. As she writes:

On the one hand, we view children as incredibly vulnerable—to hurt feelings, to non-rubberized playground surfaces, to disappointing report cards. This view is pervasive, and its consequences are now well understood: It robs children of their agency and impedes their development, and too often prevents them from testing themselves either physically or socially, from taking moderate risks and learning from them, from developing resilience.

But on the other hand, we demand preternatural maturity from our children. We tell them that with hockey pucks and soup cans and deep reservoirs of courage, they are capable of defeating an evil that has resisted the more prosaic energies of law-enforcement officers, legislators, school superintendents, and mental-health professionals. We ask them to manage not the everyday risks that they are capable of managing—or should, for their own good, manage—but rather the problems they almost by definition cannot.

This second notion of the child stems from what I call adultification, or the tendency to imagine that children experience things the way adults do. Adultification comes in many forms, from the relatively benign (dressing kids like little adults, in high heels or ironic punk-rock T-shirts) to the damaging (the high-stakes testing culture creeping into kindergartens). We also find adultification in the expectation that kids conform to adult schedules—young children today are subjected to more daily transitions than were previous generations of children, thanks to the dictates of work and child-care hours and the shift from free play to more programmed activities at school and at home.

Similarly, we expect children to match adults’ capacity to hurry or to be still for long periods of time; when they fail, we are likely to punish or medicate them. Examples abound: an epidemic of preschool expulsions, the reduction in school recess, the extraordinary pathologizing of childhood’s natural rhythms. ADHD diagnoses, which have spiked in recent years, are much more common among children who narrowly make the age cutoff for their grade than among children born just a week or so later, who must start kindergarten the following year and thus end up being the oldest in their class; this raises the question of whether we are labeling as disordered children who are merely acting their age. The same question might be asked of newer diagnoses such as sluggish cognitive tempo and sensory processing disorder. These trends are all of a piece; we’re expecting schoolchildren to act like small adults.

I think that what Christakis describes here is also trickling down (up?) to affect high school and the college admissions process. In fact, I suspect that much of the mania surrounding admissions—both in terms of academics and extracurriculars—can be directly attributed to it.

On the academic side, I see it manifesting itself primarily in the drive to push students into AP classes at younger and younger ages. Once upon a time (say, back in the olden days of the mid-to-late 1990s, when I was in high school), AP classes were treated as something best taken sparingly, and only when students were genuinely performing at an advanced level in a subject—because, you know, they were college-level classes, and we were in high school.

Somewhere along the line, though—whether because the College Board realized just how profitable the AP program could be, or because the online Common App led to application inflation, pushing the academic bar higher—taking a ridiculous number of AP classes became something much closer to the norm.

The fact that some of the tests have been dumbed down and the scoring made more lenient notwithstanding, one really has to wonder: why is it no longer OK for high school students to just be in high school? Why should they have to try to be in college instead? Why should sophomores, and now even freshmen, get pushed into classes that many of them don’t come close to having the background or skills to manage? And how did so many people come to believe that classes can’t be challenging unless they have the AP label slapped on them?

How did the narrative get this messed up?

A very concerted marketing campaign by the College Board, for starters, plus the understandable push for advanced standing given the insane cost of college, plus the fact that high schools have slashed honors classes because of budget cuts, plus a belief that students have to be STEM superstars to succeed in the twenty-first-century economy. But all that said, I think the issue has now gone beyond practical concerns and morphed into a general sense that students have to do more, and younger, and faster.

In many ways, I get the sense that the entire American educational system is now operating on a profoundly distorted sense of time. The pressure on teachers to demonstrate to administrators that students are engaged in “higher order thinking skills” means that, ironically, children are often not given the opportunity to master the fundamentals necessary for genuinely advanced work and are instead reduced to a kind of play-acting (e.g., history classes in which students “practice historical thinking” or “learn to think like historians” when they have not even mastered basic chronology or aren’t really sure who the various parties involved are).

The notion that children’s actual level of development can be ignored because the grownups find it inconvenient or boring to deal with—or worse, have no idea how to deal with it; or worse still, are oblivious to it—that, to me, is the essence of adultification.

Part of this is also no doubt a side effect of the data-driven ed world, in which constant, small-scale, graphable feedback is prized over the messiness of long-term development. Genuine learning is not linear, nor does it always lend itself to simple, quickly graded multiple-choice assessments that can be plotted as points on a graph. But the obsessive focus on that kind of data collection creates a state of constant hysteria in which more material must always be covered, and faster, regardless of whether students are really absorbing it in any meaningful way.

Oh, and then there are the elite private schools, where Ph.D.s who have failed to land tenure-track positions now go to pretend they’re teaching graduate seminars. That’s a different sort of adultification altogether.

So that’s part one.

Part two is that somewhere a couple of decades back, colleges stopped trying to admit well-rounded students and started trying to admit well-rounded classes. It was a significant shift, but one that doesn’t yet seem to have been fully absorbed by the public at large. (Seriously, I want to groan every time I come across an article in which the writer proclaims that everyone knows elite colleges are looking for students who are well rounded—no, they’re not.)

I think that this misunderstanding is what drives so many of the misplaced expectations regarding elite college admissions. When the parents of today’s applicants were applying to college, the Ivies were still willing to admit the straight-A captain of a sports team with a few AP classes and 1400-ish SATs who also did community service and held a part-time job flipping burgers. Smart kids, but normal, with achievements in line with their age. Today, that kind of profile would put a kid smack in the middle of the applicant pool; without a significant hook, he or she would not be particularly competitive (and, realistically, given the odds, should probably not even be applying).

The expectation now on the part of elite schools is that their teenage applicants will arrive quasi-formed as experts in…whatever. The budding scientists will have spent time doing research in real labs, and may even have their names on a published paper (or, at the very least, be finalists in national or international competitions). The rowers and soccer players and swimmers will be headed to the Olympics—if they haven’t been there already. The budding venture capitalists will have already started their own companies; the budding philanthropists will have already raised tens of thousands of dollars and started their own “empowerment” programs… and so on.

Part of what’s driving the extracurricular arms race, of course, is rampant grade and score inflation, along with a loosening of testing requirements: in the past, many of today’s applicants simply would not have had strong enough stats to apply. But with so many candidates now clearing the minimum academic bar on paper, adcoms inevitably end up focusing on what students have done outside the classroom. And the higher that bar gets raised, the more outside help students need—and this is, naturally, where the consulting industry steps in.

Because how could it not? This is effectively a stunting of natural progression: with the exception of a very small number of truly, wildly exceptional individuals, it is unrealistic to expect anyone to be so accomplished by the age of 18. Most college applicants will live another 60 years or so; to insist that they cram so much achievement into the first not-even-quarter of their lives is frankly ridiculous. It is also deeply, deeply unfair, both to the students whose parents can pay to have them fake it (and who have no illusions about what’s going on) and to the students whose parents can’t pay and who aren’t in a position to pull it off for real themselves.

I think it’s worth quoting here from a Reddit thread a reader directed me to after my previous post. The writer, a Stanford senior, provides a brutally clearheaded assessment of the sort of students the current system produces:

So what happens after high school graduation? The kids who run foundations/non-profits/programs, at least in my super competitive silicon valley suburb, don’t go on to keep up this facade for the rest of their lives (why would they?). Most of the kids in my area, myself included, went on to major in econ/CS and sell our souls out to a giant tech company/investment bank/consulting firm after graduation. Despite our liberal political inclinations, few Stanford students graduate and truly go on to advocate for the communities they supposedly dedicated themselves to in high school. Sure, there are some exceptions.

But for the most part, there’s a huge campus mentality of “ditching your high school self” and “getting to live a little for the next 4 years” on the Farm because a good portion of us–especially unhooked applicants like myself–spent almost all of our high school years to get into schools like Stanford.

For the record, lots of my supplements (including my Stanford one) talked about how “I was driven to empower students from East San Jose/ Oakland from the beginning of my journey,” but clearly, that’s not the case. And [admissions officers] never noticed, as both my Stanford and Yale regional AO gave me hand-written, physical notes in my acceptance packages telling me how they “could just feel my enthusiasm for using art as a praxis of empowerment.”

I would suggest that this excerpt should be read by every admissions officer at an elite college, but the institutional pressure to brush off its implications is so great that I frankly don’t think it would make much of a difference. Otherwise intelligent people can be staggeringly, willfully myopic when it comes to protecting the systems in which they operate (as well as their own capacity for judgment—no one wants to cop to being a dupe, least of all well-meaning social justice types). Admitting that so much of the current system is based on bullshit would constitute an unacceptable threat to its foundation.

Actually, I think a lot of admissions officers are probably caught up in a sort of double-think, in which they both understand and don’t understand simultaneously. They know the admissions process is out of control, and they’d really like to do something to relieve all the stress, but they also recognize that systemic change is all but impossible.

Besides, they are also a little in awe of these applicants—these brilliant, supercharged, amazing kids who are undoubtedly going to change the world (presumably for the better, although that part never actually seems to get spelled out).

In her essay “The Crisis in Education,” Hannah Arendt discusses the persistent American tendency to blur the boundaries between children and adults, locating the roots of the phenomenon in a larger post-World War II crisis of authority—a crisis that continues to find perhaps its fullest expression in the American education system:

[M]odern man could find no clearer expression for his dissatisfaction with the world, for his disgust with things as they are, than by his refusal to assume, in respect to his children, responsibility for all this. It is as though parents daily said: “In this world even we are not very securely at home; how to move about in it, what to know, what skills to master, are mysteries to us too. You must try to make out as best you can; in any case you are not entitled to call us to account. We are innocent, we wash our hands of you.”

I do not think that it is a coincidence that the need to project adult qualities onto children and teenagers has become so powerful just at this moment of political crisis. It is not entirely surprising that adults, astounded by the mess they have made, by the warning signals they did not see, and yet driven by ingrained American assumptions about the inevitability of progress, would want nothing more than to cede responsibility to a younger generation—whether they’re ready for it or not.