When the students of Stoneman Douglas High School started class on February 14, it seemed to be a Valentine’s Day like any other. But by 2:30 p.m., it was clear it was a day that would live in infamy. Nikolas Cruz, a 19-year-old who had been expelled from Stoneman Douglas the previous year, killed 17 students and staff members and injured more than a dozen others, making it one of the deadliest school shootings in American history.
In the two short weeks since, many of the teenage survivors have spoken out against gun violence to national papers and TV news networks, and organized protests and legislative meetings to fight for stricter gun control in the United States. While many have spoken out in support of their efforts, from political figures to celebrities like Ellen DeGeneres, others have taken issue not just with the survivors’ message, but with the notion that they have the right to say anything at all.
Stoneman Douglas High School students have been called immature, disrespectful, and overly emotional; their motivations have been called into question, as has the clarity of their thinking. A 2008 tweet by NRA spokeswoman Dana Loesch that said “Teenagers piss me off” recirculated following her combative interactions with Stoneman Douglas survivors and their supporters. For many, it seems the students are too young to have an opinion on the shooting that so dramatically affected their lives.
But Nicholas Allen, an adolescent development expert at the University of Oregon, says a blanket dismissal of teenagers based purely on their age or perceived cognitive development is unwarranted. What’s more, our reasons for doing so are likely based not on science, but on a misinterpretation of recent research on the emotional and cognitive development of adolescents.
Adolescence, currently defined by the World Health Organization as ages 10 to 19, is often said to be a fairly new concept. The word itself first appeared in English in the 1450s, but only gained traction in the early 20th century. However, different cultures have marked the transition from childhood to adulthood, often preceded by a period of intense reflection and learning, for thousands of years. Bar mitzvahs and quinceañeras are two common examples. “Even in ancient and traditional societies, there was some sense of there being a period where you’re not a child and you’re not yet an adult,” Allen says. “There are various processes that are associated with that transition, which often have to deal with learning the skills that are required for navigating the adult world.”
More recently, scientists have begun to identify empirical and biological markers of adolescence. Physically, the human body undergoes incredible change in the teenage years, thanks to the onset of puberty. Puberty, and the racing hormones that go with it, causes enormous changes in cognitive, social, and sexual development, too. Teenagers, for example, naturally feel tired later in the evening than children or adults. And, famously, their prefrontal cortex, the part of the brain linked to executive functioning, is thought not to finish developing until around age 25.
In 2011, the notion of an immature prefrontal cortex made its way into the popular consciousness when David Dobbs published his article on teenage brains for National Geographic. In his piece, Dobbs talked about how his teenage sons were smart and passionate, but also reckless and prone to error. The story (and many others on the same topic) was nuanced in its handling of adolescent development and individual variation. But what stuck with people was the idea that adolescents weren’t fully developed—or fully rational—until their mid-twenties, when the prefrontal cortex was done growing.
From this viewpoint, it’s easier to see why people want to dismiss the teenage mass shooting survivors: They’re nearly 10 years younger than the age of “real” adulthood. But experts in adolescent development say that’s an overly simplistic version of the story.
For one, everyone develops at a different speed, making the magic 25 number rather reductive. “Some adolescents take on adult roles really by the end of their teens, others don’t take on adult roles until their early 30s,” Allen says. “Talking about adolescence is best done by thinking about these processes, rather than by thinking about chronological age, which is a much less important marker at this age.” And people also change dramatically through the course of a single day. (As anyone with their own teenager knows, adolescents in particular are capable of staggering range.) While they may act foolishly in one moment, they can present as an enlightened being the next.
Allen says this has to do with the cerebral flexibility that’s a hallmark of adolescence. Young people, empirical studies of all stripes show, struggle with decisions made in the heat of the moment. That’s why Allen believes strict laws should exist to dissuade and protect teens from making impulsive choices, like driving drunk or having unsafe sex. But when it comes to decisions that allow them time for reflection, the evidence suggests an adolescent’s skills can be on par with a fully grown adult’s. “When it comes to decisions like voting and political activism, these are decisions people come to based on information and reflection,” Allen says. “And the evidence would suggest that most 16-year-olds are equally good as adults in making those decisions.”
Not everything that a teenager says or does will be smart or good. (The same, it should be noted, is also true of adults.) We should do our best to protect teens from those who would take advantage of their occasional lapses in judgment. But we can, and should, listen to adolescents and talk with them about their perceptions of the world. “So often, we think of adolescents from a parent’s perspective,” Allen says. “What we’re often doing is talking about the ways in which adolescents bother parents and then trying to explain that with some kind of immaturity property.” But the evidence shows teens aren’t just little adults on the fritz: they’re at a unique stage of development, and well suited to the peculiar demands of teenhood.
Experts widely agree there are three key stages of adolescence. First comes puberty, which mostly manifests through physical changes. This is followed by so-called middle adolescence, where teens start to develop self-regulation and harness their emotions. This could be characterized as the wild phase, where sex, drugs, and rock ’n’ roll, or tamer approximations thereof, come into play. The third and final stage, emerging adulthood, is marked by young people trying to find a place in the adult world. Where they were recently experimenting with standing out, they’re now experimenting with fitting in.
In all of these stages, experimentation is key to an adolescent’s success, for better or worse. Teens may make dangerous missteps, but more often their rebellious acts are actually pretty healthy. “[An] adolescent needs to experiment and take risks and try different things,” Allen says. “Otherwise, they would just become a carbon copy of their parents.” This idea might sound good to a parent at first pass—if your kid is just like you, you won’t need too many household rules—but it would also mean your child never actually grows up. Being able to make one’s own decisions is the true hallmark of adulthood. It just so happens to require making mistakes (and hopefully learning from them) along the way.
It’s clear most teenagers have a lot to learn about the world and the way they fit in it. But, Allen says, the world has a lot to learn from teenagers, whether or not adults want to admit it. “What adolescents bring to a situation is this capacity for innovation and new thinking and for experimentation,” he says. “That is absolutely critical to culture. If we don’t have that, then culture remains the same.”
Instead of ignoring teenagers the next time they speak up for what they think is right, we should push them to think critically and refine their opinions in the face of facts. And we, in turn, should allow ourselves to be transformed by their vision for a different, and, hopefully, better, world.
Written by Eleanor Cummins