AI’s biggest threat: Young people who can’t think

Smart computers require even smarter humans, but they tempt us to engage in ‘cognitive offloading.’
Amazon CEO Andy Jassy caused a stir last week with a memo to his employees warning that artificial intelligence could displace them. “We will need fewer people doing some of the jobs that are being done today, and more people doing other types of jobs,” he wrote.
Nothing in his memo was shocking. Technological advances as far back as the printing press have eliminated some jobs while creating many others. The real danger is that excessive reliance on AI could spawn a generation of brainless young people unequipped for the jobs of the future because they have never learned to think creatively or critically.
As Mr. Jassy explained, AI advances mean employees will do less “rote work” and more “thinking strategically.” Workers will need to be able to use AI and, more important, they will need to come up with novel ideas about how to deploy it to solve problems. They will need to develop AI models, then probe and understand their limitations.
All of this will require a higher level of cognition than does the rote work many white-collar employees now do. But as AI is getting smarter, young college grads may be getting dumber. Like early versions of ChatGPT, they can regurgitate information and ideas but struggle to come up with novel insights or analyze issues from different directions.
The brain continues to develop and mature into one’s mid-20s, but like a muscle it needs to be exercised, stimulated and challenged to grow stronger. Technology, and especially AI, can stunt this development by taking over the mental work that builds the brain and serving instead as the brain’s version of a computer cloud, a phenomenon called cognitive offloading.
A growing body of research shows that handwriting engages parts of your brain that play a crucial role in learning and helps children with word and letter recognition. Taking notes by hand also promotes memory development by forcing you to synthesize and prioritize information. When you plunk away on a keyboard, on the other hand, information can go, as it were, in one ear and out the other.
A study last year analyzed the brain electrical activity of university students as they wrote by hand and typed. Those writing by hand showed higher levels of neural activation across more brain regions: “Whenever handwriting movements are included as a learning strategy, more of the brain gets stimulated, resulting in the formation of more complex neural network connectivity,” the researchers noted.
Could increasing use of computers in K-12 schools be one reason that standardized test scores have been falling since 2017 despite soaring education spending? It’s worth studying.
Most students in college and many in high school take notes on laptops or tablets, when they take them at all. AI tools that summarize lectures and meetings may soon render note-taking obsolete. Students will likely retain less information as a result.
Why commit information to memory when ChatGPT can provide answers at your fingertips? For one thing, the brain can’t draw connections between ideas that aren’t there. Nothing comes from nothing. Creativity also doesn’t happen unless the brain is engaged. Scientists have found that “Aha!” moments occur spontaneously with a sudden burst of high-frequency electrical activity when the brain connects seemingly unrelated concepts.
College and high-school students also increasingly use large language models like ChatGPT to write papers, perform mathematical proofs and create computer code. That means they don’t learn how to think through, express or defend ideas, or how to construct arguments and anticipate rebuttals. They offload these cognitive challenges to AI.
Add intellectual laziness to the long list of problems plaguing higher education. Many students don’t even review their AI-generated papers before submitting them. A professor at a large research university tells me that a student paper on the Bosnian war declared that “brave Nazi soldiers were raped in enormous numbers.” Chatbots can say the darnedest things.
Such hallucinations aren’t the only giveaway that a paper was written by a bot. As one professor told Times Higher Education, there is “just a kind of aura of machinic blandness that’s hard to describe to someone who hasn’t encountered it—an essay with no edges, that does nothing technically wrong or bad, but not much right or good, either.”
Another professor notes that AI papers are replete with “seemingly logical statements that are actually full of emptiness.” A depressing thought is that students are incapable of discerning such intellectual vapor because their heads are empty.
When new technologies made manufacturing more efficient, many workers lost jobs and dropped out of the labor force—not because there was a paucity of openings, but because they lacked the skills and training to fill them. College-educated young people face the same risk if they don’t develop intellectual vigor, curiosity and grit.
Why hire a brainless bachelor’s degree holder for a rote job that a bot can do at lower cost and with no complaints?