automated driving
I have a grand idea. Let’s let student drivers whose families own Teslas equipped with Full Self-Driving use them to take the road test. And, for families who don’t own Teslas, student drivers should be permitted to use a Waymo for the road test.
How would this road test work?
Well, the DMV official would tell the student driver the destination. And, if the student successfully punched the destination into the Tesla or Waymo, and the self-driving car made it to the destination, as we should expect it would, the student would pass.
The state DMV would then issue the student a driver’s license. Grand idea, right? Sure, autonomous driving is the future.
automated learning
Something of this sort is happening right now with education. OpenAI, Google, Microsoft, Elon Musk’s xAI, and other AI companies are dangling free AI to college students like candy. And the marketing of some companies, touting free use “through finals 2026,” implies the use of AI even on exams.
The Ohio State University just went all in on AI. Every college student there will be trained in AI. As the University wrote: “Ohio State’s AI Fluency initiative will embed AI education into the core of every undergraduate curriculum, equipping students with the ability to not only use AI tools, but to understand, question and innovate with them — no matter their major.”
And it’s not just higher education. It’s starting with kindergarten.
Indeed, President Trump issued an executive order declaring it a national policy, for kindergarten through 12th grade, “to promote AI literacy and proficiency among Americans by promoting the appropriate integration of AI into education, providing comprehensive AI training for educators, and fostering early exposure to AI concepts and technology to develop an AI-ready workforce and the next generation of American AI innovators.”
The rush to embrace AI in education is understandable. I doubt Ohio State, where I taught for many years, is the only university to adopt an AI-focused curriculum. The United States and China are locked in an AI arms race, trying to outdo each other in innovation. And businesses in many sectors are figuring out ways to adopt AI to gain greater efficiency and productivity. No one can predict the future. But all signs point to AI being the transformative technology for the 21st century.
Yet, there’s a real danger that embracing AI in education will displace critical thinking by students. Just as student drivers relying on full self-driving will lack the skills necessary to drive themselves, students who rely too much on AI in education will lack the cognitive skills to think critically for themselves.
Researchers call this “cognitive offloading.” Or we might call it by a simpler term: the dumbing down of humans. There’s a growing body of research studying and identifying negative effects of cognitive offloading from human reliance on AI, especially by younger adults:
- Michael Gerlich, AI Tools in Society: Impacts on Cognitive Offloading and the Future of Critical Thinking, Societies 2025, 15, 6. https://doi.org/10.3390/soc15010006.
- The results indicate that AI tool use negatively predicts critical thinking (β = −1.76, p < 0.001), suggesting that increased reliance on AI tools is associated with reduced critical thinking skills. Importantly, deep thinking activities, a proxy for cognitive engagement, negatively predicts critical thinking (β = −0.36, p < 0.001). These findings support the hypothesis that increased reliance on AI tools leads to cognitive offloading, which, in turn, reduces critical thinking abilities.
- Many interviewees, particularly those in the younger age group (17–25 years), expressed a heavy reliance on AI tools for tasks ranging from simple information retrieval to more complex decision-making processes. They described how AI tools, such as virtual assistants and search engines, have become integral to their daily routines. A recurring theme was the convenience and speed these tools offer, which often led to cognitive offloading. Participants admitted that they often relied on AI to remember information, solve problems, or make decisions rather than engaging in deeper cognitive processes.
- Nataliya Kosmyna et al., Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task, 2025, arXiv:2506.08872.
- Meanwhile, the Brain-only group [no use of Internet or AI] showed the strongest activations outside of the visual cortex, particularly in left parietal, right temporal, and anterior frontal areas (e.g. P7→T8, T7→AF3). These regions are involved in semantic integration, creative ideation, and executive self-monitoring. The elevated delta and theta coherence into AF3, a known site for cognitive control, underscored the high internal demand for content generation, planning, and revision in the absence of external aids.
- Collectively, these findings support the view that external support tools restructure not only task performance but also the underlying cognitive architecture. The Brain-only group leveraged broad, distributed neural networks for internally generated content; the Search Engine group relied on hybrid strategies of visual information management and regulatory control; and the LLM group [use of ChatGPT only] optimized for procedural integration of AI-generated suggestions.
- These distinctions carry significant implications for cognitive load theory, the extended mind hypothesis [102], and educational practice. As reliance on AI tools increases, careful attention must be paid to how such systems affect neurocognitive development, especially the potential trade-offs between external support and internal synthesis.
- Hamsa Bastani, Generative AI Can Harm Learning, The Wharton School Research Paper (2024), https://ssrn.com/abstract=4895486.
- In a field experiment involving nearly a thousand students, we have deployed and evaluated two GPT based tutors, one that mimics a standard ChatGPT interface (called GPT Base) and one with prompts designed to safeguard learning (called GPT Tutor). These tutors comprise about 15% of the curriculum in each of three grades. Consistent with prior work, our results show that access to GPT-4 significantly improves performance (48% improvement for GPT Base and 127% for GPT Tutor). However, we additionally find that when access is subsequently taken away, students actually perform worse than those who never had access (17% reduction for GPT Base). That is, access to GPT-4 can harm educational outcomes. These negative learning effects are largely mitigated by the safeguards included in GPT Tutor. Our results suggest that students attempt to use GPT-4 as a “crutch” during practice problem sessions, and when successful, perform worse on their own.
- Joseph Crawford et al., When artificial intelligence substitutes humans in higher education: the cost of loneliness, student success, and retention, 49 Studies in Higher Educ. (2024), https://doi.org/10.1080/03075079.2024.2326956.
- In our study, no significant direct effect was found between AI usage, perceived grade average, and the intention to leave university. However, an indirect effect was observed through the mediating variables of social support, wellbeing, and sense of belonging. Students who reported feeling socially supported by other people reported higher grade performance and were less willing to leave university than those who reported being socially supported by AI. This may indicate that when university students are well connected to others, as opposed to AI, they may be more likely to have greater academic success. No previous research on student retention and AI usage exists, however, one study showed that retention intention increased when staff used AI assistance (Wang et al. 2022), which is contrary to our findings. Also inconsistent with our findings, Ouyang, Zheng, and Jiao (2022), found that AI usage can promote academic performance. Our results align with Seo et al. (2021), warning about the potentially negative impact AI could have on grades, underscoring the need for further insight. The results of the current study indicate that university students using AI tend to experience diminished student success and retention, but only if they feel socially supported by AI.
- Sayed Fayaz Ahmad, Impact of artificial intelligence on human loss in decision making, laziness and safety in education, 10 Humanities & Social Sciences Communs. (2023), https://www.nature.com/articles/s41599-023-01787-8.
- This study is based on qualitative methodology using PLS-Smart for the data analysis. Primary data was collected from 285 students from different universities in Pakistan and China. The purposive Sampling technique was used to draw the sample from the population. The data analysis findings show that AI significantly impacts the loss of human decision-making and makes humans lazy. It also impacts security and privacy. The findings show that 68.9% of laziness in humans, 68.6% in personal privacy and security issues, and 27.7% in the loss of decision-making are due to the impact of artificial intelligence in Pakistani and Chinese society. From this, it was observed that human laziness is the most affected area due to AI. However, this study argues that significant preventive measures are necessary before implementing AI technology in education.
By some reports, colleges are already being overrun by rampant cheating among college students, who are turning to ChatGPT and AI to write their papers and exams. Right after ChatGPT’s launch in 2022, Stephen Marche wrote a prescient article titled “The College Essay Is Dead.” And that might not be the only thing that dies with AI’s rise.
how to teach students critical thinking skills
Everyone in education — from graduate schools to undergraduate level to K12 — is facing profound questions about the uses and abuses of AI. As the tsunami of AI infiltrates education, the transformation of learning is being driven by a set of competing forces that are haphazard, if not contradictory.
I don’t profess to have a solution. But I do have a warning. It’s a terrible idea to let student drivers pass the road test using Full Self-Driving. And it’s an even worse idea to let students pass all of their classes by using AI, such as in writing a paper or even answering an exam. If the only thing a high school and college education teaches our students is how to punch a question or “prompt” into ChatGPT, our society has lost.