
Are Chatbots Dumbing Us Down?
How is artificial intelligence affecting human intelligence?
We are entering a paradoxical moment in human development. As technology, generative artificial intelligence (AI) in particular, grows exponentially in intelligence and autonomy, humans risk becoming more passive, and perhaps even less cognitively engaged. We have seen versions of this dynamic before. For instance, the invention of GPS dramatically reduced our need for directional awareness. This made navigation easier but also changed the skill itself; many people now struggle to form the cognitive maps of their environments that were once routine.1
Humans are efficient cognitive economists. We naturally optimize our mental energy by delegating tasks to external systems. Smartphones have already reshaped how we use memory: we no longer need to remember phone numbers, addresses, birthdays, or even the names of acquaintances in the same way previous generations did. These forms of information storage have become externalized. Research on offloading suggests that while external tools can enhance immediate performance, they may reduce internal memory consolidation over time.1 As generative AI becomes embedded in daily life, the scope of what can be cognitively offloaded is expanding rapidly.
This shift has particular relevance for the mental health field. From the client's perspective, AI tools may offer immediate emotional processing, problem-solving assistance, and even identity reflection. While this accessibility has clear benefits, it may also alter how individuals develop internal coping skills, tolerate uncertainty, or construct personal narratives.
From the practitioner perspective, increasing reliance on AI for diagnostic reasoning, documentation, treatment planning, and clinical decision support may subtly reshape professional judgment. Over time, clinicians may face new challenges in maintaining independent analytical skills while working alongside technological systems.
Education
Global adoption of AI tools by students and teachers has sparked conversations about cognitive offloading in learning. Reports across multiple countries suggest that heavy reliance on digital tools may correlate with declining sustained attention, problem-solving persistence, and independent learning skills among younger generations.2 Commentators have raised concerns that technology-mediated learning environments may inadvertently weaken core cognitive competencies, particularly when tools automate processes that once required active mental engagement. Policy shifts also reflect this tension. For example, Sweden recently moved away from large-scale digital-first classroom initiatives after concerns that screen-centered learning environments were undermining reading comprehension and foundational cognitive skills.3 Similarly, debates in Estonia, Iceland, and other early adopters of classroom AI tools highlight fears that students may become overly dependent on automated reasoning systems rather than developing their own analytical abilities.8
Clinical
AI tools are also entering clinical workflows in the form of AI-powered documentation assistants, chatbot companions, and diagnostic support systems. While these innovations can improve efficiency and access, clinicians are beginning to ask whether excessive reliance on automated reasoning could weaken clinical judgment over time. Medicine has long recognized the concept of “automation complacency,” in which professionals defer to technological systems even when errors are present.4 Early evidence from digital health contexts suggests similar risks may emerge with AI-driven decision support.5 In psychotherapy, AI companions and conversational agents raise additional psychological questions. Patients may increasingly outsource emotional processing, memory recall, or reflective thinking to digital tools. While such tools can offer immediate support, over-externalization of cognitive and emotional work may reduce opportunities for internal skill development, including distress tolerance, metacognition, and narrative identity formation.
Why Cognitive Offloading Happens
Humans naturally conserve mental effort. Cognitive energy is limited, especially in a hyperstimulating, attention-grabbing world, so we outsource what we can. When external systems reliably perform demanding tasks, the brain reallocates resources away from those functions.6
AI systems are also uniquely persuasive cognitive partners. Unlike earlier technologies, conversational agents simulate understanding, social presence, and adaptive responsiveness. This relational quality may accelerate trust formation and reduce perceived need for independent reasoning. Over time, this can reshape habits of mind at both individual and societal levels.
The Stakes for Mental Health and Society
If these trends continue, the consequences may extend beyond academic performance or workplace productivity. Cognitive offloading may interact with rising loneliness, anxiety, and identity diffusion in younger generations. A population accustomed to outsourcing thinking may also become more vulnerable to misinformation, manipulation, and technological overreach.
Recommendations
Resisting AI is a losing battle; we must learn how to harness it. Several strategies may help mitigate unintended cognitive costs.
Educational and clinical tools should encourage active reasoning rather than providing fully formed conclusions. AI can scaffold thinking without replacing it. Just as physical health requires exercise, cognitive resilience requires sustained mental effort. When using these tools, deliberate practice in attention, memory, and problem-solving remains essential.7 Some degree of effort and uncertainty is developmentally beneficial. Systems that eliminate all struggle may undermine long-term competence.
Clinical training programs will need to define which skills must remain human-led, such as diagnostic formulation, ethical judgment, and relational attunement. Users should be educated about when and how to rely on AI, cultivating awareness of cognitive offloading and its downsides.
Concluding Thoughts
Artificial intelligence represents one of the most powerful cognitive technologies in human history. Its capacity to expand knowledge, access, and efficiency is extraordinary. The smarter our tools become, the more intentional we must be about preserving the intellectual and psychological capacities that define human flourishing. We must ensure humans remain capable thinkers in a world increasingly optimized to think for them.
Dr Frances is professor and chair emeritus in the department of psychiatry at Duke University.
Ms Noorily writes and works at the boundary between AI and the humanities.
References
1. Grinschgl S, Papenmeier F, Meyerhoff HS.
2. Wetzel J. Gen Z is the first generation less cognitively capable than their parents—but Denmark has a solution. Upworthy. January 23, 2026. Accessed March 20, 2026.
3. Hart E. 104 million euros to correct a mistake: Sweden says no to major school policy it once championed. Indian Defence Review. December 24, 2025. Accessed March 20, 2026.
4. Khan FA.
5. Ocklenburg S. Cognitive offloading: Using AI reduces new skill formation. Psychology Today. February 2026. Accessed March 20, 2026.
6. Shen JH, Tamkin A. How AI impacts skill formation. arXiv. January 2026. Accessed March 20, 2026.
7. Gerlich M.
8. Singer N. As schools embrace AI tools, skeptics raise concerns. New York Times. January 2, 2026. Accessed March 20, 2026.