The Cognitive Dilemma: Is AI Atrophying Our Brains or Elevating Our Potential?
A Question That’s Been Bugging Me
I need to admit something: since I started using AI heavily in my work, I’ve noticed my tolerance for finding information “the hard way” has dropped.
Before, I’d spend hours digging through documentation, reading articles, connecting ideas from different sources. Today, my first instinct is to ask Claude or Gemini. And most of the time, the answer is good enough. Fast, structured, useful.
But “good enough” is exactly what worries me.
Because for the first time in human history, we live in a world where any fact is just one question away. Where we once had to search through library catalogs and physical encyclopedias — objects most of the new generation has never even handled — today AI delivers ready-made answers in seconds. And the question that won’t go away is: what happens to our brains when everything becomes too easy?
What Science Is Saying (And It’s Concerning)
I’m not being dramatic. Researchers at the MIT Media Lab have just published a study that should be required reading for anyone who uses AI daily.
Over 4 months, 54 participants were divided into three groups: one using ChatGPT to write essays, another using Google for research, and a third writing on their own (“brain-only”). All wore electroencephalography (EEG) headsets to monitor brain activity in real time.
The results are disturbing.
The ChatGPT group showed significantly weaker brain connectivity than the other two groups. Alpha and theta wave activity — indicators of cognitive engagement — was nearly halved. 83% of participants in this group couldn’t recall key points from their own essays, and none could accurately cite passages from them.
But the part that really got to me was this: in a fourth session, the researchers switched the groups. ChatGPT users wrote on their own, and the brain-only writers used ChatGPT. Participants who had used ChatGPT for 3 sessions couldn’t recover the cognitive engagement levels of the other groups. Even without AI, their brains stayed “lazy.”
The authors called this “cognitive debt” — an analogy to technical debt in software. You accumulate shortcuts that work in the short term but exact a growing price over time.
Before You Panic
I know these numbers sound alarming, and my first instinct was to think “I need to use less AI.” But after reading deeper — including a critical analysis published in The Conversation — I realized the story is more nuanced.
The MIT study is small (54 participants, 18 in the final session) and hasn’t completed full peer review. Some researchers argue that the engagement drop in the ChatGPT group can be partly explained by the familiarization effect — when you repeat a task, your brain adapts and naturally reduces effort.
The Harvard Gazette consulted professors across multiple disciplines on the topic, and the consensus is subtle: the problem isn’t AI itself, but how we engage with it.
As one Harvard researcher put it: if a student uses AI to do the work for them rather than to do the work with them, there won’t be much learning. No learning occurs unless the brain is actively engaged in making meaning.
The Risk of “Brainless” Usage
The main concern — and here I speak from personal experience — is what researchers call “brainless” usage: passively accepting what language models deliver without questioning, validating, or reflecting.
I’ve caught myself doing this. Asking for an answer, skimming it, using it as-is. Without checking sources. Without questioning the logic. Without trying to formulate my own version before consulting the machine.
It’s comfortable. It’s fast. And it’s exactly the kind of behavior that accumulates cognitive debt.
Researchers writing in Frontiers in Psychology coined a term for this: AICICA — AI Chatbot-Induced Cognitive Atrophy. The core idea follows the “use it or lose it” principle: if you systematically delegate cognitive tasks to AI without simultaneously cultivating your fundamental skills, those skills deteriorate.
A study of 666 participants published in an MDPI journal found a correlation between frequent AI use and reduced independent critical thinking ability, mediated by cognitive offloading — the habit of outsourcing mental effort to external tools.
AI as Tutor, Not as Oracle
But here’s the other side of the coin — and it’s what gives me hope.
A 2025 systematic review found positive effects on executive functions when people used AI chatbots actively and interactively in educational contexts, particularly for working memory and cognitive flexibility.
The difference? The intention behind how you use the tool.
When I use AI to give me a ready-made answer, my brain disengages. But when I use AI as a thinking partner — to challenge my ideas, find flaws in my logic, present perspectives I hadn’t considered — the effect is the opposite. My cognitive engagement increases.
In practice, here’s what I’m trying to do more of:
- Formulate my own answer before asking AI, then compare.
- Ask AI to question my reasoning instead of just validating it.
- Use AI to go deeper into topics that were previously hard to access, not to skip steps.
- Treat AI as a Socratic tutor that asks questions, not as an oracle that gives answers.
AI Doesn’t Have an Age
One of the points that resonates most with me in this debate is the breaking of the generational myth. In France, 85% of young people aged 18-24 use generative AI. But the ability to use AI intelligently doesn’t depend on age.
I know 60-year-old professionals who are extremely savvy with the technology, and 23-year-olds who use ChatGPT as if it were a ready-answer calculator. The “intelligence” in AI usage is a function of individual choices and the willingness to keep your brain sharp, regardless of career tenure.
The Calculator Analogy (Again)
If you follow this blog, you know I’ve already written about AI’s “calculator moment.” And that analogy fits perfectly here.
In the 1970s, when calculators entered classrooms, educators didn’t eliminate math — they raised the bar. Instead of doing calculations by hand, students used calculators and spent their cognitive effort on more complex problems. The bar went up, and students worked just as hard (or harder) than before.
The challenge with AI is that, in most cases, educators and companies haven’t raised the bar yet. They still assign the same tasks and expect the same outputs they did five years ago. In that scenario, AI lets people skip the effort without any compensating challenge, and that’s where atrophy happens.
Conclusion: Wear Your “Thinking Cap”
I’m not going to stop using AI. That would be hypocritical — I write a blog about technology and AI, after all. But this MIT study made me recalibrate how I use it.
The true differentiator in an AI-dominated world isn’t who has the best model, but who makes the best choices with their own brain. AI needs us to be sharp so it can help us accomplish even more complex and interesting work.
Technology should be a step to climb higher, not a couch to sink into.
And I find myself wondering: will the generation that grows up with AI from childhood develop new forms of cognition we can’t even imagine? Or will they lose fundamental abilities that took thousands of years to evolve? The honest answer is: nobody knows yet. But the attitude with which each of us uses these tools will determine which side of the balance we fall on.
What about you? Has AI made you lazier or more curious to learn new things?
I know my answer. And it changes depending on the day. I think the honesty in that self-analysis is already half the battle.
Share if this resonated with you:
- Email: fodra@fodra.com.br
- LinkedIn: linkedin.com/in/mauriciofodra
The best use of AI isn’t the one that spares your brain — it’s the one that challenges it.
Read Also
- AI’s ‘Calculator Moment’: Why Today’s School May Ruin the 2026 Professional? — The same analogy, applied to education: the calculator raised the bar. AI needs to do the same.
- The Impact of AI on Modern Society — The broader reflection on how AI transforms our daily lives.
- The Elite User Secret: Why ‘Saying No’ to AI Is Your Greatest Skill — If AI atrophies the passive brain, the antidote is developing active judgment.