Cognitive Diabetes
The evolutionary mismatch breaking your ability to think clearly
Your body cannot metabolize high-fructose corn syrup the way it metabolizes an apple. Same calories on the spreadsheet, completely different metabolic response. One triggers satiation signals that tell you to stop. The other bypasses those circuits entirely, leaving you hungry for more while your system slides into chronic inflammation.
Your brain has the same problem with information. And almost nobody is talking about the cost.
The Mismatch
Daniel Lieberman’s The Story of the Human Body introduced a concept that should reframe how we think about modern dysfunction: evolutionary mismatch. Our cognitive hardware evolved for the Paleolithic era - scarce information, small social groups, survival dependent on immediate cause and effect. Then we built a digital environment engineered to exploit every stone-age instinct we developed over two million years.
The result is an attention crisis that no amount of productivity advice has solved. We know what focus looks like. We scroll anyway. Not because we’re weak - because the environment is designed to overwhelm circuits that evolved for scarcity.
Max Fisher’s The Chaos Machine documents how social media algorithms exploit our dopamine systems with the same precision that food scientists exploit our taste receptors. Johann Hari’s Stolen Focus catalogs twelve forces fragmenting our attention - not as individual failures, but as designed outcomes. Yuval Noah Harari’s Nexus argues that computers have fundamentally changed how information spreads, undermining the self-correcting mechanisms that once helped truth emerge from noise.
We are feeding our ancient brains high-fructose information. The result is a societal inability to metabolize reality.
I call it Cognitive Diabetes.
The Discipline Trap
Here’s where most advice goes wrong.
Brian Tracy’s No Excuses represents a seductive worldview: everything is a discipline problem. Distracted? Lack self-control. Overwhelmed by information? Build better habits. Can’t distinguish signal from noise? Try harder.
This framing isn’t wrong. It’s conveniently incomplete.
Telling someone in 2026 that their attention collapse is a personal failing is like telling someone in a sugar-saturated food desert that their health is just a willpower problem. It ignores the asymmetry: you are one brain, running on hardware that hasn’t been upgraded in 50,000 years, facing off against thousands of engineers with real-time feedback loops and unlimited iteration cycles.
This was never meant to be a fair fight.
The discipline frame makes structural problems invisible. If attention collapse is a character flaw, we don’t need to regulate algorithms. If reality confusion is a moral failing, we don’t need to rethink information systems.
Gad Saad’s The Parasitic Mind argues that certain ideas infect our thinking like pathogens. But the really dangerous parasites aren’t the ideas themselves - they’re the delivery mechanisms that bypass our cognitive immune system entirely.
The discipline framing serves the people profiting from your confusion. That’s not an accident.
Why This Isn’t Just “Faster”
The skeptic’s best objection: humans have always been tribal, confirmation-biased, and susceptible to manipulation. The printing press triggered religious wars. Radio enabled fascism. Television homogenized culture. Every new medium sparked “this will destroy truth” panics.
What makes algorithms categorically different?
Personalization at scale. Previous propaganda was broadcast - same message to everyone. Algorithmic content is narrowcast - a unique reality tunnel for each person, optimized for their specific psychological vulnerabilities. We don’t share the same lies anymore. We don’t even share the same facts.
Real-time adaptation. A newspaper couldn’t A/B test headlines on you in milliseconds, then serve the version most likely to trigger engagement. Every interaction trains the algorithm to exploit you more precisely. The system learns faster than you do.
Zero friction. Previous information required effort. Buy a newspaper. Walk to the TV. Go to the library. That friction created natural breaks - space for reflection, for the brain to process before consuming more. Now the feed is infinite, the content is free, and the next hit is one thumb-swipe away.
These aren’t differences of degree. They’re differences of kind.
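The "real-time adaptation" difference has a standard machine-learning shape: a multi-armed bandit, where each headline variant is an arm and each engagement is a reward. The sketch below is illustrative only - the variants and click-through rates are invented - but it shows how a few thousand impressions are enough for an engagement-optimizing loop to shift almost all traffic toward whatever provokes the strongest reaction.

```python
import random

# Minimal sketch of "real-time adaptation": an epsilon-greedy bandit that
# learns which headline variant triggers more engagement. The variants and
# click-through rates here are invented for illustration.
random.seed(7)

variants = ["calm headline", "outrage headline"]
true_click_rate = {"calm headline": 0.05, "outrage headline": 0.20}

shows = {v: 0 for v in variants}
clicks = {v: 0 for v in variants}

def pick(epsilon: float = 0.1) -> str:
    # Mostly exploit the best-performing variant so far; occasionally explore.
    if random.random() < epsilon:
        return random.choice(variants)
    return max(variants, key=lambda v: clicks[v] / shows[v] if shows[v] else 0.0)

for _ in range(10_000):  # every impression is a training example
    v = pick()
    shows[v] += 1
    if random.random() < true_click_rate[v]:
        clicks[v] += 1

# Traffic concentrates on whichever variant the audience rewards most.
print(shows)
```

The specific algorithm doesn't matter. What matters is that the loop improves with every interaction, while the reader's instincts stay fixed.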
The Analogy Breaks (And Why It Still Matters)
Let me steelman the objection, because it deserves serious treatment.
Sugar causes measurable metabolic damage - insulin resistance, inflammation, cellular dysfunction. Does algorithmic content cause equivalent neurological damage?
Not exactly. The brain is remarkably plastic. People recover from social media addiction in ways diabetics don’t recover from diabetes. The mechanisms and permanence differ.
And there’s the agency problem. We don’t metabolize sugar consciously - it happens automatically. But we choose to scroll. We can close the app. The button exists.
So the analogy isn’t perfect. But it’s useful because it reframes the problem.
When we understood obesity as an environmental health crisis rather than a moral failing, we got warning labels, advertising restrictions, and public health campaigns. We didn’t abandon personal responsibility - we acknowledged that individual choices happen within systems designed to produce specific outcomes.
The same reframe applies here. You still make choices. But those choices are made in an environment engineered by people who profit when you make bad ones.
Acknowledging the trap doesn’t make you helpless. It makes you strategic.
The Information Diet Audit
If cognitive diabetes is real, it should be measurable. Here’s a fifteen-minute audit of what you’re actually feeding your brain.
Step 1: Track the inputs (4 min)
List every information source you consumed in the last 24 hours. Not duration - just sources. Categorize each as:
Nutrient-dense: Information you sought deliberately that changed how you think or act
Empty calories: Information that felt urgent but changed nothing
Junk: Passive consumption that left you feeling worse
Be honest. The ratio will probably disturb you.
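If you want the ratio stated bluntly, the tally is trivial to script. A minimal sketch in Python, with placeholder sources standing in for your own list:

```python
from collections import Counter

# Hypothetical Step 1 audit: every information source from the last 24 hours,
# tagged with one of the three categories. Replace with your own list.
audit = {
    "long-form article I sought out": "nutrient-dense",
    "newsletter I chose deliberately": "nutrient-dense",
    "breaking-news push alerts": "empty calories",
    "infinite-scroll video feed": "junk",
    "trending-topics sidebar": "junk",
}

counts = Counter(audit.values())
total = sum(counts.values())

for category in ("nutrient-dense", "empty calories", "junk"):
    n = counts[category]  # Counter returns 0 for a missing category
    print(f"{category:>14}: {n}/{total} ({n / total:.0%})")
```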
Step 2: Identify the triggers (4 min)
Look at your junk and empty-calorie consumption. What triggered it? Boredom? Anxiety? A notification? Habit?
Most information consumption isn’t deliberate. It’s reactive - a response to environmental cues you didn’t consciously choose. That’s the mismatch in action.
Step 3: Design one friction point (4 min)
Pick your highest-volume junk source. Add one obstacle between you and it. Move the app off your home screen. Delete the shortcut. Set a delay. Require a password.
The goal isn’t deprivation. It’s restoring the friction that the system deliberately removed.
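"Set a delay" can be as simple as a cooldown between visits. A minimal sketch of that friction point, using a hypothetical FrictionGate class and an illustrative ten-minute window:

```python
# Hypothetical friction point: a cooldown gate between visits to a junk
# source. The ten-minute window is illustrative, not a prescription.
class FrictionGate:
    def __init__(self, cooldown_seconds: float):
        self.cooldown = cooldown_seconds
        self.last_open = None  # timestamp of the last allowed visit

    def allow(self, now: float) -> bool:
        # First visit passes; repeat visits must wait out the cooldown.
        if self.last_open is None or now - self.last_open >= self.cooldown:
            self.last_open = now
            return True
        return False

gate = FrictionGate(cooldown_seconds=600)
print(gate.allow(now=0))    # first visit: allowed
print(gate.allow(now=120))  # two minutes later: blocked
print(gate.allow(now=700))  # after the cooldown: allowed again
```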
Step 4: Protect one nutrient-dense block (3 min)
Identify one hour this week for deliberate, high-quality information intake. A book. A long-form article. A conversation with someone who challenges your thinking. Put it on your calendar. Protect it like a meeting with your most important client.
If you cannot find one hour, that is your diagnosis.
The Real Question
This isn’t about digital detox - a temporary retreat that changes nothing structural. It’s not about willpower - a framing that lets systems off the hook. And it’s not about nostalgia for some golden age of truth that never existed.
It’s about recognizing that the environment has changed faster than our cognitive hardware can adapt. And that the people designing that environment have different incentives than you do.
Persistence in the wrong environment is just a slow way to fail.
You can’t upgrade your brain. But you can redesign your environment. The professionals who maintain clarity while everyone else reacts with panic aren’t smarter or more disciplined. They’ve simply stopped treating this as a moral problem and started treating it as an engineering one.
The question isn’t whether you’re affected by cognitive diabetes. Everyone with a smartphone is. The question is whether you’ll keep treating it as a discipline problem - and keep failing - or recognize it as an environmental challenge that requires environmental solutions.
Questions worth sitting with:
If you audited your information diet honestly, what’s the ratio of nutrient-dense to junk? And what is that ratio costing you in clear thinking?
What would it take for you to support systemic interventions - algorithmic transparency, engagement friction, information nutrition labels - even if they made your own browsing less “engaging”?
Which of your current beliefs were formed through deliberate inquiry, and which were installed by an algorithm optimizing for engagement? How would you even know the difference?