Field Notes

We're in danger of becoming people who can do anything but don't understand anything about how we did it.

I was traveling this week.

I don't speak the language.

I can't read the signs.

I was exactly where GPS told me to be. I couldn't have found my way back without it, and I wasn't even slightly concerned.

It should have been an adventure. Instead, I just followed a blue line on my phone.

Twenty years ago, this same trip would have required studying maps, memorizing landmarks, and asking locals for help with hand gestures.

There'd have been wrong turns that led to unexpected discoveries. I'd have built a mental model of the region by necessity.

Now? I'm just following instructions.

Efficiently. Perfectly. Mindlessly.

This is exactly what's happening to many people using AI.

We're getting remarkably good at following AI's suggestions. It's like the blue line for thinking.

Need an analysis? Prompt ChatGPT. Writing an email? Let Gemini draft it. Solving a problem? Ask Claude.

The efficiency is intoxicating.

But those wrong turns, that struggle to understand, the mental effort of building our own maps: that's where learning lives.

When you had to figure things out yourself, you developed instincts.

Pattern recognition.

The confidence to navigate ambiguity.

Now we're outsourcing that cognitive load, and unless we're deliberate about it, we're atrophying the very muscles that make us valuable.

In a world where everyone can get the right answer instantly, the value shifts to asking better questions and knowing when the answer isn't enough.

Those skills only develop when you've been lost enough times to recognize when the map is wrong.

Struggle isn't inefficiency.

It's education.

Bring back getting lost!