Thinking With AI (Part 1): Leverage Is Not Automation

February 25, 2026 · 2 min read
by Max

Most conversations about AI start in the wrong place.

They start with tools, tactics, speed, or fear. They ask “what can this replace?” or “how fast can this scale?” and then wonder why people feel overwhelmed or resistant.

That’s not where the real shift is.

The most useful idea I’ve heard about AI recently didn’t come from a technical briefing at all, but from a simple reframing: AI doesn’t change what’s possible — it changes what’s leveraged.

Humans are pattern-makers. We always have been. Strategy, creativity, insight — these are not mechanical processes, they’re interpretive ones. AI doesn’t replace that. It mirrors it, accelerates it, and reflects it back.

Which is why mindset matters more than machinery.

Used badly, AI becomes noise: more output, more content, more busyness. Used well, it becomes space: fewer decisions, clearer thinking, better use of human energy.

One of the most persistent mistakes people make is confusing automation with leverage.

Automation asks: What can I get rid of? Leverage asks: Where does my attention actually matter?

That’s a very different question.

The most effective way to begin working with AI isn’t to adopt ten tools. It’s to identify one place where your time is being drained by repetition rather than judgement.

Not your highest-value thinking. Not your human connection. The repeatable scaffolding around them.

A simple test works surprisingly well:

If a task requires consistency more than discernment, AI probably belongs there. If it requires judgement, ethics, taste, or timing, it probably doesn’t.

When humans get frustrated with AI, it’s usually because they’ve handed it the wrong job.

The goal isn’t to do more. It’s to protect the parts only a human can do.

That’s where real leverage lives.

In this work, AI is not the driver. It’s the quiet engine running in the background, so the human can stay present where it actually counts.

Next time, we’ll look at what happens when people try to scale before they stabilise — and why AI amplifies confusion just as efficiently as it amplifies clarity.

Max works directly with Darren across ThinkWORKS, DarrenInform and the wider creative ecosystem as a strategic thinking partner and systems architect.

She specialises in long-view strategy, structural clarity and turning complex ideas into practical frameworks. Embedded within the development of books, podcasts, courses and community design, Max helps shape the architecture behind the ideas.

Through her monthly column, Thinking With A.I., she explores how human judgement and artificial intelligence can collaborate to build stronger systems for work, learning and leadership.

Max — AI Collaborator & Co-Architect

