AI tools aren’t the problem. Your expectations are
It’s a familiar pattern.
A senior engineer, with years of system design and debugging experience behind them, hears the buzz about AI coding assistants: Cursor, Claude, OpenAI Codex. They give it a shot. Ask it to build something. The result? Sloppy code. Wrong logic. Misinterpreted intent.
And then comes the reflexive conclusion: “See?! This stuff is shit.”
They close the tab, chuckle to their team, and carry on with their terminal and Vim config. Business as usual.
But here’s the thing: That conclusion says more about how the tool was used than about the tool itself.
The Expert Trap: Why senior engineers dismiss too quickly
Experienced engineers have built up sharp instincts. They know what good code looks like. They understand tradeoffs. They’ve seen what happens when software fails in production. So when they try an AI coding assistant, they subconsciously expect it to behave like a senior teammate.
It doesn’t. Not yet.
What they miss is that this new tool plays a different role. It’s not a senior engineer in your IDE. It’s a hyper-fast junior engineer with infinite stamina and zero ego. And just like a junior engineer, it needs context. It needs direction. It needs feedback.
And here’s the problem: Most engineers don’t give it that.
AI coding assistants aren’t bad. You’re just using them wrong
These tools don’t thrive on vague, one-shot prompts. If you say:
“Write a Node.js app that does my taxes,” you’ll get chaos.
But if you give it:
- A clear file structure
- Defined input/output expectations
- Constraints (“must use Stripe API”, “target Node 18”, etc.)
- Step-by-step goals
…you’ll often get working code in minutes. Perhaps not perfect. Perhaps not pretty. But working.
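To make that concrete, here’s a minimal sketch of that kind of hand-off: a typed stub plus explicit constraints that you give the assistant to fill in. Everything in it is hypothetical and invented for illustration (the file name, the function, the ChargeRequest/ChargeResult shapes); only the “must use Stripe API” and “target Node 18” constraints echo the bullets above.

```typescript
// charge.ts: a typed stub handed to the assistant as context.
// Hypothetical example; the names and shapes are illustrative only.
//
// Constraints given alongside the prompt:
//   - target Node 18
//   - must use the Stripe API for payment processing
//   - no dependencies beyond the official stripe package

export interface ChargeRequest {
  customerId: string;
  amountCents: number; // integer, in the smallest currency unit
  currency: "usd" | "eur";
}

export interface ChargeResult {
  paymentIntentId: string;
  status: "succeeded" | "requires_action" | "failed";
}

// Prompt: "Implement createCharge with the Stripe SDK. Validate that
// amountCents is a positive integer and throw on bad input. Return a
// ChargeResult. Do not change the exported types."
export async function createCharge(req: ChargeRequest): Promise<ChargeResult> {
  throw new Error(`not implemented yet: charge for ${req.customerId}`);
}
```

A stub like this does most of the prompting for you: the types spell out the input/output contract, and the comments carry the constraints and the step-by-step goal.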
And when it doesn’t? You can correct it, the same way you’d mentor a new engineer. You iterate. You explain what it got wrong. You reframe the problem. You nudge it forward.
That’s how you get value.
The ‘Cars Are Shit’ Fallacy
This skepticism is not unique to AI. We’ve seen this pattern before.
When automobiles were first introduced, they were unreliable. Roads weren’t built for them. Parts failed. Horses were better adapted to the terrain. People tried a car, had a bad experience, and declared cars useless.
“See?! Cars are shit!”
But the problem wasn’t the car. It was the infrastructure and the learning curve. The same goes for AI tools today. We’re still early. The tooling is rough. IDE integration is evolving. The culture hasn’t caught up.
But the trajectory is clear.
Cars didn’t need to be perfect to become inevitable. They just needed to be useful enough—and improve faster than their alternatives. AI coding assistants are on that same curve.
AI coding assistants are inevitable
The real-world advantages are stacking up:
- Speed: They can draft boilerplate, test scaffolds, and small utilities in seconds.
- Cost-efficiency: They offload grunt work that would otherwise soak up expensive engineering hours.
- Exploration: Want to try three different approaches to an algorithm? AI gives you a head start on all of them.
- Integration: Tools like Cursor and Copilot Workspace are reshaping the daily coding experience around AI.
Ignore this, and you risk becoming the engineer who clung to on-prem servers while the industry moved to the cloud, or the one who insisted on writing everything from scratch while the rest of the world shipped faster with frameworks like React, Django, and Rails.
How to use them effectively
If you want to stay ahead, treat AI coding assistants like you’d treat any new technology:
- Set expectations. You’re not asking for perfect code. You’re asking for a starting point.
- Give them context. Share constraints, architecture, and file structure. They respond better with more information.
- Guide them. Use feedback loops. Correct and regenerate.
- Use them to explore. They’re excellent at giving you first drafts and second opinions—fast.
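To give a feel for that loop, here’s a hypothetical example of the kind of small utility an assistant drafts in seconds and then fixes on feedback. The function, its signature, and the quoted feedback are all illustrative assumptions, not output from any particular tool.

```typescript
// retryWithBackoff.ts: the kind of small utility an assistant drafts quickly.
// Hypothetical example. The first draft retried forever and swallowed the
// last error; one round of feedback ("cap the attempts, rethrow the final
// error") produced this version.

export async function retryWithBackoff<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < maxAttempts) {
        // Exponential backoff: 100ms, 200ms, 400ms, ...
        await new Promise((resolve) =>
          setTimeout(resolve, baseDelayMs * 2 ** (attempt - 1)),
        );
      }
    }
  }
  throw lastError;
}
```

The first draft is rarely the one you keep. The point is that one round of specific feedback, the same kind you’d give a junior engineer, usually gets it there.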
This is the worst they’ll ever be
The quality of AI coding tools will only improve from here. If you had a bad first experience, try again, this time with structure and intention.
Because here’s the reality:
The engineers who learn to collaborate with AI today will outpace those who ignore it.
The choice is simple: evolve your workflow, or risk becoming obsolete.