Your tech interview: "No AI allowed. Reverse this linked list on a whiteboard. In 45 minutes. While we stare at you like you’re defusing a bomb."
Linus Torvalds, 5 days ago: "AI wrote this better than I could. Shipping it." ¯\\\_(ツ)\_/¯
One of these people built Linux and Git. The other is timing how long you take to remember where the pointers go.
ACT 1: The Skills You’re Actually Testing
Let me describe your 2026 technical interview without the corporate euphemisms.
You’re testing whether they can calculate square roots by hand while a calculator sits on the same desk.
You’re testing whether they can send Morse code while holding a smartphone that autocorrects "ducking" every time you curse.
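For the record, here is the entire party trick in question, sketched in Python. The minimal `Node` class is hypothetical (interviews rarely hand you a real one either):

```python
class Node:
    """A minimal singly linked list node -- illustrative, not from any real codebase."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def reverse(head):
    """Iteratively reverse a singly linked list; returns the new head."""
    prev = None
    while head is not None:
        # Rewire this node to point backwards, then advance.
        head.next, prev, head = prev, head, head.next
    return prev
```

Three lines of pointer shuffling. That's what the 45 minutes and the bomb-defusal stares are for.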
ACT 2: The Moment Linus Made This Uncomfortable
Five days ago, Linus Torvalds posted a commit message that should be framed in every engineering office:
"Is this much better than I could do by hand? Sure is."
This isn’t some AI hype bro on Twitter. This is:
- The guy who created Linux because he was bored in 1991
- The guy who created Git after getting mad at another version control system
- The man whose code reviews are so brutal they could be war crimes
- The person who literally wrote the operating system that runs most of the internet
- A man who once said "talk is cheap, show me the code" and now says "AI’s code is better than mine"
He didn’t write a think piece about "the soul of programming." He didn’t tweet "just shipped AI code, feeling cute might delete later."
He just... committed it. Like it was no big deal. Like using a better screwdriver.
If Linus Torvalds isn’t too proud to admit AI outperformed him, what exactly are you protecting? Your ego? The sacred art of manually reversing linked lists?
ACT 3: The Real Reason for "No AI" Interviews
Let’s name what this actually is: skillset displacement anxiety. Or as the kids call it, "I’ve become the technology I used to make fun of my dad for not understanding."
You’ve spent 10-20 years mastering patterns, algorithms, and implementation details that AI just... does. That’s terrifying. Your competitive advantage—built over decades—feels like it’s evaporating faster than a junior’s confidence during a system design interview.
So you retreat to what you know: test the skills you have, not the skills the job requires.
But here’s the brutal truth: Your ability to write bubble sort from memory is now worth approximately the same as your ability to write cursive. It’s a party trick.
What is valuable:
- Evaluating whether AI output actually solves the problem or just looks confident while being completely wrong (like a junior dev but faster)
- Knowing when to regenerate vs. fix (the art of "nope, try again with less hallucination")
- Writing tests that validate outcomes, not just implementation details
- Debugging production issues regardless of whether the code was written by a human, AI, or a raccoon with a keyboard
- Shipping working features faster than that startup with the weird name that just raised $50M
You’re testing for museum curation. The job is product engineering.
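To make "validate outcomes, not implementation details" concrete, here's a sketch. The `apply_discount` function is a stand-in for whatever the AI generated for "10% off orders over $100" -- the names and the business rule are illustrative, not from any real spec:

```python
# Hypothetical AI-generated function under test. The point is that the
# tests below don't care how it's implemented internally.
def apply_discount(total):
    """Apply a 10% discount to order totals strictly over $100."""
    return round(total * 0.9, 2) if total > 100 else total

# Outcome-focused tests: they pin down the behavior the business needs.
# Regenerate the implementation from scratch and these still tell you
# whether it works -- unlike tests that assert on internal structure.
def test_big_orders_get_ten_percent_off():
    assert apply_discount(200.00) == 180.00

def test_small_orders_pay_full_price():
    assert apply_discount(99.99) == 99.99

def test_boundary_is_not_discounted():
    assert apply_discount(100.00) == 100.00
```

Tests like these are what make "fix vs. regenerate" a cheap decision: if the regenerated code passes, you didn't need to read every line of the old version.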
ACT 4: What "Better Than I Could Do" Actually Means
When Linus says "better than I could do," he’s not saying he’s a bad programmer. He’s saying AI explored a solution space he wouldn’t have bothered with because he’s got better things to do—like maintaining Linux.
This is the critical insight: AI doesn’t just make you faster—it makes different tradeoffs than you would. Sometimes better. Sometimes worse. Usually weirder.
Your interview process tests none of this. It tests: "Can you perform a skill that Linus Torvalds publicly admitted is now secondary to AI evaluation?"
That’s like testing a pilot’s ability to navigate by the stars when they have GPS. Cool skill, bro. Not what we’re hiring for.
ACT 5: The Interview That Actually Tests 2026 Skills
Here’s what you should be doing:
The Setup:
"Here’s a repo. AI-generated, messy, kind of works, definitely has bugs. It uses a library that hasn’t been updated since 2019. The documentation is in a language that might be Elvish. You have 2 hours. Make it production-ready. Show your work."
What this reveals:
- Can you quickly grok unfamiliar code or do you just pretend to while crying internally?
- Do you know when to fix vs. regenerate vs. burn it all down and start over?
- Can you write tests that validate outcomes, not just "the code does the thing I think it should"?
- Are you effective with AI tools or do you just ask ChatGPT "write code plz" like a toddler?
- What’s your quality bar when the code doesn’t have your name on it?
- Can you ship under pressure without having an existential crisis?
This is the actual job. The rest is nostalgia, like insisting on using a mechanical keyboard that sounds like a machine gun because it "feels more authentic."
ACT 6: The Gatekeeper’s Greatest Hits
I know the objections. They’re like a greatest hits album from 2015. Let’s remix them:
"But AI code is hard to maintain!"
Is it? Or is it just different from the patterns you’re comfortable with, like seeing tabs when you use spaces? Linus maintains his just fine. The real issue: you’re comfortable with code you write, and uncomfortable with code you don’t fully understand. That’s human. It’s also irrelevant to outcomes, like complaining that your replacement car doesn’t smell like your old one.
"But fundamentals!"
Linus understands fundamentals better than everyone reading this combined. He still chose AI. The question isn’t "do you know fundamentals?" It’s "are fundamentals the rate-limiting factor in shipping?" Spoiler: They haven’t been since Stack Overflow existed. Now they’re a commodity, like knowing how to use a mouse.
"But we need to see how they think!"
You do. You see: do they validate? Do they test? Do they iterate? Do they focus on outcomes? That’s thinking. Memorizing quicksort is trivia. Knowing the exact runtime complexity of a B+ tree is something you Google and then promptly forget again.
"But what about engineers who can’t code without AI?"
What about pilots who can’t fly without instruments? What about surgeons who can’t operate without anesthesia? What about you, who can’t work without Stack Overflow and 47 open tabs? We’ve been augmenting cognition for decades. This is just the first time the augmentation talks back.
The real fear: If AI can write code better than you, and new hires use AI, what’s your value?
The answer: Your value was never the code. It was the judgment, experience, and system-level thinking. AI makes those more valuable, not less. Stop protecting the wrong skill. It’s like a blacksmith insisting on testing horseshoe-making skills after the invention of the car.
ACT 7: The Zen of Linus
Linus didn’t announce this. He didn’t defend it in a Hacker News thread. He didn’t write a Substack about "the soul of programming."
He just did it.
- Had a goal ✅
- Used AI ✅
- Got a result ✅
- Evaluated it ("better than I could do") ✅
- Shipped it ✅
- Moved on to maintain Linux ✅
No drama. No identity crisis. No "but what about the craft."
Just: tool worked better, used the tool, shipped, probably yelled at someone about something unrelated.
That’s the lesson.
Not "AI is replacing programmers."
Not "code is dead."
Not "everyone will be obsolete."
Just: Use the best tool for the job. The best tool is increasingly AI. Adjust accordingly, or become the guy who still uses a flip phone "because it's more reliable."
ACT 8: Your Choice (Choose Your Own Adventure)
You have two paths, Obi-Wan:
Path A: Keep testing linked lists.
- You’ll hire people who are amazing at linked lists
- They’ll join your team and immediately face 500,000 lines of production code they’ve never seen
- They’ll use AI tools you didn’t evaluate them on
- They’ll realize your interview had the predictive accuracy of a horoscope
- Your competitor will hire people who know how to use AI effectively
- Those people will ship faster
- Their company will win
- You’ll tweet about "kids these days" from your increasingly irrelevant position
Path B: Test what matters.
- Give them a messy repo
- Watch them use AI like an adult
- See if they can ship outcomes
- Hire people who can actually do the job
- Watch your team ship faster
- Get promoted
- Retire to a beach where you can finally admit you never liked writing boilerplate anyway
The AK-47 is on the table. It’s been there for a year. Your interview candidates are staring at it, then at their bow and arrow, then at you like you’ve lost your mind.
Stop testing who can shoot the straightest arrow. Start testing who can win the fight.
P.S. The Authority Gap
If you’re a senior engineer defending "no AI" interviews, you need to answer one question:
What do you know about code quality that Linus Torvalds doesn’t?
Because he’s shipping AI code. You’re forbidding it in interviews.
One of these positions reflects reality. The other is a museum exhibit with a "do not touch" sign.
P.P.S. The Secret
Between you and me? Everyone already uses AI in interviews. They just hide it like teenagers hiding vape pens from their parents.
You’re not testing "pure coding skills." You’re testing "ability to appear to not use AI while strategically using AI."
The only difference is whether you’re testing their acting skills or their engineering skills.