A note before we begin: This isn’t about declaring everything wrong or throwing away decades of wisdom. Software engineering processes evolved for good reasons, and many principles remain valuable. But when the fundamental constraints change—when what took months now takes days—we owe it to ourselves to step back and re-examine our assumptions. Not to rebel for rebellion’s sake, but to honestly ask: which practices serve us today, and which are relics of constraints that no longer exist? This is that re-examination.
The software industry has been telling a story for decades.
It’s a story about Agile "improving quality" and Scrum "ensuring better outcomes" and processes "keeping teams aligned."
But here’s the uncomfortable truth: The industry invented software processes because building software took forever, and we needed to manage that time.
That’s it. That’s the whole story.
Everything else—the quality talk, the collaboration benefits, the "engineering excellence"—was post-hoc rationalization for what was actually just time management theater.
And now that AI has collapsed development timelines from months to days, the theater is still running... but the audience left months ago.
What We Said vs. What We Meant
Let’s be honest about what each process actually solved:
SPRINTS
What the industry said: "Sprints create focus, improve predictability, and enable iterative delivery."
What the industry meant: "Projects take 6+ months and without artificial checkpoints, teams drift, scope creeps, and executives panic. We need milestones so people can pretend we’re making progress."
The real problem sprints solved: Long timelines made it impossible to maintain focus or measure progress. Sprints chunked months into digestible pieces.
Why it made sense: When a feature took 3 weeks to build, a 2-week sprint provided structure.
Why we need to reconsider: When a feature takes 3 hours to build, a 2-week sprint is like using a calendar to plan your lunch break.
STORY POINTS
What the industry said: "Story points help teams estimate complexity and improve delivery predictability."
What the industry meant: "Executives keep asking ‘when will it be done?’ and teams have no idea because software is unpredictable. Story points let us say numbers that sound scientific while we’re actually just guessing."
The real problem story points solved: Executives needed dates. Developers couldn’t give dates. Story points created a translation layer between "I don’t know" and "Q3 2024."
Why it made sense: When you couldn’t easily try something, you had to estimate how long it would take.
Why we need to reconsider: When you can build it three different ways in an afternoon, estimation is slower than experimentation.
CODE REVIEWS
What the industry said: "Code reviews improve quality, share knowledge, and catch bugs early."
What the industry meant: "If this bug makes it to production, it’ll sit there for 2 weeks until the next deploy window, cost us customers, and someone might get fired. We need a gate."
The real problem code reviews solved: Deployment was risky, slow, and infrequent. Bugs were expensive because they persisted for weeks. Reviews were insurance against costly mistakes.
Why it made sense: When deploying twice a month, you couldn’t afford mistakes.
Why we need to reconsider: When you can deploy 20 times a day and rollback in 30 seconds, is pre-deployment review still the highest-leverage quality mechanism? Or is rapid iteration + monitoring better?
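Concretely, the "rollback in 30 seconds" story usually rests on feature flags rather than redeploys: the risky code path ships dark, gets a small percentage of traffic, and is switched off by a config change if metrics regress. A minimal sketch of the idea (the flag names, rollout logic, and checkout function here are hypothetical, not any particular library's API):

```python
# Hypothetical flag store: in practice this would live in a config
# service so flipping "enabled" requires no deploy at all.
FLAGS = {"new_checkout_flow": {"enabled": True, "rollout_pct": 10}}

def is_enabled(flag_name: str, user_id: int) -> bool:
    """Gate a code path behind a flag; setting 'enabled' to False
    is an instant, deploy-free rollback."""
    flag = FLAGS.get(flag_name)
    if not flag or not flag["enabled"]:
        return False
    # Deterministic per-user bucketing: the same user always lands in
    # the same bucket, so a 10% rollout stays stable across requests.
    return (user_id % 100) < flag["rollout_pct"]

def checkout(user_id: int) -> str:
    # Illustrative call site: the new path is monitored in production,
    # and "review" happens partly after the fact, via metrics.
    if is_enabled("new_checkout_flow", user_id):
        return "new flow"
    return "old flow"
```

Under this model, the expensive pre-merge gate competes with a cheap post-deploy one: turn the flag on for 10% of users, watch the dashboards, and kill it by config if something regresses.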
DAILY STANDUPS
What the industry said: "Standups keep the team aligned, surface blockers, and maintain communication."
What the industry meant: "Work moves so slowly that someone can be blocked for 24+ hours without anyone noticing. We need a daily ritual to catch this."
The real problem standups solved: When work took days or weeks, a blocker that lasted 24 hours was a productivity killer. Daily check-ins prevented day-long waits.
Why it made sense: When your blocker was "waiting for another team’s API" and that could take 3 days, daily standups mattered.
Why we need to reconsider: When you can implement the API yourself in 2 hours, or use AI to mock it, or just try a different approach entirely, what exactly are we standing up about?
DETAILED SPECIFICATIONS
What the industry said: "Specs ensure alignment, prevent rework, and document decisions."
What the industry meant: "Building the wrong thing costs us 2 months of engineering time. We need to be REALLY sure before we start coding."
The real problem specs solved: The cost of building something was so high that you couldn’t afford to build the wrong thing. Specs were risk mitigation.
Why it made sense: When building a feature took a month, spending a week on specs was 20% overhead. Reasonable.
Why we need to reconsider: When building five different prototypes takes a day, writing a spec first is like creating an architectural blueprint before rearranging your furniture.
RETROSPECTIVES
What the industry said: "Retros drive continuous improvement and team health."
What the industry meant: "Projects take so long that by the time we finish, everyone forgot what went wrong. We need a formal ceremony to remember."
The real problem retros solved: Feedback loops were 6-12 months long. You needed explicit reflection because natural learning was too slow.
Why it made sense: When you shipped major releases quarterly, quarterly retrospection made sense.
Why we need to reconsider: When you’re shipping and learning daily, having a bi-weekly ceremony to discuss "what we learned" is like scheduling a meeting to remember what you had for breakfast.
The Uncomfortable Truth
Here’s what the industry doesn’t like to admit:
None of these processes were invented to make software better.
They were invented to make slow software development manageable.
Think about it:
If software was built instantly, would you need sprints? No—there’d be nothing to chunk.
If you could try five approaches in an hour, would you estimate with story points? No—you’d just try them.
If you could deploy 100 times a day with instant rollback, would you need extensive code review? Maybe, maybe not.
If work took hours instead of days, would you need daily standups? What would you even say?
If building something took less time than speccing it, would you write specs first? Of course not.
The processes exist because software was slow.
The industry just convinced itself they existed because we were disciplined.
The Rationalization
But here’s where it gets interesting psychologically:
The industry couldn’t just say "we do sprints because work takes forever."
That sounds... defeated. Pessimistic. Like we’re just managing our limitations.
So the industry reframed time management as quality management:
"Sprints create focus" (not: "sprints make long timelines bearable")
"Code reviews improve quality" (not: "code reviews prevent expensive mistakes when deploys are rare")
"Standups improve communication" (not: "standups catch blockers before they waste entire days")
"Planning prevents waste" (not: "planning prevents us from spending months building the wrong thing")
The industry took tools invented for slowness and rebranded them as tools for excellence.
And then we started measuring ourselves against the tools instead of the outcomes:
"We have good velocity" (we do sprints properly)
"We have high code coverage" (we do testing properly)
"We have low defect rates" (we do reviews properly)
The industry optimized for process compliance and called it engineering discipline.
Why This Matters Now
Because something changed.
Software isn’t slow anymore.
Features that took weeks now take hours. Prototypes that took months now take days. Entire products that took quarters now take weeks.
And all these processes—designed for slowness—are still running.
Like a furniture factory built around hand-crafting that just got industrial automation, but is still measuring "items completed per artisan per week."
The metrics are obsolete. The processes are obsolete. The assumptions are obsolete.
But teams are still running standups. Still pointing stories. Still planning sprints.
Not because they make us faster. Not because they improve quality.
But because admitting they’re obsolete means admitting everything we learned about "proper software development" just expired.
And that’s uncomfortable.
The Way Forward
This isn’t a call to abandon all process and embrace chaos.
It’s a call to re-examine our assumptions.
To ask honestly:
Which processes solve problems that still exist?
Which are solving problems that disappeared?
What new problems emerged that our old processes don’t address?
The tools weren’t wrong when we invented them. The context was different.
But context changed. Constraints changed. Speed changed.
And we owe it to ourselves to step back and ask: What does "good process" look like when execution is no longer the bottleneck?
Closing
The software industry has been running on processes designed for a world that no longer exists.
These processes weren’t created for quality, for collaboration, or for excellence.
They were created because software was slow, and we needed to manage that slowness.
And now that software isn’t slow anymore, it’s time to admit it:
Most of what we call "software engineering best practices" was just time management theater.
The question isn’t whether these practices were valuable—they were, for their time.
The question is: Are they still valuable now?
And that’s a question worth asking honestly.