What I Learned After Losing Code to AI (Twice)

observer.bearblog.dev · 23 Jan, 2026

The first time Cursor nuked my code, I blamed myself. I should’ve committed more, read the diffs more closely, been clearer in my prompt.

The second time, I realized my workflow was broken.

What Happened the First Time

I’d been refactoring for about three hours. Cursor was being genuinely helpful, cleaning up old patterns, modernizing some syntax, breaking apart this massive god-object I’d been meaning to fix for months.

Then I typed "remove the deprecated utility functions."

It removed them, along with every file that imported them, and every file that imported those. Eighteen files in total.

I hadn’t committed in over an hour. Cursor’s checkpoints only went back to my last Composer message, which was after the damage was done. Local History saved some of it, but figuring out which version of each file was the right one took forever.

Two hours of reconstruction later, I promised myself I’d commit more often.

Three Weeks Later

I was being good about it. Committing every fifteen or twenty minutes. Feeling responsible.

Then I got into one of those sessions where Claude Code was absolutely flying, building out a whole feature faster than I could keep up. I was reviewing, accepting, steering, completely locked in on what the AI was doing.

Forty minutes disappeared.

Somewhere in there, Claude decided my config files were "inconsistent" and helpfully "fixed" them. Everything broke. My last commit was from before I’d even started on the feature.

Another hour of work gone.

The Actual Problem

Both times I knew I should be committing. I knew the risks. I’d even read this piece on why AI agents delete files and understood the technical reasons behind it.

But discipline doesn’t survive flow state. When you’re moving fast with AI, your attention is on the AI and version control isn’t even on your radar. And those gaps between commits are exactly when things tend to go wrong.

What I Did About It

I stopped pretending I’d remember to commit and set up something automatic instead.

There’s a tool called mrq that runs in the background and captures every file change on its own:

npm install -g mrq-cli
mrq watch

Now when AI breaks something:

mrq restore abc123

I can get back to any point from the session, not just my last commit. Their blog explains how it works if you want the technical details.
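If you’d rather roll something similar yourself, the same safety net can be sketched with plain git. This is my own approximation, not how mrq actually works: git stash create records your dirty working tree as a commit without touching your files, index, or branch history, and a tag keeps each snapshot findable (take_snapshot is a name I made up for this sketch):

```shell
# take_snapshot: record the current working tree as a tagged commit,
# without touching the index, working files, or branch history.
take_snapshot() {
  local snap
  # `git stash create` returns a commit hash if the tree is dirty,
  # or an empty string if there's nothing to snapshot.
  snap=$(git stash create "auto-snapshot")
  [ -n "$snap" ] && git tag -f "snap-$(date +%s)" "$snap"
}

# Restore a single file (or everything with `-- .`) from a snapshot:
#   git checkout snap-<timestamp> -- path/to/file
```

Run it on a timer or hook it to a file watcher (e.g. `watch -n 60` or fswatch) and you get commit-free checkpoints between real commits, which is the gap both of my incidents fell into.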

What I Learned

The lesson wasn’t really about committing more often; I already knew that. It was that any protection that depends on you remembering will fail exactly when you need it most.

Working with AI creates this different rhythm that’s fast and exploratory and kind of chaotic. Traditional commits have gaps, and AI coding finds every single one of them.
