A lot of people, I think, are very uncomfortable with the fact that they are essentially CRUD monkeys. They just make systems that create, read, update, or delete rows in a database and they have to compensate for that existential dread by over complicating things.
How it started
I have had many iterations of my personal blog, but they can be split into three distinct phases: database with problems, no database, and, finally, database acceptance.
When I started building web applications, I thought the MERN stack (MongoDB + ExpressJS + React + NodeJS) was the final form of web dev. The first phase of my blog used a JavaScript framework plus MongoDB. This was back in 2022 when JavaScript frameworks were at the peak of their powers.
This worked well, for a while. Deployment was quick and easy; all it took was `git push`, and I had what I thought was some pretty sweet DX (developer experience). At least compared to the vanilla PHP monstrosity I was dealing with at work.
Then the timeouts started.
At the time, I had about a dozen monthly visitors - it wasn’t exactly Hackernews. And Atlas (Mongo’s cloud DB service) has (or had?) this charming feature where low-traffic databases would get “idled” to save resources. The first request after an idle period could take up to 30 seconds to wake up the cluster, assuming it didn’t time out completely.
I had friends sending me screenshots of the Vercel error page after waiting 30 seconds for a blog post to load. It was bad. I aimed my ire - perhaps unjustly - at Vercel.
“I’ll just self host,” I thought. I spun up an always-free VPS from Oracle with 1GB of RAM, set up Docker, and deployed my NextJS app.
That’s when I discovered the RAM vortex that is Node.js.
My beautiful blog was serving (optimistically) 5 pages a day, and sitting there consuming 400MB of memory at idle. MongoDB added another cool 200MB. On my modest VPS, I was using most of my available resources to serve a blog that could fit in a single HTML file. I would later realise that I was loading a ton of JavaScript on the client too, which wasn’t helping my case.
My blog was the architectural equivalent of a meeting that could have been an email.
That’s when it dawned on me: the problem wasn’t the hosting, or Node, or even MongoDB specifically. The problem was that I didn’t need a database at all.
After all, I had all my blog posts in Obsidian as markdown files. They mostly never changed. They were just... files. Static files.
Go and the 50MB dream
Fast-forward a couple of months, and I had a “full-featured” (foreshadowing) blog, built with Go. At rest, I was using a cool 47MB of RAM. If I could have downgraded my server to 256MB, I would have.
In my opinion, it was an elegant solution: write posts in Markdown with some YAML frontmatter for metadata, parse them into HTML (using goldmark), render them as Go HTML templates (and, later, with Templ), and serve them with Echo. No database. No complex migrations. No ORM headaches. Just files, folders, and pure Go simplicity.
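For the curious, the frontmatter half of that pipeline boils down to something like this stdlib-only sketch. The real build leaned on a proper YAML parser plus goldmark; `parseFrontmatter` here is a hypothetical stand-in that only handles flat `key: value` pairs:

```go
package main

import "strings"

// parseFrontmatter splits a post file into its YAML frontmatter and
// markdown body. A simplified, stdlib-only sketch: it assumes the file
// starts with a "---" fence and that every metadata line is a flat
// "key: value" pair (no nesting, no lists).
func parseFrontmatter(raw string) (map[string]string, string) {
	meta := map[string]string{}
	parts := strings.SplitN(raw, "---", 3)
	if len(parts) < 3 {
		// no frontmatter block found; treat the whole file as body
		return meta, raw
	}
	for _, line := range strings.Split(strings.TrimSpace(parts[1]), "\n") {
		if key, value, ok := strings.Cut(line, ":"); ok {
			meta[strings.TrimSpace(key)] = strings.TrimSpace(value)
		}
	}
	return meta, strings.TrimSpace(parts[2])
}
```

From there, the body goes through the markdown renderer and the metadata feeds the templates.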
I was proud of this build. It was lean and beautiful, but most importantly, it was all mine. It showed that I had learned from my mistakes.
Little did I know that I was setting myself up for an important lesson, because those smart guys who came up with databases... they kind of knew what they were doing.
The bubble bursts
The problem with dreams is that you eventually have to wake up.
The simplicity of it all had made me much more productive. I actually enjoyed working on new content again. I passed thirty blog posts, and that was when it dawned on me - eventually, someone would need to search for something.
“It can’t be that hard, right?”
My first attempt was a bit naive, but it worked. Kinda.
```go
// simplified version
func searchPosts(query string) []Post {
	var results []Post
	for _, post := range allPosts {
		if strings.Contains(strings.ToLower(post.Content), strings.ToLower(query)) {
			results = append(results, post)
		}
	}
	return results
}
```
With about 30 posts to parse and some hacks to handle case-sensitivity, it kind of worked. I kept telling myself that these headaches were a small price to pay for simplicity. Minimal was the name of the game.
The files fight back
Search wasn’t my only problem. Other developers’ blogs had “related posts” at the bottom of each article. Their readers could view posts by tags. I also wanted those nice, quality-of-life features for my blog. Heck, I wanted to write multi-part blog posts and have them be part of a “series”!
So, I started with tags. I went through all thirty-something blog posts, adding tags one at a time.
```yaml
---
title: "Building APIs in Go"
tags: ["go", "api", "backend"]
date: 2023-03-15
---
```
I would have to parse those tags. Then build an index. Then create tag pages. Then make sure the search worked with tags. Then handle plurals and synonyms. I’m sure you see where this is going.
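That index step alone amounts to hand-rolling an inverted index - the kind of thing a database gives you for free. A rough sketch of what I was signing up for (`buildTagIndex` is illustrative, not my actual code):

```go
package main

import "strings"

type Post struct {
	Slug string
	Tags []string
}

// buildTagIndex maps each (lower-cased) tag to the slugs of the posts
// that carry it, so tag pages can be rendered without rescanning every
// post on each request. Lower-casing folds "Go" and "go" together, but
// plurals and synonyms would still need their own handling.
func buildTagIndex(posts []Post) map[string][]string {
	index := map[string][]string{}
	for _, p := range posts {
		for _, tag := range p.Tags {
			key := strings.ToLower(tag)
			index[key] = append(index[key], p.Slug)
		}
	}
	return index
}
```

And that still leaves rebuilding the index every time a post changes.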
I had basically dug myself into a hole. I was reimplementing database functionality, one new, tiny method at a time.
At that point, I started to feel the heat. I rage-quit for a while and stopped adding features to my blog. Without the desire to add features, I lost the desire to write anything, too. What was the point of a blog where people couldn’t find anything?
The breaking point came when I tried to show “related posts” at the bottom of each article. It felt like a simple problem to solve. With a database, this would be a JOIN query, maybe using tags or something for similarity matching. In my file-based system, I had to loop through all my posts, match the tags, compare and find similar words, assign scores to each other post, and finally return the top n related posts:
```go
// simplified - assumes allPosts is already loaded in memory
type match struct {
	post  Post
	score int
}

func findRelatedPosts(post Post, limit int) []Post {
	var matches []match
	for _, otherPost := range allPosts {
		if otherPost.Slug == post.Slug {
			continue
		}
		score := 0
		for _, tag := range post.Tags {
			for _, otherTag := range otherPost.Tags {
				if strings.EqualFold(tag, otherTag) {
					score += 10
				}
			}
		}
		// ...compare the words and post content in lower case
		if score > 0 {
			matches = append(matches, match{otherPost, score})
		}
	}
	// sort by score, highest first, and return at most "limit" posts
	sort.Slice(matches, func(i, j int) bool {
		return matches[i].score > matches[j].score
	})
	if len(matches) > limit {
		matches = matches[:limit]
	}
	results := make([]Post, 0, len(matches))
	for _, m := range matches {
		results = append(results, m.post)
	}
	return results
}
```
Even after writing the code, I knew this was impractical. What would happen when I reached 50 posts? 100? 200? At some point, this would require loading hundreds of posts into memory. I was just kicking the can down the road, but eventually I’d have to pay the piper.
The Ghost detour
At this point, I started to doubt myself. I just wanted to write, and dealing with all this had drastically reduced my output. So, I decided to outsource.
I switched to self-hosted Ghost. It was simple to set up - all I needed was a single Docker Compose file. I even threw in Plausible analytics with its Postgres DB for good measure.
The setup process was smooth. The admin interface was polished and clean. But performance... well, that was concerning.
At idle, I was constantly over 800MB of memory utilisation (almost 20x what my Go blog demanded) and dangerously close to maxing out my tiny VPS. I’d wake up every morning and try to `ssh` into my server to make sure it hadn’t run out of RAM and seized up. And when it had, I’d have to go to the Oracle Cloud console and force a reboot.
Then, I tried adding paid subscriptions. Ghost only supports Stripe out of the box, which isn’t available in my country. I was suddenly faced with the prospect of having to implement a custom payment solution. I was back to having to build everything myself, but now with the overhead of Ghost’s architecture.
I was back to square one: wrestling with custom solutions on top of a foundation that wasn’t quite right for my needs. Except now, I was doing it while consuming 15x more memory.
The Laravel revelation
So, last weekend, after about 14 months with my Go blog, I finally admitted what I had known for a while: I needed a real database, and I needed a framework that would let me build exactly what I wanted with little resistance.
Fortunately, I had been writing Laravel code daily for the better part of two years. I knew its ecosystem. I knew I could build subscription systems, payment processing (even without Laravel Cashier, which also only supports payment providers not available in my country), user management, and content management without reinventing the wheel. I already had the database schema in the form of my posts’ frontmatter.
The migration was both painless and liberating. I built a simple action in Filament to import my old posts - I could upload a file and have it parse the YAML frontmatter and markdown content and create the posts. I was already using tailwindcss on my Go blog, so it was easy to translate everything to Blade templates.
Within a few hours, I had a working prototype.
The first time I ran a search query - a real SQL search with proper ranking and highlighting - I actually laughed out loud.
```php
// simplified
$posts = Post::search($query)
    ->where('published', true)
    ->orderBy('updated_at', 'desc')
    ->paginate(10);
```
One beautiful line that would have taken me days to implement properly in my file-based system.
Auth came built-in with the Livewire starter kit. I got my RSS feed working again. Newsletters and subscriptions are still a work in progress, but coming soon. Things that would have taken me weeks in my Go setup only took me hours in Laravel.
Sure, my memory usage went up. But the difference between 50MB and 120MB at idle is academic. It was a small price to pay for a good night’s sleep.
Lessons about tool selection
I write this because I learned some very important lessons from all this. For starters, my Go setup wasn’t wrong. It was perfect for my needs at the time: a minimal, static blog with no dynamic features. It only became a problem when I wanted to grow beyond that scope. I had to let go of my attachment to the old architecture.
I also realised that I had misdiagnosed the original problem entirely. The issue with my original NextJS + MongoDB setup wasn’t that I was using a database, but that I was using a cloud database (I’ll spare you the details of my brief experiment with Supabase’s hosted Postgres) and a runtime (Node.js) that was overkill for my needs.
I fell into the trap of thinking the solution to my problem was ruthless simplicity and minimalism. With all the naivety of someone who thought they could reinvent the wheel, I told myself that databases were a problem. With the benefit of hindsight, I understand now that it was more of an architectural issue, than a tooling one.
Most importantly, I realised that my Go blog, beautiful and elegant as it was, was a dead-end. Every feature I added to make it more functional moved it further away from the elegant simplicity with which I’d started. Meanwhile, Laravel + MySQL on the same VPS gave me all the database benefits I had thrown away, without the cloud hosting gotchas or massive runtime overhead.
I just needed to find the right implementation that suited my needs without throwing the baby out with the bath water.
Conclusion
My biggest lesson in all this was that there’s no shame in outgrowing your tools. On one hand, I acquired a deep appreciation for minimalism and efficiency. On the other hand, I realised that sometimes the most direct solution is the best solution. The right choice isn’t static. It evolves as your needs evolve, as your understanding deepens, and as your priorities shift.
I crave novelty and excitement as much as the next developer, but my new priority is productivity. Being able to ship features in hours instead of weeks is so underrated.
If you’re going to take away one thing from my experience, it’s that sometimes the right tool may not be the most efficient, minimal, or innovative (yes, I almost rebuilt this in Rust, too). Sometimes the right tool is simply the one that lets you build what you actually want to build with joy instead of frustration.
Most importantly, I’m enjoying the work again.