Before I had a home server or a home NAS, I used external hard drives for backup storage while keeping the most essential, need-to-have-access files in the cloud. Then I moved to mostly cloud storage because it was the easiest way to sync across the family’s devices that run multiple operating systems. I still run that for simplicity of sync, but my thoughts about cloud storage have changed, and now I run my own private cloud from the safety of my own hardware.
Now, I’d be the first person to say that this isn’t for everyone. Even the most technical user might prefer to have a life of simplicity and not having to be tech support while at home, and offload all those services and storage needs to a cloud provider. And that’s perfectly okay, but it’s not how I wanted to operate. With a combination of hardware, the right operating systems, and some local AI, I’ve got a system that works for me and my family, whether they know it’s being used or not.
The cloud has a problem
It doesn’t work when my Internet goes out
My biggest issue with the cloud has always been that it doesn’t work when my Internet goes out. That’s less annoying when it’s backups of local storage, but when my smart home doesn’t work, my voice assistants won’t reply, and other services I consider essential are unreachable, it makes me angry.
Recently, my data storage needs have increased, and the 2TB limit on most personal cloud storage services isn’t enough. It’s fine for the essentials, but it doesn’t have anywhere near enough capacity for my data-hoarding needs. Enterprise storage plans are considerably more expensive, mainly because they include data retention and recovery tools that keep weeks of backups for you.
I needed a solution, and a home cloud was the answer. Along the way, I’ve learned to run containerized workloads with a variety of orchestrators, written more YAML than I ever wanted to, handled even more JSON than I ever wanted to experience, picked up some Python (which I actually enjoyed, much to my surprise), and absorbed a whole lot about networking, security, operating systems, and more.
My smart home deserves better
I absolutely love smart home devices, but I don’t love losing control of them when the cloud conveniently disconnects. To fix that, I’ve brought every smart device I could into Home Assistant, so my smart home keeps working even when the Internet doesn’t.
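If you want to see how low the barrier to entry is, Home Assistant itself runs happily in a container. Here’s a minimal sketch using the official container image; the config path is an assumption you’d adapt to your own layout:

```yaml
# docker-compose.yml -- minimal Home Assistant (./ha-config is an assumed path)
services:
  homeassistant:
    image: ghcr.io/home-assistant/home-assistant:stable
    network_mode: host        # host networking so HA can discover devices on the LAN
    privileged: true          # needed for some USB and hardware integrations
    volumes:
      - ./ha-config:/config   # persists your configuration between restarts
      - /etc/localtime:/etc/localtime:ro
    restart: unless-stopped
```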
That handles device control, but I also love voice control, and I’ve got that working too by integrating a local LLM with Home Assistant. Add Matterbridge to the mix and I can expose Home Assistant devices to Alexa, so with local voice control enabled, I can speak to the smart speakers I already own. The LLM lets me use natural commands to control HA entities, and it’s fantastic, but it does need to run on a reasonably fast machine, otherwise responses can be slow.
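The “local LLM” part is less exotic than it sounds: it’s just a model server on the network. As a rough sketch, assuming Ollama as the backend, a Compose file like this exposes an API that Home Assistant’s conversation integration can point at:

```yaml
# docker-compose.yml -- a minimal local LLM backend, assuming Ollama
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"               # Ollama's default API port
    volumes:
      - ollama-models:/root/.ollama # keeps downloaded models across restarts
    restart: unless-stopped
volumes:
  ollama-models:
```

From there, pulling a model (for example, `docker exec -it ollama ollama pull llama3`; the model name is just an example) gives the integration something to talk to on port 11434.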
Self-hosting is (one) answer
And it’s never been as easy to pull off
My home cloud is a sprawling, complex entity these days, but it wasn’t always like this. I started small, with a media server running on my gaming PC. The latest addition is a Proxmox server with plenty of power, which will handle most of the active storage. The NAS it’s replacing will be used for backups, so I’m happy to back up my private cloud locally before sending another copy to the actual cloud.
Self-hosting does take some effort, but it’s more approachable than ever before, and much as I hate to say it, Docker is probably the best way to start. Don’t worry about replacing cloud services slowly, one at a time; every self-hosted service takes time to get right, and if you’re trying to troubleshoot multiple things at once, you’re more likely to give up.
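To make that concrete, here’s roughly what a first service looks like as a Compose file. I started with a media server, so a sketch along those lines; the image is the linuxserver.io Plex container, and the paths are assumptions you’d swap for your own:

```yaml
# docker-compose.yml -- an example first self-hosted service (Plex)
services:
  plex:
    image: lscr.io/linuxserver/plex:latest
    network_mode: host        # simplest option for local discovery
    environment:
      - PUID=1000             # run as your user so file permissions behave
      - PGID=1000
      - VERSION=docker
    volumes:
      - ./plex-config:/config # assumed path for Plex's database and settings
      - /mnt/media:/media     # assumed path to your media library
    restart: unless-stopped
```

One service, one file, and `docker compose up -d` later you have something to build on.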
Even security and remote access are trivial to pull off
Is the cloud really the cloud if it can’t be accessed from anywhere? That’s the question for home labbers everywhere, and answering it requires a remote access tool of your choice, often paired with a reverse proxy and a DNS solution. This does several things, but the most important, when you’re trying to reach your NAS from anywhere in the world, is that no ports need to be open to the Internet, so only your approved clients can get through your firewall.
My current favorite is NetBird, which is easy to set up, easy to connect to, self-hostable if you want, and makes it simple to share your services with friends and family. All data is encrypted in transit, and there are no bandwidth restrictions when you’re self-hosting, which is vital if you want to stream your media server through it. But whichever solution you pick, it has never been easier to set up a secure remote access solution that turns your home NAS into a private cloud.
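Joining a machine to a NetBird network can itself be containerized. This sketch is based on NetBird’s client container; the setup key is something you generate in the NetBird dashboard, and the volume name is my own choice:

```yaml
# docker-compose.yml -- sketch of a NetBird client on a host you want reachable
services:
  netbird:
    image: netbirdio/netbird:latest
    cap_add:
      - NET_ADMIN                                  # needed to create the WireGuard interface
    environment:
      - NB_SETUP_KEY=replace-with-your-setup-key   # generated in the NetBird dashboard
    volumes:
      - netbird-client:/etc/netbird                # persists the peer identity
    restart: unless-stopped
volumes:
  netbird-client:
```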
But now uptime is your problem
When the cloud goes out, a swarm of engineers gets released from wherever they’re stored when not in use and restores services for everyone. Okay, only part of that is true, but the point is that once you self-host your own cloud, you’re the one responsible for uptime. That’s less of a problem when you’re the only one using your services, but it becomes more of one as the number of family and friends relying on them grows.
I’ve got Uptime Kuma monitoring my services, sending notifications when things drop offline. It doesn’t just keep an eye on my NAS, server, services, and virtual machines, which is pretty much the bare minimum for monitoring; it also tracks SSL certificate expiry dates and monitors response times, both of which are important for optimizing a home cloud. It offers a wide range of notification channels, and since I always have Discord open, those messages go to a dedicated channel on a private server I run.
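Uptime Kuma is configured entirely through its web UI, so getting it running is the only real setup. A minimal sketch using the official image:

```yaml
# docker-compose.yml -- Uptime Kuma monitoring, official image
services:
  uptime-kuma:
    image: louislam/uptime-kuma:1
    ports:
      - "3001:3001"            # web UI
    volumes:
      - uptime-kuma:/app/data  # monitors, history, and notification settings
    restart: unless-stopped
volumes:
  uptime-kuma:
```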
Recently I started adding automation to the stack, with self-hosted n8n connected to a local LLM and to the Proxmox server that runs all my virtual machines and containers. The workflow watches for Uptime Kuma’s monitoring messages, reads them with the LLM, and decides what to do to fix the issue. It’s incredibly powerful and I’ve only scratched the surface of what it could do, but the next stage is to turn it into an interactive AI assistant that manages my home cloud services.
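The n8n piece is just another container sitting on the same Docker network as the LLM. A rough sketch of wiring the two into one stack; the service names and shared network are my assumptions, and the actual workflows live in n8n’s UI:

```yaml
# docker-compose.yml -- sketch: n8n alongside a local LLM it can reach by service name
services:
  n8n:
    image: n8nio/n8n:latest
    ports:
      - "5678:5678"                  # n8n's web UI
    volumes:
      - n8n-data:/home/node/.n8n     # persists credentials and workflows
    restart: unless-stopped
  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama-models:/root/.ollama
    restart: unless-stopped          # workflows can reach it at http://ollama:11434
volumes:
  n8n-data:
  ollama-models:
```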
And my whole services stack is connected to a huge backup battery because if the power blips, I don’t want to lose any precious data. The next addition will be a 5G router for failover when the fiber connection drops, because uptime is essential for the cloud and I should embrace that in my home cloud as well.
(Image gallery: Home Assistant, TrueNAS SCALE, NetBird, Syncthing, and Ollama)
I’m now running my own cloud but not for everything
I have learned so much since the early days of self-hosting Plex on my computer, through several successive NAS enclosures, to the setup I’ve got now: a server, several NAS enclosures, and a VPS. The VPS is essential for my setup, as that’s where my remote access tool sits, so I suppose I’m running a hybrid cloud, since some of the hardware isn’t under my control. I could have used Tailscale or another remote access option that doesn’t require a self-hosted server, but this approach gets me past the CGNAT situation my ISP put me in.
I don’t know if you should build your own home cloud, partly because it’s a constant stream of maintenance and tinkering after the fact, but it has been gratifying for me. And really, that is sometimes all you need as a motivation.