Featuring
Sponsors
Depot – 10x faster builds? Yes please. Build faster. Waste less time. Accelerate Docker image builds, and GitHub Actions workflows. Easily integrate with your existing CI provider and dev workflows to save hours of build time.
Tiger Data – Postgres for Developers, devices, and agents The data platform trusted by hundreds of thousands from IoT to Web3 to AI and more.
Notion – Notion is a place where any team can write, plan, organize, and rediscover the joy of play. It’s a workspace designed not just for making progress, but getting inspired. Notion is for everyone — whether you’re a Fortune 500 company or freelance designer, starting a new startup or a student juggling classes and clubs.
Fly.io – The home of Changelog.com — Deploy your apps close to your users — global Anycast load-balancing, zero-configuration private networking, hardware isolation, and instant WireGuard VPN connections. Push-button deployments that scale to thousands of instances. Check out the speedrun to get started in minutes.
Notes & Links
People
- Techno Tim - Tim Stewart’s website, YouTube channel, and documentation hub
- Crosstalk Solutions - Chris’s channel, mentioned for building custom Ubiquiti API tools
Virtualization & Infrastructure
- Proxmox VE - Open-source virtualization platform for VMs and containers
- TrueNAS - Enterprise-grade open-source storage operating system built on ZFS
- HexOS - Consumer-friendly NAS OS built on TrueNAS (in development)
- Proxmox VE Helper Scripts - Community-maintained scripts for easy LXC and VM deployment
Self-Hosted Software
- Paperless-NGX - Self-hosted document management system with OCR
- Paperless-GPT - AI-powered enhancement for Paperless-NGX using LLMs
- Ollama - Run large language models locally on your own hardware
- Open WebUI - Self-hosted web interface for interacting with local LLMs
- Plex - Media server for organizing and streaming your personal media library
- Home Assistant - Open-source home automation platform
- Pi-hole - Network-wide ad blocking via DNS filtering
Document Intelligence & RAG
- Docling - IBM’s open-source document parsing library for AI/RAG pipelines
- PaddleOCR - Multi-language OCR toolkit for document recognition
AI & Agents
- Claude - Anthropic’s AI assistant, used for homelab automation in this episode
- OpenCode - Open-source AI coding agent (mentioned as potential homelab tool)
- Model Context Protocol (MCP) - Protocol for connecting AI models to external tools and data
Networking
- Ubiquiti - Enterprise networking gear popular with homelabbers (UDM Pro, UniFi)
- Tailscale - Zero-config VPN for secure networking between devices
Container & Orchestration
- Docker - Container platform for packaging and running applications
- Kubernetes - Container orchestration for managing containerized workloads
- Fly.io - Platform for running containers close to users globally
Monitoring & Observability
- Grafana - Open-source analytics and visualization platform
- Prometheus - Open-source monitoring and alerting toolkit
Security & Authentication
- Bitwarden - Open-source password manager (self-hostable)
- Authelia - Open-source authentication and authorization server
Databases
- MariaDB - Community-developed fork of MySQL
- Redis - In-memory data store for caching and messaging
- PostgreSQL - Advanced open-source relational database
Hardware Mentioned
- Intel Optane - Ultra-low latency storage drives (discontinued but prized for ZFS special vdevs)
- NVIDIA GeForce RTX 3090 - GPU used for Plex transcoding and local AI inference
Concepts & Techniques
- ZFS Special Vdevs - ZFS feature for accelerating metadata and small file operations
- PCIe Bifurcation - Splitting a PCIe slot to support multiple NVMe drives
- Medallion Architecture - Bronze/Silver/Gold data lake pattern discussed for document ETL
Chapters
| # | Start | Chapter | Duration |
| 1 | 00:00 | Let’s talk! | 00:42 |
| 2 | 00:42 | Sponsor: Depot | 02:21 |
| 3 | 03:04 | New year. Same Tim. | 02:02 |
| 4 | 05:06 | 2026’s changes for Homelab | 06:56 |
| 5 | 12:02 | The year of self hosted software | 03:29 |
| 6 | 15:31 | Down with AI assistance | 02:38 |
| 7 | 18:09 | Does Paperless handle books? | 02:26 |
| 8 | 20:35 | Markdown all the things | 03:22 |
| 9 | 23:57 | ETL pipelines | 05:34 |
| 10 | 29:31 | Son’s friend exposed guest VLAN issue | 03:40 |
| 11 | 33:12 | Proxmox CLI (pxm) coming soon!! | 03:42 |
| 12 | 36:53 | Sponsor: Tiger Data | 02:34 |
| 13 | 39:27 | Agent Ralph Wiggum in the homelab | 04:42 |
| 14 | 44:09 | Navigating Tim’s homelab world | 09:16 |
| 15 | 53:25 | What’s on your Kubernetes? | 07:41 |
| 16 | 1:01:06 | VDEV config!!! | 08:52 |
| 17 | 1:09:58 | Homelab in 2026 | 14:35 |
| 18 | 1:24:33 | Sponsor: Notion | 02:08 |
| 19 | 1:26:41 | Fly.io but in my homelab | 11:40 |
| 20 | 1:38:21 | MCP for Proxmox | 03:43 |
| 21 | 1:42:04 | proxmox helper scripts | 06:03 |
| 22 | 1:48:07 | What’s on Tim’s list? | 05:58 |
| 23 | 1:54:05 | 3 months later with Tim??! | 03:02 |
| 24 | 1:57:06 | Search Techno Tim OR technotim.com | 01:40 |
| 25 | 1:58:46 | Used to be on Mercer | 01:16 |
| 26 | 2:00:02 | Wrapping up | 00:48 |
| 27 | 2:00:50 | Closing thoughts and stuff | 01:58 |
Transcript
Changelog
Play the audio to listen along while you enjoy the transcript. 🎧
[00:00] Well, friends, we’re back. It is a new year. Not a new Tim, same Tim. Get your hat on Tim. I heard you got some strife on the internet recently.
Yeah.
You took your hat off and you started to just have your non-hat Tim going, you know, and a slight uproar. What happened there?
Yeah. People freak out when I don’t have my hat on. In my last video, no glasses, no hat. I broke my glasses and I didn’t have a backup pair and I thought, yeah, I’ll go no hat too. And so like, you know, you got this crazy person on Tim’s channel that kind of looks like him and sounds like him, but it doesn’t really look like him. So yeah, people get really confused. Every now and then I do it on purpose though to try to throw off the algorithm for YouTube because I don’t know. They’re like, maybe we target people who like glasses and now we’ll target people who don’t like glasses, you know. So the backward hat, you know, I’ve done that forever, but I’ve noticed some people are like, take your hat off you’re inside, you know. And so sometimes I switch it up because maybe the algorithm will now target those people that think that. Right. So you never know, you know, those games you play with the algorithm.
Well, of course, man, you got to A/B test. A/B test. Your A/B test, you know.
Yeah, I mean to. Yeah, A/B test. Yeah. And then a C. And then you take, your C is your B and your next. It’s a constant.
Yeah, fighting for ears and eyeballs, you know how it goes. Ears and eyeballs. Well, we love these people out here listening to our podcasts and our content. We just go on this journey because we’re just nerds. We can’t help it. We just have to pursue the inevitable, I suppose. And somehow put ourselves to that pain slash pleasure and share it.
That’s right. And that is what you call what we do.
Man, did you think we would be where we’re at right now last year, Tim?
No, no, not at all. A lot’s changed. Even I feel like for homelab, it’s changed even more. But yeah, a lot has changed.
[02:03] What do you think has changed? I mean, it’s obvious, but I want to hear your own words. What do you think has changed for homelab in particular?
If I could sum everything up in one word, it’d be availability, just availability. And that goes a few ways, you know, availability in parts. It’s very difficult right now to get your hands on server parts. Oh my god. Whether that be used server gear or motherboards, just CPUs. It’s very hard to get your hands on those because, you know, I suspect that most of these companies have contracts with really big companies. And so your onesie-twosies orders, or even your massive orders from stuff like Newegg aren’t as big as say, you know, Microsoft or something like that. It’s really hard to get your hands on server grade hardware.
It also, that also has been true for the secondhand market too when it comes to CPUs, motherboards and everything like that. I mean, homelabbers have always been used to paying the homelab tax. Well, let me take this back. We used to not pay the homelab tax. We used to get the used server gear free, take it away, really cheap, you know, type of hardware. Then people started realizing, hey, there’s homelabbers out there and I can make money off this. All right, it still has some value. I guess I should put it that way. It has a lot more value than it used to because, you know, people are using this stuff in their home for homelabs, which was awesome.
Secondhand market, you know, it was great. And now we’re at a point where not only do we have the homelab tax because people realize that they could start making money off their secondhand gear. Now you can’t even find it. And so that’s one big change I see is just availability of server parts, you know, and the same goes for RAM. As you know, prices are through the roof. If you can even find it, you know, prices are through the roof. You’re paying, I don’t know, double, triple. I’m scared to even look anymore of how much RAM costs. Hard drives too. Hard drives up. I was looking earlier today and I paid $159 a year ago for a 14 terabyte hard drive refurbished. That same one now on eBay, if you could even find it, you know, is almost $100 more.
Really?
Yeah. Anywhere from $70 to $100 more. And, you know, storage is just gone through the roof too. It’s easy to find storage, but you’re paying a lot more than you were before.
What about CPUs? CPUs the same?
It is the same. If you can find them. Yeah. Secondhand CPUs, they’re expensive. I mean, most people aren’t, I mean, let me take this back. If you’re building a server with server grade hardware, a lot of people are buying them used. We can’t afford new ones. Or don’t want to afford, I should say, in some of the cases. But even those secondhand ones are through the roof, because people are still getting a lot more life out of them. Or they can’t buy the ones that they want to upgrade to, you know, the latest whatever EPYC CPU. Most of those are allocated to some big customer. So even the midsize customers can’t upgrade because they can’t get those. So they’re not releasing that gear to then trickle down to the rest of us. So it’s tough for CPUs. CPUs too. Yeah, DDR5, CPUs, hard drives, motherboards. I feel like the only thing that’s really cheap right now are cases and enclosures because no one can build them. No one can build them. So I think like, you know, enclosures are really cheap right now because they’re like, please build something, but we can’t.
So, I saw, who was it? Gamers Nexus was talking about cases recently, about four months back, saying that cases were actually up because of tariffs.
Yeah, yeah.
So then there’s that. Then there’s tariffs on everything, which is, you know, whatever percentage increase across the board for everything. There’s that for sure. But a lot of this has to do with, you know, just the AI race that’s going on and, you know, build, build, build, and DRAM prices are through the roof because there’s a shortage, shortage of hard drives. There’s a shortage of everything because everyone’s building data centers right now. So yeah, there’s definitely tariffs across the board on everything. But this is like beyond that. This is like beyond hard drive prices, RAM prices, GPU prices. I mean, through the roof, GPUs are another one. Like you can’t even get your hands on them. So, you know, if you bought one four years ago, you’re pretty lucky. You know, if you bought a 30-series, I still have my 3090. It’s actually back there as something I’m testing with.
Yeah, right at the beginning of COVID, right at the beginning of COVID. I thought, how am I, I don’t want to pay $1,300 or $1,100 for a GPU. Like I thought, you know, am I really going to pay this price? I’m so glad I did.
Wow.
You know, I’ve had it for four years. You know, I paid retail for it. And so, you know, if you think, you know, the 4090s, 5090s, they’re more expensive than that, too.
So I bought my 3090 last year for $300 less than your retail price.
Wow. Yeah. Yeah. Yeah. Yeah. Yeah.
Because that’s when probably the 5090s were just coming, just about to come out.
That’s right. 4090s were out. Yeah. 5090s were being announced and everyone was dumping them. But now it’s like, you can’t even get your hands on them. But yeah, that’s a good deal. That’s a good deal.
Yeah. It’s a good GPU. I may have some fun things happening on that GPU as we speak right now. Training a RAG system. Just doing some RAG right now.
Cool. Man. I was just looking up that stuff, too. Yeah. Yeah. Yeah. So I thought you would say that. Glad you mentioned the hardware shortage because that’s key. But I thought your availability remark would have been not about unavailability, but about an abundance of availability in terms of capability. You have these newly minted homelabbers that can now tend their lab garden, so to speak.
That is right. That is right. So that’s my second piece as the explosion. So then my second piece to availability, that was the other side of the coin. And I’m not just saying that, it’s in the notes right here, is the explosion of self-hosted software that we can now run at home. Like it’s incredible. And so that’s my prediction for this year. This year is year of the self-hosted software. We can’t get hardware. We’ve got to make do with what we have. And so this is the year for software.
And it’s the year for software for many reasons. First of all, we have way more capabilities at home. Like I’ve been running Ollama at home, Open WebUI, you know, to play with models and do chat. I’ve even done some coding assistance with some agents. That stuff’s fun. Models aren’t as good as the open ones. All right. I’m sorry. Models, the open models aren’t as good as the closed models, the ones you pay for. Right. Obviously, or you wouldn’t be paying for them. But they’re good enough to do a lot of tasks, especially just, you know, like you were mentioning RAG and stuff like that.
Like I’ve been playing with Paperless, Paperless-NGX, Paperless, and Paperless-NGX is a self-hosted document scanning solution. So if you think about, if you think about, you have lots of documents that you want to store, you want to store on your own hardware, you know, they might be private documents, whether they’re I don’t know, financial statements or subpoenas or marriage license. You name it.
Yeah.
Whatever you have that’s private that you keep, just think of what you keep in My Documents. Think about keeping that on your own servers and then being able to scan those documents and then getting metadata and data about those documents. That’s kind of what Paperless does in a nutshell.
Well, now all of these not really sidecar, but these kind of sidecar solutions are starting to pop up where you can feed them to a model and get better data out of those. So right now, Paperless-NGX, it’s super cool. That’s actually my next video. So you’re getting a sneak peek. And that’s why I’m so like, yeah, that’s why I’m so gung-ho about it right now.
Now, Paperless uses traditional OCR, optical character recognition. And for the most part, it’s okay, right? It’s okay. It’s way better, it’s faster than humans. But it is nowhere near the accuracy of a model that’s been trained for vision. And so this is kind of the next evolution in OCR: don’t use optical character recognition; use a model that’s been vision-trained, or multimodal, as they’re saying now, where you can feed it text or feed it an image and you get text out.
And so I’ve been playing with this thing called Paperless-GPT and Paperless-AI, which hooks into Paperless. And now I can scan documents, scan images, and get high-fidelity data out of those images. So for example, I scanned the serial number on one of my devices, you know, took a picture, scanned a serial number. OCR did terrible. It got like “made in Japan”, right? That’s about it. Serial number wrong. Everything was wrong. You feed it to an LLM that has been trained with vision, like a super small one from Ollama, and everything works perfectly. It was even able to figure out that the FCC trademark, their little FCC logo, actually said FCC, even though it was an F with circular C’s inside of it. So it’s really cool. It’s a really cool solution, a self-hosted solution.
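For a sense of what that vision-OCR call looks like in practice, here is a stdlib-only sketch of the request body Ollama expects on its `/api/generate` endpoint. The `llava` model name and the prompt wording are illustrative assumptions, not necessarily what Tim used; Ollama does accept base64-encoded images alongside the prompt for multimodal models.

```python
import base64
import json

# Default Ollama endpoint; adjust if your server runs elsewhere.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_vision_ocr_request(image_bytes: bytes, model: str = "llava") -> str:
    """Build the JSON body for a one-shot vision-OCR call to Ollama.

    Ollama's /api/generate takes a list of base64-encoded images next to
    the prompt when the model is multimodal.
    """
    payload = {
        "model": model,
        "prompt": "Transcribe all text visible in this image, exactly as written.",
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,  # ask for one complete response, not a token stream
    }
    return json.dumps(payload)

# Sending it is a single HTTP POST (left commented so the sketch stands
# alone without a running Ollama server):
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL,
#     data=build_vision_ocr_request(open("scan.jpg", "rb").read()).encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(json.loads(urllib.request.urlopen(req).read())["response"])
```

Paperless-GPT wires up this kind of call for you; the sketch just shows how little plumbing sits between a scanned image and a vision model.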
That’s where I’m thinking that this year is the year for software, not only because people are making all of these awesome solutions to self-host, it’s that people have a lot of assistance now to get those ideas, to make them come to fruition. They have agents to help them. Like I was just talking to a guy the other day who just built the piece of software he always dreamed of, but never had the ability to do it because he’s not a developer. He’s not a developer.
And say what you want about that code and whatever. I’m a developer too.
Yeah. Good.
Because I think that’s, that doesn’t work. This is a problem. That’s good code. This solves my problem that didn’t exist before this moment. That’s good code.
That’s right. And so people who are driven by results or who want, you know, code is just means to an end for a lot of people. You talk to product developers. You talk to people who are in IT, but don’t do code, but have these ideas. Those are the people right now who are creating these awesome solutions. And finally, being able to get those ideas out of their head.
So anyways, long story short, I just talked to a guy who built this whole solution on top of Ubiquiti’s API. It’s actually Chris from Crosstalk Solutions. He built this whole solution on top of Ubiquiti’s API. And it’s like the solution he’s wanted for years, but never took the time to do it or paid a developer to do it.
So anyways, we’re seeing lots of that now. And I’m hopeful. And it’s super exhilarating to see these ideas coming out because when you have people who are generally software developers, I don’t mean to buck people up, but a lot of times there’s, you know, deep focus on some super technical solution, looks perfect, runs perfect, structured in a certain way. But now I’m seeing these solutions by people who think way outside of the box and they’re just trying to solve a problem. And I get to see how they solve that problem. And it’s really cool to see because like, I might not have approached the problem the same way that they did. And so I get to see like a new perspective on coding.
So I don’t know. That’s the other piece of availability. So I think this year is the year for self-hosting software, for open source software, or for solutions in general to be able to run on your homelab, because a lot of people are just going to be okay with the hardware that they’ve had for a couple of years. And people like me who love to self-host stuff are always looking for like the next container app to run on your server. So fill me in on that front there.
I mean, I’ve been scratching little itches. That’s how I’d say it. Question on Paperless is, can it handle books? I suppose this does the OCR and or the vision version of that. Books are cool too because one thing I’m doing is I’m trying to figure out how to get knowledge out of certain books I only own in paper that I can’t even get in digital. Like, there’s a lot of books that are just not like that. And I think, and great, I have this book, it’s on my bookshelf over there. I’ve had it for years. I paid the author. I’ve paid the publisher. It’s my copy. But I’m not going to go pick it up because that’s the old way. Not that I don’t read books. I still read, okay. I still read.
But you can read good. I can be good, you know.
I was trying to build a little center for people who can read good. But I’m not there yet. I’m not as good as you like me yet. School for ants.
That’s right. That’s a good clip. That’s a good clip.
So can you do books with this Paperless world?
Yeah. So you can. You can because you can have multi-page documents. So you could. I mean, you would have just basically had this whatever 300 page document. Right. And then you could feed that to the LLM, which has vision.
And yeah, it should be able to parse it all out. Like, Paperless by itself should do pretty good on books as long as you get a good scan on it. But I’d still feed it to the LLM anyways for it to use its vision because it’s going to be, you’re going to go from whatever 80% accuracy, probably a lot lower to like 90 high 90s accuracy.
OCR in general, you don’t realize how bad it is until like you actually try to scan something in the real world. And you’re like, oh, yeah, this used to be amazing, but it’s not amazing anymore because we have vision-based LLMs that are amazing. Yeah. So yeah, you absolutely could. So I’m thinking, okay, you would scan it. You would get it in Paperless. It would put it in PDF form. And that’s the thing that’s cool about Paperless too. It tries to get everything in a PDF form.
So you would basically get it in a PDF. I assume the reader that you’re using uses PDFs too or maybe uses EPUB or whatever the weird extension is that I can’t think of. But I think if you got in a PDF, that would probably be good enough, I think.
Yeah. Yeah. My preference is to Markdown. I want to get things to Markdown.
Yeah. Yeah.
And I got some solutions I’m working on around transcription and just really pure good stuff, which is the same. And that’s my next goal is to be able to transcribe with really good accuracy because there’s a lot of jargon out there and whatnot. And I’m close. I’m like 98%. I was just 99% there. So I got some curiosities in that front.
It’s so funny. You mentioned that too because I’ve been going down this rabbit hole on document scanning. This is kind of how it goes when I start researching the video or something that I’m doing. Like this is like the third rabbit hole within this whole Paperless thing.
You know, I learned that there are solutions and people probably know this. I don’t because this isn’t the world that I live in. But for document scanning, there are solutions out there that prepare your documents for AI. And so it will take say, a document and identify and break it up into its parts so that you can feed it to like what you’re trying to do, a system for RAG. So it will understand a title, a footer, and all of these pieces of the document and not just the text itself.
So there’s two solutions out there. One’s called Docling, which is from IBM and it’s open source. And it takes any document you want, whether it be MP3, PDF, Excel, and will break that up into its parts and then feed it to your LLM so that you can do RAG against it. The other one is Paddle. And so Paddle is another one, PaddleOCR, that I don’t want to say does the same thing, because people who know this stuff are going to be like, it doesn’t do the same thing. But for me, from the outside looking in, it’s a solution trying to solve the same problem, where it’s trying to not only get the data out of the document, but lots of metadata about it too.
So those are two solutions that might help you. And I say that because you’re saying you want everything in Markdown; that’s going to help you big time, because if you scan a document that has a table and you do OCR against it, the text you get out isn’t a table, right? And so same with even an LLM. So what you want to do is use Docling or Paddle to do the transformation, or the recognition, of the individual parts. So if you took a picture of a table, workbook table, Excel table, then your output could still be a table, but in Markdown.
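To make that last point concrete: once a parser like Docling or PaddleOCR has recognized a table’s cells, emitting Markdown is the easy, mechanical step — the recognition itself is the hard part those libraries handle. A stdlib-only sketch of just that final step, with made-up invoice cells as the input:

```python
def cells_to_markdown(header: list[str], rows: list[list[str]]) -> str:
    """Render already-recognized table cells as a Markdown table."""
    lines = [
        "| " + " | ".join(header) + " |",
        "| " + " | ".join("---" for _ in header) + " |",  # separator row
    ]
    lines += ["| " + " | ".join(row) + " |" for row in rows]
    return "\n".join(lines)

# Cells as a document parser might hand them back from a scanned invoice:
table = cells_to_markdown(["Item", "Qty"], [["Widget", "3"], ["Gadget", "7"]])
```

The payoff is that the structure survives: a downstream LLM or RAG index sees an actual table, not a soup of cell text.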
And so this is like the next, I don’t know, I feel like this is like on the frontier of document scanning. And anyone who’s doing document scanning in the industry, they’re probably like, this has been around for four years. This is coming from like a web developer who does infrastructure at home. And so this stuff is new to me.
So those are two things I would look into. And I get to mention them in my next video because they’re pretty cool. But I think that these two solutions, Paperless AI and Paperless GPT, are trying to solve that thing. And the funny thing is Paperless GPT can hook into Docling to do that for you too. So it’s getting wild, man.
I was just thinking about architecture there. So if I, maybe you’ll go here with me, Tim, I’ve been thinking about ETL pipelines. I feel like the world is an API. The world is a CLI. And the world is an ETL. And that means extract, transform, load. And I feel like that’s exactly what you’re doing there.
So if I were building that pipeline, and I were, you know, using Paperless and I was behind the scenes in your little nerd research lab there, whatever, I would want to keep the original images. And the reason I would want to keep the original images, I would want to extract whatever the purest original copy of it would be, which would be an image, right? Let’s not take the transformed version of it. Let’s extract literally what we get from it. The raw data. So let’s have a raw layer.
That’s, if you went with the, I think it’s the medallion process, I believe it was called, you got bronze, you got silver, you got gold. And so the bronze layer would be this original raw layer. And so that would be simply images of every page you have. It could be the simple image of your serial placard, or it could be all the pages of the book. Store those in the raw copy as an image. Boom, you got that. That’s your bronze layer.
Then the T comes into play. The transform comes into play. You say, okay, let’s now take all those images. And this is great as technology or models change or vision models get better is you can go back to that original raw source. It’s almost like how they do mastering for films. They go back to an original film that was shot on film, remastered for 4K, but they’re going back to those original slides. And so that’s kind of the same process. I would want to get original image though.
Oh, I agree. I agree. I like it. This is similar to like meta fields, you know, in developers and APIs when you scrape stuff. It’s like, let’s pull out the stuff that we can use and put it in our API. But oh, by the way, we’re going to have this meta field that has everything we found to begin with. Just in case we need to come back and process it a little bit later, a little bit better.
Yeah. You leave it on the table if you don’t do that. You put it on the floor. You’re not capturing it. In that process of getting to the pristine, you tend to throw away what was not really that good to you. In the ETL world, you want to keep that original raw source, insofar as it does hold value.
But if you ever need to go back to that original raw, as your technology changes in the transform layer, well, then you’ve got lots of things you could do that go back and get more accuracy. Now, if your score is 50 out of 50, you’re 100 out of 100 on your quality score and your raw is not really in the way anymore, we’ll throw it in the reference pile. But I want the images. I want the images so that when it comes down to that table, I can actually have the LLM examine the image of the table and then the Markdown we get from it, and be like, that’s good. Let’s go.
Yeah. I like it. Man, ETL is taking on such a different, I guess, a different perspective. Last time I was talking about ETL was SQL, trying to pull data out of one database and put it in another. In the perspective of this, it really makes a lot of sense because LLMs in general are best effort, best guess every time. That best effort, best guess is going to be different every time and it could be better in the future or it could be worse. Who knows? But saving the source image, yeah, that’s awesome. It sounds like a fantastic way to treat analyzing images like this.
The pipeline is medallion, like I said: bronze is the base layer, which is your raw. Your silver is maybe an augmented version of that that’s been cleaned up a little bit.
Then you finally transform it in the final layer in your gold layer, which could be your production database, for example, or your production layer. You’ve got that first layer raw, that middle layer where you’re evaluating things, maybe you’re doing some joins, maybe you’ve got multiple databases, and the final transform is in the gold layer, where you’re taking maybe two or three different databases or two different data sources and you’re merging them in production.
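A minimal sketch of that bronze/silver/gold flow, with plain dicts standing in for object storage and databases, and a stand-in `ocr` callable where the vision model would go — none of these names come from a real library, it’s just the shape of the pipeline:

```python
def bronze(pages: list[bytes]) -> list[dict]:
    """Raw layer: keep the original scans untouched, one record per page."""
    return [{"page": i, "image": img} for i, img in enumerate(pages)]

def silver(bronze_rows: list[dict], ocr) -> list[dict]:
    """Cleaned layer: attach extracted text, but keep the raw image in the
    record so a better model can re-process it later."""
    return [{**row, "text": ocr(row["image"])} for row in bronze_rows]

def gold(silver_rows: list[dict]) -> str:
    """Serving layer: merge per-page text into the final document."""
    return "\n\n".join(row["text"] for row in silver_rows)

# Usage with a fake OCR function standing in for a vision LLM:
doc = gold(silver(bronze([b"page-1", b"page-2"]), lambda img: img.decode()))
```

The key design point from the conversation is in `silver`: the raw image rides along with the extracted text, so when a better vision model ships, you re-run only the transform, never the scan.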
That’s cool. Yeah. Yeah. That’s cool stuff.
Let’s go back to Chris from Crosstalk, and unleashing, let’s just say Claude. Claude Opus 4.5 on your UDM Pro or whatever, if you’re Tim, Techno Tim, maybe you’ve got the latest greatest, I don’t. Tim, I’m roughing it over here. I got to buy my own things. I know you probably buy your own things, but you get gifted a lot of stuff. I don’t mean negatively.
No, I mean, you get the fun stuff. I’m envious.
So here I have my UDM Pro that’s not even the special edition. It’s just the one that’s not special, and so that’s what I’m using.
Yeah. Yeah. I wouldn’t worry about it. At the end of the day, their software continues to evolve, and so you’re getting the latest and greatest everything, even though your hardware might not be up to snuff. That’s the cool thing about Ubiquiti in general.
It just does like light dances and stuff. You got all your RGB stuff, bro. I mean, when I put on my beats and I take my dance break, I want it to dance and do a light show for me. I just don’t have that capability like you do.
Oh, yeah. No, no. Or you could play Snake. Do you see that video? Someone playing Snake on there?
Yeah. It’s pretty wild on their display.
That’s right.
So this world, something that happened recently with me, one of our neighbor friends came by, one of my son’s friends came by, and he brought his Switch. His Switch 2, as a matter of fact, after Christmas. It was one of his presents. And like anybody who’s inviting somebody with a device into their home, what do you think I said? I said, you’ve got to be on my guest network. Well, for some reason, they just couldn’t get on. Like, the authentication happened to the Wi-Fi network, all things checked out as good, couldn’t get DNS.
And so I thought it was my newly homegrown Rust project, which is called DNS Hole. So I rebuilt Pi-hole in Rust. If you didn’t know this, it’s not available yet. dnshole.dev in the future very soon. I’m waiting for one or two more things to happen before I can do that. But right now, even as we speak, my DNS is being resolved by my own DNS server that has fully replaced what Pi-hole is. I think you’ll love it when I can release it. As a matter of fact, I’ll share it with you soon if we can.
All that to say is that he couldn’t get on via DNS. And I’m like, gosh, DNS Hole. Maybe you messed up here. It was not DNS Hole. Okay. It was, DNS Hole was perfect. You know what it was?
What’s that?
It was VLANs, man. It was my VLAN rules.
Okay.
So of course, I popped out Claude. And I’m like, what’s going on here? Because I couldn’t figure it out on my own. And I’m like, gosh, why didn’t you just pull out Claude and let it just log into your Ubiquiti and just check out some things. And so through investigation, it turns out I had some jacked up VLAN rules.
And while I was in there, it was like, you’ve done this all wrong. You know, it’s like, you’ve got old rules. You’ve got these rules that conflict. You’ve got this one rule that does nothing. You’ve got this whole set of rules that doesn’t make any sense. Can I fix a few, please? Sure, Claude, please, please help me out.
Five, ten minutes later, beautiful VLANs once again. All the routes are great. He’s on the internet. They’re playing and having fun. They’re playing Mario Kart and life is good. So I mean, that’s the world we live in, Tim. I can’t even get my VLANs right, but Claude can.
That is awesome. So I haven’t used Claude in that way. Like, to be honest, I haven’t used Claude all that much. I use Copilot with models, you name the models. But no, that’s interesting. So did it do it through the CLI?
Okay. Yeah. Got you.
Well, it has my SSH key. So I’ll SSH into my UDM Pro.
Got you. So it’s you.
Yeah. Yeah. Yeah. So once it’s logged in, it can do… listen, okay, hold on to your seat. It reads the Mongo database directly. It updates the Mongo database directly.
I know, I know. It’s not cool. It’s production. Don’t read the database.
Yeah. But I’ve done it before and I was confident. And then it will trigger whatever it does to let the UI catch up, essentially, like the cache layer that’s in the UDM Pro, whatever. Just because you change the database, it doesn’t mean that the reads come back quickly. You got to sort of re-cache the cache kind of stuff.
And oh, yeah, man, it’s so cool.
It will log in to the UDM Pro via SSH as if it’s you. Or, in the case of Chris, which I’m sure he did or was thinking about doing, you can use the API, the UI API, or you can just log right into it and SSH around, cd through directories like you’re on a system, like you’re a sysadmin, no different.
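In practice that might look something like the sketch below. It assumes the well-known local MongoDB that the UniFi controller runs on the UDM (port 27117, database `ace`); the hostname, collection query, and dry-run wrapper are illustrative, and the client binary may be `mongo` or `mongosh` depending on firmware.

```shell
# Illustrative sketch: inspecting UniFi's config database over SSH, the
# way an agent holding your SSH key could. DRY_RUN=1 only prints the
# commands, since there is no real gateway to talk to here.
DRY_RUN=1
run() {
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "would run: $*"
  else
    "$@"
  fi
}

# Read-only peek at firewall rules on the gateway (hostname assumed).
run ssh root@udm-pro.local \
  "mongo --quiet --port 27117 ace --eval 'db.firewallrule.find().pretty()'"
```

Writes would go the same way, followed by whatever restart or cache refresh the controller needs before the UI reflects the change, as described above.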
[30:03] I think that’s such a wild world. I think that’s what’s making homelab more special for me. Proxmox has gotten a little more fun. Tim, I’m about to show you some things. Okay. I have a CLI built called PXM, which stands for Proxmox. And in a one-liner, Tim, I can have a brand new Ubuntu machine running. I can specify the IP address. I can specify the CPU, the RAM, and the disk. It already has my SSH key. And literally in less than 10 seconds, it’s reporting the IP address back to me via the CLI.
Yeah, yeah.
And one one-liner later, for that same machine, I can do PXM info and then whatever the VM ID is. So PXM info 104, for example, and it reports back to me: SSH user is ubuntu at whatever IP address, all that good stuff, whatever the details of that machine are. And moments later, my agents can be building on brand new infrastructure. And that is awesome.
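For flavor, a pxm-style wrapper could boil down to a couple of Proxmox `qm` calls under the hood. Everything here is a hypothetical sketch: the VM ID, name, addresses, and key path are made up, the cloud-image disk import a real setup needs is omitted, and DRY_RUN=1 prints commands instead of running them (since `qm` only exists on a Proxmox host).

```shell
# Hypothetical sketch of a "one-liner VM create" on top of Proxmox's qm CLI.
DRY_RUN=1
run() {
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "would run: $*"
  else
    "$@"
  fi
}

VMID=104
# Create the VM with cloud-init networking and an authorized SSH key
# (all values illustrative), then start it.
run qm create "$VMID" --name ubuntu-dev --memory 4096 --cores 2 \
  --net0 virtio,bridge=vmbr0 \
  --ipconfig0 ip=192.168.1.104/24,gw=192.168.1.1 \
  --sshkeys /root/.ssh/id_ed25519.pub
run qm start "$VMID"
```

The `pxm info` half would similarly be a thin wrapper that asks Proxmox for the VM’s config and prints the SSH details back as text or JSON for an agent to consume.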
Yeah, no, it is awesome. This is exactly what I’m talking about. This is exactly what I’m talking about. It’s like AI and agents in general are just letting people get these ideas out of their head and tinker way more and go way deeper than they used to before.
And this reminds me of when I went from IT to a software developer, I went from using other people’s tools to using my own tools. And for me, that was like a light bulb. I was like, I don’t need the UI anymore. Give me an API or even a CLI. And I can figure it out. And for me, that was just like a light bulb went off. And it was just like this moment where I was like, I felt like so much freedom, you know, to be able to build software that I wanted.
And so now, it’s just so awesome to see other people being able to do that, you know, take that step from using other people’s stuff to building their own stuff. And so now I feel like— like you, you know, I mean, would you have ever written that Proxmox thing five years ago? Maybe. But it would have taken a long time.
It would take you way too long. I wouldn’t have the time. It’s just too daunting of a task to do.
Exactly. Because it’s really time. It’s not necessarily ability.
And I suppose it’s probably both time and ability. But yeah, I would have just never tackled it, because it would have been too hard of a mountain to climb, really. Because even with the augmented AI tools, it was still hard. It didn’t get easier; it got easier to move faster and to get past the hurdles. But gosh, I had to solve so many problems, figure out so many ways to deal with things like: how do you store the image on Proxmox? That’s kind of obvious to most people, but getting through the whole lifecycle isn’t.
And that was one of the first things I built with Claude, so I’ve learned a ton since then, and I want to rebuild it. Because now I know what the tool should do. Before, I wasn’t really trying to make a product; I was just trying to explore, really. And now I know exactly what I want it to do, and what I don’t need it to do, so I just wouldn’t waste my time on that part of it.
Because, I guess, I didn’t want to have to log into my Proxmox machine every single time and navigate the web UI and click all the things. And it’s not that it’s a bad UI; it’s just that that’s not the way. I wanted to be able to do a CLI version of it. I wanted to get JSON back and feed it to my agent. And now that’s all possible, really.
So have you played with or heard of this latest thing, which is called Ralph Wiggum?
Ralph Wiggum? No. That name sounds familiar, though.
From the Simpsons.
Okay.
Yeah. Gosh, why is it called Ralph Wiggum? I forget. I think it’s because he just keeps trying despite setbacks. That’s how I’d phrase it: Ralph Wiggum is keeping the loop going despite setbacks.
And so, you know, I don’t have it here. I was going to try to figure out who actually created it; I can’t remember the name, but it was somebody who discovered this loop, essentially. So you essentially keep feeding back the loop of input and output that you would normally do in your typical Claude scenario, which is you entering the prompt, it doing some work in between, and it returning some sort of response back to you.
Well, they have found a way to create this Ralph Wiggum loop, so that you can essentially define a pretty clear instruction set. You might call it a spec, but they actually just call it prompt.md. And this prompt.md, which you feed into Ralph, can drive a loop. It could be a small loop, like: build this one part of the feature end to end, and you just go until it’s done.
Well, the reason I’m telling you this is because now that I know what it could do and what it should do, and if hardware were more available, I’d be more inclined to do this: I would build a test-subject hardware machine running Proxmox. And then I would just set this thing loose on that machine, now that I have a pretty clear vision. Just have it build this Proxmox redo, potentially, because I’m just trying to get the value out of it, not so much the pristine code. Sometimes that’s the valuable part too, and you enjoy the process, but just as an exercise, because I want to automate Proxmox.
Why not do it via this Ralph Wiggum loop, which is just: do it until it’s done. And you can give it repetitions. You can say, okay, do one iteration, two iterations, and so on; I forget the exact terminology for it. And you can specify that because you have only so many dollars to spend. I don’t want to spend more than 20 bucks on this feature, 10 bucks on that feature. So either spend 20 bucks on the feature, or do 10 or 15 iterations until you get to some result, and then I’ll come back and examine it. And to run it again, all you do is just run it again, and it’ll go back and do it again. That’s such a cool world to be in, man.
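The loop itself can be sketched without assuming any particular agent CLI. `agent_cmd` below is a stub standing in for whatever you’d actually call (for example, piping prompt.md into your agent), and it’s wired to “finish” on the third try so the sketch is runnable as-is.

```shell
# Ralph-Wiggum-style loop: re-run the same prompt until a done marker
# appears or the iteration budget runs out.
MAX_ITERS=15

agent_cmd() {
  # Stub agent: pretend the task completes on the 3rd attempt.
  n=$(( $(cat .tries 2>/dev/null || echo 0) + 1 ))
  echo "$n" > .tries
  if [ "$n" -ge 3 ]; then touch DONE; fi
}

i=0
while [ "$i" -lt "$MAX_ITERS" ] && [ ! -f DONE ]; do
  i=$((i + 1))
  echo "iteration $i"
  agent_cmd
done
echo "stopped after $i iterations"
rm -f .tries DONE   # clean up the stub's state files
```

The dollar caps and iteration counts live in whatever harness wraps this; the `while` condition, keep going until done or out of budget, is the whole idea.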
Yeah. No, you could grow a little homelab garden with that kind of loop. So cool.
Yeah. Yeah.
No, that is for sure. Because yeah, a lot of times agents will stop, and I know exactly what you mean now, because you can tell one to do something to completion, but it’s either going to stop or check in or do something. And I know the prompts that I give. I have a lot of my prompts saved, because they’re annoying to keep explaining to AI, you know: fix all unit tests, run all linting, do these things, don’t do these things, go.
And to be able to not be a human in the loop anymore until the very end is pretty cool to think about. It’s like, no, you loop, you figure it out. You do so many iterations of this piece of software, or you do one really good iteration, and let me see the final result. At that point, you’re just like a director, you know what I mean?
You see, that’s right.
Yeah. Give me, give me something.
That’s right. A parent. Go, go clean your room. Don’t come back until it’s all the way clean. And you know what? I’m going to check under the bed. I’m going to check in the closet. So, you know, make sure you don’t put stuff under the bed and in the closet. And when I come back in, you know, an hour, it better be done.
Right. I’m not a parent, but, you know, I remember those days, thinking I could outsmart my parents by hiding everything.
Yeah. Yeah. Yeah. Yeah.
They’re pretty clean though, you know, but they don’t clean up after themselves. I mean, not yet. Well, generally, unless, you know, something— yeah, I won’t go there, but yeah.
[38:17] What is the centerpiece of your homelab right now? Like, what are the centerpieces? I imagine Proxmox and TrueNAS is still there in the center. Unifi hardware is obviously probably part of the center. What’s around that center? What are you building on?
It is. So this kind of goes into another one of my predictions for this year too: one big box. I think people are going to return to this one-big-box idea, only because things are hard to get a hold of. And while you might be able to get a hold of lots of little older machines, you know, the one-liter machines, to do clustering, I feel like now that things are so scarce, people might be going back to one big box that’s your storage, that’s your AI, that’s your compute, that’s your virtualization, that’s your NAS, that’s your everything.
And it’s kind of the way I’ve been going too. So, I do have my TrueNAS box. It’s one big box, you know, has a video card, has RAM, has 10 hard drives, you know, GPU, all that stuff. And so, that’s, you know, not only my NAS running ZFS, but it’s also where I’m running my applications now too. So, I’ve moved a lot of my applications on my NAS.
Yeah. So, I’ve been doing this— you know, where you were doing stuff with containers, you had to do that sort of sidecar load, which I did follow, but I didn’t get the same results you did. Is that the way you’re doing it, where you have to create the YAML file so it knows about it, in the app container world?
Yep. Well, if you’re talking about TrueNAS, yeah. So you don’t have to use the YAML and do it that way. I do, because I want YAML, because I’m a developer, but also I’d much rather edit YAML than fill out a form any day. And because then you get CLIs, you get help from AI, you get all the stuff you get with YAML. And I can do it in VS Code. So, there’s that too.
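If I’ve got the feature right, on recent TrueNAS SCALE releases the “Install via YAML” custom-app path takes Docker Compose-style YAML, so an app definition ends up looking roughly like this (the image tag and dataset paths are purely illustrative):

```yaml
services:
  plex:
    image: plexinc/pms-docker:latest   # illustrative image
    network_mode: host
    restart: unless-stopped
    volumes:
      - /mnt/tank/apps/plex:/config    # illustrative pool/dataset paths
      - /mnt/tank/media:/data
```

Which is exactly why editing it in VS Code, or handing it to an agent, beats clicking through a form: it’s just text.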
So, yes. So, I’m now running my applications on top of my NAS. So, I’ve always gone back and forth. Like, you know, do I want my NAS to just be a NAS and just be storage or do I want my NAS to be an application server too, and then run those applications on top of my NAS. And so, I’ve done both. And I’m still kind of doing both, but for the most part, what I’m calling now my home production is on my NAS.
And so, my home production, you know, I’ve gotten a little bit wiser over the years and a little bit crazier, but, you know, I have a home production now. And my home production is where the services are that need to be up. They need to work, or I’m going to hear about it, you know. That’s Plex, that’s my NAS, that’s, you know, whatever else I have running, which is a lot of stuff.
And when I say, I’ll hear about it, I don’t just mean my wife, because she will say something if Plex is down, because we record a lot of stuff. And if it doesn’t record Survivor on whatever night— I hear about it. So, that needs to be up. But also, you know, alerts and stuff I have set up and running too. So, you know, my h