Homelab Chronicles 13: A Year in Review

As I said in my last post, I recently moved, which means that all the work I did in my last apartment is gone. That was mostly physical infrastructure work, particularly the cabling. So now I get to do it all again; joy!

But before I get into the new place, I should revisit some topics from the past. An update of sorts. Just because I haven’t posted in a year doesn’t mean the homelab has sat untouched for a year.

UPS Delivered!

Back in April, I finally bit the bullet. I bought a “CyberPower PFC Sinewave Series CP1000PFCLCD – UPS – 600-watt – 1000 VA.” I wanted something that would communicate with ESXi to initiate a “semi-graceful” auto-shutdown if wall power was lost.

PowerPanel dashboard, showing the UPS status as normal and fully charged.

I say “semi-graceful” because, with my setup, I only have ~17 minutes of battery life. That covers three devices: my server, my Unifi Security Gateway (USG), and a 5-port Unifi Flex Mini switch.

Via the accompanying PowerPanel software, I can monitor battery status while also configuring the shutdown behavior for ESXi. I actually have PowerPanel installed in a separate VM on ESXi. Probably not the best idea, but it works.

Back to the semi-graceful shutdown: VMs sometimes take time to shut down properly, especially Windows Server. So between ESXi and PowerPanel, I give the VMs some time to shut down gracefully. If they don’t finish in time, they get powered off ungracefully before ESXi itself gracefully shuts down.
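
PowerPanel and ESXi coordinate all of this through their own integration, so the sketch below is just the same sequencing spelled out in Python with the pyVmomi library, in case you’re curious what it amounts to. The hostname, credentials, and grace period are placeholders, and VMware Tools is assumed in each guest; this is not what actually runs on my box.

```python
# Sketch: graceful-then-forced VM shutdown, followed by host shutdown.
# Hostname, credentials, and GRACE_PERIOD are assumptions for illustration.
import ssl
import time

from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

GRACE_PERIOD = 300  # seconds to let guests shut down cleanly

ctx = ssl._create_unverified_context()  # homelab self-signed cert
si = SmartConnect(host="esxi.local", user="root", pwd="...", sslContext=ctx)
content = si.RetrieveContent()

vm_view = content.viewManager.CreateContainerView(
    content.rootFolder, [vim.VirtualMachine], True)
vms = [vm for vm in vm_view.view
       if vm.runtime.powerState == vim.VirtualMachinePowerState.poweredOn]

# Ask every guest OS to shut down cleanly (requires VMware Tools).
for vm in vms:
    vm.ShutdownGuest()

# Wait out the grace period, then hard power-off any stragglers.
deadline = time.time() + GRACE_PERIOD
while time.time() < deadline and any(
        vm.runtime.powerState == vim.VirtualMachinePowerState.poweredOn
        for vm in vms):
    time.sleep(10)
for vm in vms:
    if vm.runtime.powerState == vim.VirtualMachinePowerState.poweredOn:
        vm.PowerOffVM_Task()  # ungraceful, but the battery is draining

# Finally, shut down the host itself.
host_view = content.viewManager.CreateContainerView(
    content.rootFolder, [vim.HostSystem], True)
host_view.view[0].ShutdownHost_Task(force=True)
Disconnect(si)
```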

It took a bit of testing to get it all figured out, but it does work. At my last apartment, there were a couple of brief blackouts and brownouts, and the UPS did exactly what it needed to do.

My (Home) Assistant Quit on Me

The very last post of 2023 was me futzing around with Home Assistant. I had a “concept of a plan” to automate some of my smart home devices.

After getting it installed, though, I didn’t really do much more with it. Stupid, I know.

Unfortunately, at some point towards the end of 2023 or beginning of 2024, the HAOS VM bit the dust.

“Failed to power on virtual machine…File system specific implementation of LookupAndOpen[file] failed.”

I don’t exactly know what happened, or when it happened. But I know there was a power outage one night. A storm, I think. Before I had my UPS…

I can’t say for certain the power outage was the reason. After all, it was at least a couple weeks later that I realized HAOS wasn’t turned on. When I tried turning it back on, I got that message, and have ever since.

I did look into the error message a little bit, but I think reinstalling HAOS is the better choice, especially since I hadn’t done any further setup anyway.

Glad I have the UPS now 🙂

Playing with Proxmox

Over the last year or two, the big news in the virtualization space has been VMware selling out to Broadcom, and Broadcom trying to squeeze every last penny out of licensing. Recently, AT&T disclosed that Broadcom was seeking a 1,050% price increase on licensing.

I’m still using ESXi/vSphere 6.5 U3 on a server that’s more than 10 years old. Which is fine, but at some point I’ll need to replace the hardware and software. Unfortunately, there are no more free, perpetual licenses for non-commercial use. I never even got a 7.0 license.

With that in mind, I thought it’d be interesting to play with Proxmox, a FOSS virtualization platform, so I installed it on another server I had lying around. Proxmox is nowhere near as user-friendly as ESXi, and the documentation that’s available is pretty poor, in my opinion.

That’s one of the reasons FOSS sometimes annoys me: it’s often not very accessible to anyone who’s not already an expert. But that’s a topic for another post.

The first thing I wanted to do was connect my existing NFS share of OS ISOs to Proxmox. That way, I wouldn’t have to burn extra drive space on the new Proxmox server by copying ISOs over. If the data exists on the network, use it! This NFS share is hosted on my primary Windows Server VM.

I was able to point Proxmox at the NFS share. However, Proxmox wanted to impose its own directory structure within it: ISO images have to live under a template/iso/ subdirectory of the storage. I found that rather annoying. This wouldn’t be where the Proxmox VMs live, after all; it’s simply where the ISOs are. Why should I have to rearrange the directory structure and files just for Proxmox?
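
If you’d rather script it than click through the web UI, attaching a share like this can also be done through the Proxmox API. Here’s a minimal sketch using the proxmoxer Python library; the hostname, credentials, storage ID, server address, and export path are all placeholders, not my actual setup.

```python
# Sketch: attach an existing NFS export as ISO storage via the Proxmox API.
# All names and addresses below are assumptions for illustration.
from proxmoxer import ProxmoxAPI

proxmox = ProxmoxAPI("proxmox.local", user="root@pam",
                     password="...", verify_ssl=False)

proxmox.storage.post(
    storage="iso-share",      # storage ID as it will appear in Proxmox
    type="nfs",
    server="192.168.1.20",    # the server hosting the NFS share
    export="/isos",           # the NFS export path
    content="iso",            # ISO images only; no VM disks on this share
)
```

Datacenter → Storage → Add → NFS in the web UI does the same thing.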

I honestly don’t remember if I created a VM after all that, or whether it worked. But given the situation with VMware, that won’t be the last time I play with Proxmox.

Let’s Get Physical, Physical (Again)

The last thing to report is minor, but worth mentioning. I ended up adding two more Ethernet runs, the important one going from one room to another, underneath the carpet and along the baseboards. Ah, the joys of apartment living.

Anyway, that’s not that big of a deal. I had already done it once, after all.

Rather, it’s the idea that led to it that’s worth telling. In my old apartment, the Google Fiber jack (ONT) was in the living room, and the guest room down the hall served as the “server closet.” The server, Unifi AP, main switch, UPS, and other devices were in the guest room. But the USG was in the living room, since that’s where the Internet entered the apartment. That seemed strange to me; I wanted all the main gear in the guest room.

It’s hard to explain without a map or diagram, so I’ll use some. This is the diagram of my original layout:

My original setup at my last apartment. Simple, if a bit overkill.

There were two ways to move the USG to the guest bedroom. One was adding a second run between the living room and the guest bedroom: one run would connect the fiber jack to the WAN side of the USG, and the other would connect the LAN side of the USG back to the switch in the living room.

But I wondered if it was possible to do this:

Note that the USG doesn’t have that many ports; I forgot to add a second switch in that room, but the idea still stands.

Essentially, could a WAN connection travel over a VLAN? Because if it could, I wouldn’t have to run another cable. I looked it up and even asked on Reddit. The answer was yes: this is entirely possible and not that unusual!
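
For illustration, the port layout I had in mind would look something like the sketch below, expressed as a Python dict for lack of a better notation. The VLAN ID and port numbers are made up; the idea is that the WAN VLAN is untagged on the two ports facing the fiber jack and the USG’s WAN interface, and tagged on the trunk between the rooms, so neither the ONT nor the USG ever sees a tag.

```python
# Hypothetical "WAN over a VLAN" port plan; VLAN ID and ports are made up.
WAN_VLAN = 99

port_plan = {
    # Living room: the port facing the fiber jack carries only the WAN
    # VLAN, untagged, so the ONT is unaware anything unusual is happening.
    "living-room switch, port 1 (to fiber jack)":
        {"untagged": WAN_VLAN, "tagged": []},
    # The single existing run between rooms trunks LAN plus the WAN VLAN.
    "living-room switch, port 8 (uplink to guest room)":
        {"untagged": 1, "tagged": [WAN_VLAN]},
    "guest-room switch, port 24 (uplink to living room)":
        {"untagged": 1, "tagged": [WAN_VLAN]},
    # Guest room: the tag is stripped again on the port facing the USG's
    # WAN interface, which sees the fiber jack as if directly attached.
    "guest-room switch, port 2 (to USG WAN)":
        {"untagged": WAN_VLAN, "tagged": []},
}
```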

Unfortunately, when I tried to do this, it didn’t work. It even caused some additional issues, with the Unifi Controller becoming inaccessible while the experiment was in place.

In the end, I just laid down a second Ethernet run. Maybe I gave up too quickly, but sometimes the easiest solution…is simply the easiest solution 🤷‍♂️


So that was last year in the life of the homelab. Not as much as I wanted to do, but it was at least something. And that’s the point, right? To at least play around with it and learn something.

Homelab Chronicles 07: ALERT! – Unauthorized Access

It’s been a while since I’ve done anything with my Homelab. I’ve been busy with work, travel, and lounging around. There’ve even been extended periods over the last 2-3 months where my server has been completely turned off so I can save some money on electricity during the hot, hot summer. Plus, when it’s 100°F (37.7°C) outside and my AC is trying to keep things at a “cool” 78°F (25.6°C), the last thing I need is a server putting out even more heat.

But I was forced to take a look at things the other night when my Ubiquiti USG was making strange sounds. Fearing that it was going out, I wanted to see how much a replacement would cost. I needed some information on my USG, so around midnight, before going to bed, I booted up the server (it hosts an Ubuntu VM that itself hosts the Unifi Controller) and signed in to the Unifi Controller via the web interface.

Almost immediately I was struck by how many clients were supposedly connected to the network: 34 devices.

Now, I’m a single guy with no kids, living in a 2-bedroom apartment. But I’m also an IT professional, a geek, and a gamer. I have several computers, cell phones, tablets, consoles, and such. I also have some smart home stuff like plugs, thermostat, cameras, etc. But the number of devices connected is pretty stable. Like 20-25.

So to see 34 clients was surprising.

I started with the list of wired connections: about 10 devices that I mostly recognized, even with just MAC addresses. Unifi has a neat feature where it’ll look up MAC addresses to find manufacturer information. Anyway, all good there. So I went to the list of wireless connections.

At the very top of the list, I saw 10 devices that I didn’t recognize. One had a hostname of “Emilys-iPad.” I’m not an Emily. I don’t know an Emily. And I certainly don’t have an iPad named Emily…’s-iPad.

List of Devices in Unifi Console
Who the hell is Emily and why does she have an iPad on my network?

My heart started racing and I got the jitters. Devices were on my WiFi network that were not mine. Devices that I didn’t authorize, by someone that I didn’t know. There were a couple of Amazon devices, an LG device, and other hostnames I didn’t recognize. But I don’t own any Amazon or LG devices.

How long have these been on my network? Whose are these? But more importantly, how did they get on the network?

I didn’t spend much time answering those questions, as the situation needed to be dealt with. Instead of going to bed, I took a screenshot of the device list with hostnames and MAC addresses, and then immediately got to work.

To start, I disconnected and blocked all of those devices from connecting to my WAP. I noticed that they were all connected to a secondary WLAN with a separate SSID; more on that in a second. I disabled and then deleted that WLAN. I then power-cycled the USG and the Unifi WAP to make sure those devices were off the WLAN and couldn’t connect again. When everything restarted, nothing was connected to that WLAN, and only my devices were connected to the “main” WLAN. The threats were removed.

OK, so now about this WLAN. Some months ago, I whipped out my old PlayStation Portable (PSP). I was feeling nostalgic and wanted to find some old games on the PlayStation Store, so I needed to connect my PSP to the Internet. I have a modern WiFi 6 (802.11ax) Unifi AP. Unfortunately, the PSP, being so old, can only connect to 802.11b or 802.11g networks. I can’t remember the decision-making process, but I eventually created a secondary WLAN specifically for b/g devices. And of course I password-protected it. However, since the PSP is so old, I used old-school WEP (Wired Equivalent Privacy) as the security protocol.

Devices were on my WiFi network that were not mine. Devices that I didn’t authorize, by someone that I didn’t know.

Anyway, after I was finished with my PSP, I didn’t take the network down. “Never know when I might want to use it again,” I thought. So I left it up. Nothing had connected to it since. I’ve signed in to the Unifi Controller a handful of times since then and never noticed anything other than my devices on my main WLAN. I honestly forgot I even had it up. Until this happened.

With the threats neutralized, I could finally start doing some investigating. And my first question was obviously how they got on the network.

I’m assuming I password-protected the WLAN. Because I’m not an idiot. Usually. But if it was only with WEP…well, there’s a reason we’ve moved on to WPA (WiFi Protected Access), WPA2, and WPA3.

According to Wikipedia, WEP was introduced with the original 802.11 standard back in 1997. A quarter-century ago. And major vulnerabilities were found in it quickly. Without getting into the nitty-gritty, it’s not hard to crack a WEP password. There are easy-to-find programs that will sniff packets, analyze the data, and eventually recover the password. Possibly in minutes.
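
To make the weakness concrete: WEP encrypts each packet with RC4, keyed on the shared key plus a 24-bit initialization vector (IV) that’s sent in the clear. With only about 16.7 million possible IVs, a repeat is expected after just a few thousand packets, and two packets encrypted under the same IV share a keystream. Here’s a toy Python sketch of that core problem, with random bytes standing in for the RC4 keystream, so it’s a conceptual illustration rather than actual WEP code:

```python
# Toy illustration of keystream reuse, the heart of WEP's weakness.
# os.urandom stands in for RC4(IV || key); this is not real WEP code.
import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

keystream = os.urandom(16)             # same IV + same key = same keystream
p1 = b"GET / HTTP/1.1\r\n"             # two different plaintext packets
p2 = b"user=admin&pw=x\r"
c1, c2 = xor(p1, keystream), xor(p2, keystream)

# An eavesdropper never sees the keystream, but when an IV repeats:
assert xor(c1, c2) == xor(p1, p2)      # ciphertexts XOR to plaintexts XOR
print("keystream cancelled; plaintext relationships leak")
```

Real-world attacks like FMS and PTW go further and recover the key itself from enough captured IVs; that’s what the point-and-click cracking tools automate.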

That said…it’s not exactly something I’d expect my average neighbor to be doing. I’ve known about cracking WiFi passwords and “wardriving” for a long time. But even I’ve never done it.

I got a little nervous thinking about that. What kind of adversary is one of my neighbors? Are they also an IT person? Maybe a security professional?

And if they were on my network, what else did they see or even touch? In retrospect it was dumb, but I hadn’t put that WLAN on a separate VLAN. I mean, why would I? I was supposed to be the only one connecting to it, with my one device. What that means is that anything connecting to that b/g network is on THE network. They can see my computers, my server, my consoles, my smart devices…everything.

Do I now have to wipe all my computers and VMs? I mean, some need it, but it’s still an undertaking to have to redo everything. It’d likely take a whole weekend and then some.

"Ain't Nobody Got Time For That"

That led me down another path, concerning my “main” WLAN. Did I use the same password for the b/g network, too? If so, they’d know the password to my main WLAN as well, which has a different, but similarly styled, SSID.

So I nuked my main WLAN and created an entirely new one with a new SSID and new complex password. I then had to reconnect my smart home devices.

At that point, it was already around 2:00am, and I had to go into the office in the morning. What started as me wanting to find some model information on my USG had turned into DEFCON 1 at home.

But with the unauthorized devices off the network, a new WLAN, and the important devices back online, I felt somewhat comfortable going to bed. The investigation would have to wait until I got home the next day.

—To be continued.

Homelab Chronicles 06 – “Hey Google…” “I’m Sorry, Something Went Wrong”

I woke up early today, on a Saturday, to my alarm clock(s) going off. I was planning to go to a St. Patrick’s Day Parade and post-parade party with a friend. After turning off my phone alarm(s), I told my Google Nest Mini to stop the alarm that was blaring.

Unfortunately, it informed me that something went wrong, though it did turn off. Usually when my Google Nest Mini has issues, it’s because the WiFi messed up. So I stumbled out of bed, still half-asleep, to the guest bedroom, where the network “rack” (a small metal bookshelf) and the Unifi AP live. My main 24-port switch had its lights blinking. I looked up at the AP high on the wall and saw the steady ring of blue light, indicating everything was working. OK, so not a WiFi problem, nor a network problem. Probably.

In the hallway, I stopped by my Ecobee thermostat to turn the heat up a little and noticed a “?” on the button for local weather. Ah, so I didn’t have Internet. Back in my room, I picked up my phone: 5G, instead of WiFi. On my computer, the Formula 1 livestream of the Bahrain track test, which I had fallen asleep to, had stopped, and reloading the page simply displayed a “No connection” error. I opened a command prompt and ran ipconfig /all and ping 8.8.8.8. The ping didn’t go anywhere, but I still had a proper internal IP in the subnet. Interesting. Guess the DHCP lease was still good.
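
That half-asleep triage boils down to a tiny script. Here’s a rough Python equivalent of what I was doing from the command prompt; the gateway address is an assumption, and the ping flags are the Windows ones:

```python
# Rough equivalent of my triage: if the gateway answers but the outside
# world doesn't, the LAN is fine and the problem is upstream.
# The gateway address is an assumption; swap "-n" for "-c" on Linux/macOS.
import subprocess

def ping(host: str) -> bool:
    return subprocess.run(["ping", "-n", "1", host],
                          capture_output=True).returncode == 0

lan_ok = ping("192.168.1.1")   # the USG's LAN address (assumed)
wan_ok = ping("8.8.8.8")       # anything reliable out on the Internet

if lan_ok and not wan_ok:
    print("LAN is up; the failure is at or beyond the router.")
elif not lan_ok:
    print("Can't even reach the gateway; the problem is local.")
else:
    print("Internet looks fine; blame something else.")
```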

Only one place left to check: the living room, where the Google Fiber jack and my Unifi Security Gateway (USG) router were. Maybe there was a Fiber outage. Or maybe my cat had accidentally knocked an AC adapter loose while messing around in places he shouldn’t. Sunlight was streaming in from the balcony sliding door, making it hard to see the LED on the jack. I covered it with my hands as best as I could: it was blue, which meant this wasn’t an outage. Uh oh. Only one other thing it could be.

Next to the Fiber jack, surrounding my TV, I have some shelving with knickknacks and little bits of artwork. Hidden behind one art piece are my USG and an 8-port switch. I removed the art to see the devices. The switch was blinking normally. But on the USG, the console light was blinking periodically, while the WAN and LAN lights were out. Oh no, please don’t tell me the “magic smoke” escaped from the USG.

On closer inspection, it looked like the USG was repeatedly trying to boot up. It was even making a weird sound, like a little yelp, in time with the console LED going on and off. So I traced the power cable to the power strip, unplugged it, waited 15 seconds, and plugged it in again. The same thing happened. I really didn’t want to have to buy a new USG; they’re not terribly expensive, but they’re not cheap, either.

I tried plugging it into a different outlet on the power strip, but it kept quickly boot-looping. I then brought it to a different room and plugged it into a wall outlet; no change. Great.

But then I noticed that there was a little green LED on the power brick. And it was flashing at the same frequency as the USG’s console light when plugged in. Hmm, maybe the power adapter went bad. I could deal with that, provided I had a spare lying around.

The Unifi power brick said “12V, 1 amp” for the output. So I started looking around. On my rack, I had an external HDD that was sitting cold. I looked at its AC adapter and saw “12V, 2 amps.” That was promising, but could I use a 2-amp power supply on a device that only wants 1 amp? I looked it up on my phone, and the Internet said yes: the voltage has to match, but the amp rating is just the most the adapter can deliver, and the device only draws what it needs. (The reverse, an undersized 1-amp adapter on a 2-amp device, would be the problem.) Perfect.

I swapped the AC adapter onto the USG. The little barrel connector that goes into the USG seemed to fit, if just a smidge loose. Then I plugged it back into the wall.

It turned on and stayed on! Ha!

I brought it back to the shelf and reconnected everything. It took about 5 minutes for it to fully boot up. Afterwards, I went back to my computer and waited for an Internet connection to come back, and it did.

All in all, it was a 15-20 minute troubleshooting adventure. Not what I would have preferred to do straight out of bed on a Saturday morning, but it got fixed. I’ve already ordered a new AC adapter from Amazon; it should arrive in a few days.

Afterwards, I got ready and went to the parade. A bit nippy at about 25°F (about -3°C), but at least it was bright and sunny with barely any wind. I went to the party and had a couple beers. It definitely made up for the morning IT sesh.