The Home Assistant VM will be roughly based on my previous Home Assistant VM, and the Raspberry Pi before that. There is a guide on how to deploy Home Assistant in a docker container. Docker is one of the technologies I have been using in my professional work and playing around with in my home labs. I would not say I am an expert at it, and I still wish the guide had a sample docker configuration file to base everything off of. I do have a previous working docker config I found somewhere online a few years ago, and I will likely start with that as the baseline. However, this particular deployment will be different from the previous ones: this time I want Home Assistant itself to create and manage the Z-Wave network.
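For anyone in the same boat wanting a baseline, here is a minimal sketch of what such a compose file might look like. The image tag, volume layout, and device path are assumptions, not the guide's official example; the serial device for a Z-Wave stick often appears as /dev/ttyACM0, but check yours before using it.

```shell
# Minimal sketch of a docker-compose.yml for Home Assistant with a
# Z-Wave USB stick passed through. Paths and tags are assumptions;
# check `ls /dev/serial/by-id/` to find your actual stick.
cat > docker-compose.yml <<'EOF'
version: "3"
services:
  homeassistant:
    image: ghcr.io/home-assistant/home-assistant:stable
    restart: unless-stopped
    network_mode: host            # simplifies discovery of devices on the LAN
    volumes:
      - ./config:/config          # persistent Home Assistant configuration
    devices:
      - /dev/ttyACM0:/dev/ttyACM0 # Z-Wave stick serial device (assumed path)
EOF
# Bring it up with: docker compose up -d
```

The key line for this post's purposes is the `devices:` mapping, which is what hands the USB serial device from the VM's kernel into the container.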
I led the engineering department at a company doing smart homes for apartments, so I was quite familiar with the options out there. There were four main protocols when we examined this: Z-Wave, ZigBee, Wi-Fi, and Bluetooth. Three main factors drove my decision about which protocol to focus on: reliability, power requirements, and number of devices.
For reliability, I focused on the frequency being used, as lower-frequency signals penetrate solid objects (walls) better. On this front, Z-Wave has an advantage: it uses the 908 MHz frequency, whereas Wi-Fi, ZigBee, and Bluetooth use 2.4 GHz. There has been a response from Wi-Fi and ZigBee on this issue: Wi-Fi introduced 802.11ah for the ~900 MHz bands, but it has little market share, and ZigBee technically can go down to the ~850 MHz range, but most ZigBee devices do not.
The reality is that Z-Wave is the only protocol that widely uses the 908 MHz band for the vast majority of its devices, and that is a distinct advantage for two reasons. One is a repeat of the above: 908 MHz penetrates walls better. The second is that far fewer devices use this frequency. 2.4 GHz is crowded: it has three of the four previously mentioned protocols, old cordless phones (not to mention certain chips on cell phones), microwaves, and more. (Note: I live in the US, where most cell phones are CDMA; I am aware that GSM uses 900 MHz, so the calculation on this point might work out differently elsewhere.) With so many devices using 2.4 GHz (think of all the Wi-Fi devices, from phones to refrigerators), that is a recipe for interference. I'd rather have the less-used frequencies.
A minor note for reliability: Z-Wave and ZigBee build mesh networks, which gives them an advantage over Wi-Fi (even with its MIMO antennas), since every device plugged into mains power can also act as a repeater. Bluetooth technically has mesh networking, but I have not seen it advertised on home automation devices.
Power requirements is shorthand for asking: how long until I need to replace the batteries? Wi-Fi (the non-802.11ah kind) is a high-power protocol; phones need to recharge daily. Even a device that transmits sparingly still has to deal with all of the overhead of Wi-Fi connectivity (keep-alive requests, pings, etc.). My experience was about three months between battery replacements on the August Wi-Fi lock I previously had. My real-world experience with Z-Wave has been about 12 months for my Schlage locks. I also had a distinct advantage here, since this is something my company tracked. All I can say is that the manufacturer matters for time between replacements; there are quite a few ways to have a power issue in all of these protocols, and one tends to get what one pays for.
Lastly is the number of supported devices. ZigBee, at least in its early generations, only supported 64 devices. I think it is up to 256 now (240 with the reserved space). That is the theoretical maximum, though. Philips Hue implemented their own custom version of ZigBee, which, in addition to security vulnerabilities, limited me to 50 bulbs per hub. There is a reason for that: beyond that point reliability gets spotty, and I experienced this directly. Sometimes when turning a room on or off, three of the four lights would respond and one didn't. I personally think ZigBee is limited to around 50 devices per hub in practice. This isn't to say they aren't working to make it better, I am sure they are, but practically, there is a reason I use four Philips Hue hubs to control my lights.
Bluetooth is limited to 7 devices. I know they are working on this, and it wouldn't surprise me if, by the time someone reads this, that is no longer true. But the fact that this is where they are starting from is concerning. Lastly is Wi-Fi, which has the entire IP range available to it, provided one is a network engineer willing to mess with as many /24 networks as it takes to make it work. I don't like managing DMZs or connecting larger networks; it gets annoying. Z-Wave supports 232 devices per network, and I have experienced Z-Wave networks managing several hundred devices well.
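To put the Wi-Fi address-space point in rough numbers: even a single /24 subnet already covers more nodes than any of the other protocols' per-network limits, which is why the constraint for Wi-Fi is the network engineering rather than the protocol.

```shell
# Usable host addresses in one /24 subnet: 256 addresses minus the
# network address and the broadcast address.
echo $(( 256 - 2 ))   # 254
```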
I think we can all see where these are converging. We are likely to end up with something around the 900 MHz band, with a mesh network, low-power radios, custom protocols, and an ever-growing address space for devices. The closest approximation of that when I decided three years ago (for my home) was Z-Wave. I still think it is the farthest along, having made the best predictions at the time of where the market is going, though ZigBee is not a bad bet either, off only on frequency. My supposition is that the frequency interference and poor wall penetration of 2.4 GHz are the primary causes of the unreliability beyond 50 devices.
With Z-Wave as my home automation protocol of choice, which hub was I going to use? I really only considered Samsung SmartThings, Home Assistant, and Wink. I am not a fan of SmartThings. There have been outages, lots of them, and when SmartThings goes down, all access is down. That was not ideal. Home Assistant worked, but it did not have a phone app at the time (I am not sure whether it does now). I did test it by adding a sensor, and it worked fine, but I wasn't sure I wanted to jump into managing it all myself yet.
I prefer having paid people actually work on the software. Wink had a robust developer program, and Home Assistant suggested the Wink hub had a local API. This meant that Home Assistant could potentially control the Wink devices even in the event of a Wink outage. That seemed reasonable, so I went with them and bought a Wink Hub 2.
They have since locked all API access. This, to me, breaks the one reason I felt comfortable with them over just doing it myself. I don't want access to my home devices to be dependent upon Wink's cloud infrastructure working. I really don't want to be locked out of my house because of something like that. So, now, I am doing it myself.
To do that, I still want to use Home Assistant. I am only moving my Z-Wave network, not all of my smart devices. Despite being ZigBee-based, Philips Hue bulbs are still the best smart bulbs, even with the extra expense. When I installed them three years ago, nobody was close to them. Today, some are coming close, but Hue just works, which is a testament to Philips building the best product. I still need Home Assistant here too, as most other control software can only sync to one Philips Hue hub, and I use four to control my 149 lights.
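Multi-hub support is one of the things Home Assistant handles natively. Older releases accepted multiple Hue bridges directly in configuration.yaml (newer versions set bridges up through the UI integration flow instead); a sketch of that older-style fragment, with placeholder IP addresses, looks like this:

```shell
# Sketch of a configuration.yaml fragment listing multiple Hue bridges.
# The IPs are placeholders; newer Home Assistant versions configure
# bridges through the UI rather than YAML.
cat > hue-fragment.yaml <<'EOF'
hue:
  bridges:
    - host: 192.168.1.101
    - host: 192.168.1.102
    - host: 192.168.1.103
    - host: 192.168.1.104
EOF
```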
In order to move the Z-Wave network, I will need to get a Z-Wave dongle working. Luckily, I know a good one from work, the Aeotec Z-Stick. However, as you can see from the photo, it is a rather large dongle; that will become important later. I also have a ZigBee dongle I'd like to get working too, but that is more of a bonus than a requirement. Therefore, the Home Assistant VM will need to have direct control over this USB device. That is where a lot of my struggles on this particular VM came from.
VM Setup and Install
Let's cover the basic setup and install here, like I have for the rest of the VMs. (As before, I have already completed this, but this is for demonstration purposes.) I started with the creation of the VM: I navigated to the Virtual Machine section and clicked to create a new virtual machine, then clicked Next.
Then I gave it a name and selected the Ubuntu Linux (64-bit) operating system. The truth of this deployment is that the operating system isn't that important: I intend to run Home Assistant in a docker container, so I have the flexibility to pick the OS I am most familiar with. I have extensive experience with Fedora, Debian, and Ubuntu, and Ubuntu is my favorite.
Next I selected the storage. Then I selected 4 CPUs and 4 GB of memory. Raspberry Pi 3Bs are quad core; I was just mimicking what they had, though the cores I have here are much more powerful.
Then I reviewed and finalized the VM.
Next, I needed to upload the Ubuntu ISO image. This can be downloaded from Ubuntu's page. They do hide it behind recommended cloud deploys, which I don't like; it should just be straightforward.
I also selected Ubuntu Server instead of Desktop. The Desktop version includes a window manager and a bunch of extra things I don't plan to use; Server is a minimal install that expects the user to select what they actually need. To start, I navigated to the storage page and clicked on the primary storage page. Next I opened the datastore browser, selected upload, navigated to the ISO, and uploaded it. I have already completed this task, which is why the Ubuntu Server ISO is already there.
I then navigated back to the new VM's page and clicked Edit, which brings up the edit page. I clicked on the CD/DVD drive, then clicked on the dropdown, which currently said Host Device, and selected Datastore ISO file. I navigated to the primary storage and selected the Ubuntu install ISO. Next I expanded the CD/DVD section and checked Connect at power on.
The next thing I needed to do was enable passthrough for a USB controller, so I navigated to the PCI devices section and toggled passthrough for one of the USB 3.0 controllers. Then I navigated back to the VM screen, added a new PCI device, and selected the USB controller.
Since I am adding a PCI passthrough device, I need to reserve the memory for this VM. PCI devices have what is called DMA, or Direct Memory Access: the operating system allows a PCI device to write directly to memory. Since policing those writes is not easy for the hypervisor without taking control of the PCI device itself, it makes sense that the memory needs to be reserved from the rest of the system, because a PCI device could be writing to it at any time. I clicked the option to reserve all guest memory.
Then, I tried to start the VM, but it failed.
So I clicked on the information link in the dropdown. It told me that this is not supported. This was my first warning that this was not going to be as simple and straightforward as I had hoped. I went ahead and removed the PCI passthrough device so I could complete the install, then clicked start. (It may be necessary to go back and make sure the CD/DVD Connect at power on checkbox is still checked.)
The installer had me select which keyboard I was using.
It informed me there was an update to the installer available. After my experience staying up all night in New Jersey hacking up the Anaconda installer to get HFT servers upgraded, yes, I definitely want to download the latest.
The screen went blank and finished the download. The next prompt asked me to confirm keyboard layout.
It then asked me to configure a network interface if I needed something special. I didn't, so I selected the default. Next it asked for a proxy address. I have not used a proxy in decades; is this still a thing? I just accepted the default. It then prompted for the package mirror it should use; I accepted the default by hitting enter.
Next it asked me to select an option for formatting the disk. I didn't have special requirements. It defaulted to Use an entire disk, and since the only hard drive I have is the virtual one, it auto-selected /dev/sda. So I just selected Done. Note: this one required navigating down to the selection; I used the spacebar to make selections.
Next it asked for confirmation on the review page for the disk partitions.
Followed by extra confirmation because of data deletion.
Next it prompted me for my name, server name, username, and password.
Then it offered to optionally install the SSH server. I tend to keep this enabled on home labs in case it is not easy to get to a console, so I selected Install OpenSSH Server and continued.
Next it provided a list of optional software to install. Since I intend to run Home Assistant in a docker container, I selected docker, using the arrow keys and spacebar.
Finally it started installation. It does give the option to cancel the update and reboot once the minimum install has been done, but I prefer to be patched up to date when the VM first boots. There are a surprising number of security holes that only appear during initialization and installation, probably because it is a one-time event and thus tends to get overlooked.
Once complete, the system needed to reboot.
Next the VM prompted to remove the installation medium. I just hit enter. It skipped for me; if not, just shut down the VM and remove the CD/DVD.
After it completed booting, it still had startup log information printed over the login prompt. I hit enter to get a clean prompt.
On its first boot it generates SSH host keys before presenting the command line prompt. I won't be posting those here.
From here I won’t be referencing this particular Home Assistant VM again. I just deleted it. This should give everyone an idea of how this install works.
USB Setup Issues
I am now going to go through a lot of the things that I had to try to get the USB configuration working. I want to cover a couple of items up front, as they define the physical characteristics of what I am working with.
First is that there are limited numbers of USB ports on the motherboard. There are two ports on the back panel, and one USB 3.0 header. This means I have a total of four ports and two controllers for passthrough.
As can be seen, with how close these ports are together, I cannot just plug the dongles in directly, which would have been my slight preference. And on another level, I only have four ports, all close together: I cannot plug in both my Z-Wave and ZigBee dongles without physically blocking all available ports. One in the front and one in the back blocks all four.
In addition, as my install shows, I do not seem to be able to enable passthrough of these controllers yet. This appears to be an AMD controller, so I don't expect this state to last long, but I cannot use them yet. This is a complication for the Virtual Reality VM, as I believe that will need a pure passthrough to work right. (Later confirmed, when I tried to use USB passthrough and it did not work at all.)
I also wanted to keep an open USB port for physical troubleshooting. It is sometimes easier to get into the BIOS for configuration and changes, and I also sometimes need to work with the local ESXi interface by plugging directly into an available USB port. So for my final configuration, I think I need to use a USB hub to prevent the dongles from blocking available ports. Even if I wanted to use USB extension cables to avoid this, I would still need a USB controller that is actually capable of passthrough.
So I wanted to do some tests to see if I could verify that buying another USB PCI card would even work. I very much do not like using a PCI slot for something as mundane as a USB card.
First, I plugged in both the Z-Wave and ZigBee dongles into a 4-port USB hub I had. I then tried to add them to the Home Assistant VM.
Strangely I only see the Future Devices FT230X Basic UART. I think this is the ZigBee dongle, or at least the USB hub. However, the Z-Wave dongle is not even an option. After a ton of searching, I cannot find anywhere to query the hypervisor for all of the USB devices it is aware of, not just the ones it has prepared for VM use.
From inside the Home Assistant VM, I still only see the FT230X.
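For reference, these are the kinds of checks I run from inside the guest to see what actually arrived on the virtual USB bus. This is a diagnostic sketch; package names assume Ubuntu Server, and serial dongle names vary by chipset.

```shell
# From inside the guest: enumerate what the virtual USB bus actually sees.
# (lsusb comes from the usbutils package on Ubuntu.)
lsusb 2>/dev/null || echo "lsusb not available: sudo apt install usbutils"
# Serial dongles usually show up as ttyACM* or ttyUSB* device nodes:
ls /dev/ttyACM* /dev/ttyUSB* 2>/dev/null || echo "no USB serial devices found"
```

If the dongle passed through correctly, it should appear in both lists; seeing it in `lsusb` but having no tty node usually points at a missing kernel driver instead of a passthrough problem.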
Checking the PCI devices yields no other USB devices either.
Okay. Here is where I make another minor mistake. I didn’t want to go and purchase a high end USB card without at least verifying that I could expect this to work. If I could see a USB PCI card passthrough configuration boot, and boot with the Z-Wave dongle successfully passing through, I would feel much better about the plan.
I figured that USB, especially USB 2.0, is an extremely mature technology. I should be able to go to a store, buy a USB card, and use that as a test before buying a more expensive long term solution.
I went to my local store and bought the cheapest USB 2.0 PCI card I could. It has a SATA power connector on the back. This is OLD; a modern PCIe slot can provide upwards of 75 watts natively.
Quick aside: this is a big reason why it's difficult to just split a PCI slot. There is only one power budget for all of the lanes, and if the two or more cards plugged in both want 75 W, the slot can't provide the power. This was another solution I was considering, since I wanted to keep a PCI slot free and add a USB controller PCI card, which seems to max out at a 4x slot.
I ended up plugging the USB PCI card in with long SATA power extension cables.
This is… not ideal? But that wasn’t the only warning sign.
When I attempted to power on the server, it did a minor beep, then nothing. That is typical of a power supply issue, like a safety shutdown. I found I could disconnect one of the SATA power extension cables, start the boot, then plug it back in and get the card to show up. I figured this isn't supposed to be the long-term solution anyway; it is just the proof of concept. As long as I can get to the point where I'm seeing the passthrough with the Z-Wave dongle working, I can buy a better card and move on. The card showed up as a new USB controller, the Renesas uPD720202, identifying as USB 3.0, even better. I toggled passthrough.
Then I passed the PCI device through to the Home Assistant VM, and plugged in the hub with the Z-Wave and ZigBee dongles.
I didn't see the Z-Wave dongle again (it looks like I don't have a screenshot for this one, sorry).
The mistake I made was to assume that because USB is a very mature technology, there wouldn't be a compatibility issue with it. USB passthrough is not so simple that I can just expect it to work. After some more research, it looks like VMware has some information about USB passthrough. I should say, USB passthrough and PCI passthrough are not the same thing, but there appears to be a fair bit of overlap: a USB controller that supports USB passthrough is likely also one that works as a PCI USB card for PCI passthrough.
After more research, I found that High Point RocketU USB cards are reported to work by quite a number of people, so I ordered a High Point RocketU 1344A. I think I really just wanted to make some progress, and because of that I had bought a cheap card which didn't even help me prove the point I wanted to make.
After receiving the new card, I plugged it in (it didn't need power from any other connectors) and toggled passthrough on it. I felt it was a little inaccurately marketed: it advertises four dedicated USB 3.1 ports, which I took to mean four independent controllers, each of which could be passed through, but it is only two controllers with four ports.
Then I went to the VM and set up PCI passthrough for the new card.
Next, I booted into the Home Assistant VM, and I see the hub, but I don't see the Z-Wave dongle. This does feel like progress; from where I started, I am at least seeing the VIA Labs hub now.
I thought about it. There is one thing I have always used here: the hub. I had just assumed that because the FT230X was always there, the hub worked fine. I decided to plug the Z-Wave dongle in directly. First, though, I disabled the PCI passthrough again. Then I added a new USB device, and I see it as an option now!
Next, I booted up the VM and saw it available.
Then, I went ahead and toggled passthrough and re-added the PCI device to the Home Assistant VM, checked the USB and:
There it is!
From here I think I have solved this issue to the extent I can expect it to work. USB passthrough should be possible on this dongle, but not through a hub. I am not certain where all of this went wrong.
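One small follow-up worth doing inside the guest once the stick shows up: point Home Assistant at the stable by-id path rather than the raw tty device, since the raw name can shift between boots. The listing below is a sketch; the exact by-id string depends on the stick.

```shell
# USB serial sticks can move between /dev/ttyACM0 and /dev/ttyACM1 across
# reboots; /dev/serial/by-id gives a stable symlink to configure instead.
ls -l /dev/serial/by-id/ 2>/dev/null || echo "no USB serial devices attached"
```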
During debugging, I tried multiple things, including a couple of steps I did not walk through above, such as passing through a keyboard to test with a simpler device; that worked. I had also tried plugging the dongles in without the hub before. I saw the Z-Wave dongle on the Starship USB 3.0 controller once, but it listed itself as disconnected and then disappeared. See below. It was very odd, and from a reproducibility standpoint I had trouble seeing it again, so I wasn't sure how useful it would be to list; I walked through everything I could easily reproduce.
I think overall the following things are true:
- AMD Starship USB 3.0 PCI passthrough support is not yet present. The USB passthrough support seems unreliable.
- The hub I used does not work in all cases. I am not sure whether this is a general rule that applies to all hubs, or just a weird interaction between the Aeotec Z-Stick and this particular hub.
- Buying the cheap uPD720202 was not actually as bad as I thought; it appears to be a reasonable USB controller, save for the failure to boot on power-up. Having only one controller is the deal breaker. I want at least two USB controllers for passthrough, so the High Point RocketU is the better choice for multiple reasons.
I am going to proceed from here with the RocketU card. One of its controllers will be used for the Virtual Reality VM, the other for the Home Assistant VM, with the Aeotec plugged directly into the card. I may buy a small extension cable so I can use the other port for a hub. I will keep the AMD controllers for local USB access. If/when the AMD controllers support PCI passthrough, I may reconfigure this whole setup to reclaim the PCI slot this card is in, as that appears to be a legitimate possibility. For now, though, this looks like it will work for my plans.