I wasn’t really sure whether to write this article. It isn’t very long and may not be worth an entire post all on its own. However, every other major component got its own article, so I am going to include it.
There really isn’t much to say. After figuring out how to set up a VM with a working NVidia consumer card in Virtual Reality VM Part 1, I just repeated that process for the NVidia 2080 Ti I had.
I expect the Streaming VM to be used as the primary media consumption device. However, I also want it to be able to handle light gaming; think Jackbox. To that end, I gave it a base allocation of 16 CPUs, 32 GB of RAM, and 125 GB of disk space. Add in the NVidia 2080 Ti, and the whole system is not too shabby.
There was only one small part that was a bit weird. When I was toggling PCI passthrough on the four PCI components of the NVidia 2080 Ti, the web GUI (Graphical User Interface) got stuck in a weird blinking state.
I selected the first option.
Then it started selecting and deselecting each of the four options in a random order, or at least no order I could see.
And so on. I was not clicking anything; the checkboxes were toggling on their own.
Despite the flickering, the passthrough state appeared to be set correctly when I clicked Toggle Passthrough, though the GUI does lock up while doing this. This appears to be a web GUI bug rather than an actual system issue; I had no problems booting a VM with these devices in passthrough.
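For anyone who hits the same blinking bug, passthrough can also be toggled from the ESXi shell, which sidesteps the web UI entirely. This is a sketch assuming a recent ESXi release where the `esxcli hardware pci pcipassthru` namespace exists; the device address below is a placeholder, so list your own devices first and substitute the real addresses of the card's four PCI functions.

```
# List PCI devices and their current passthrough status
esxcli hardware pci pcipassthru list

# Enable passthrough for one PCI function of the GPU
# (0000:0b:00.0 is a placeholder address; repeat for each of the four functions)
esxcli hardware pci pcipassthru set -d 0000:0b:00.0 -e true
```

A reboot of the host may still be required before the change takes effect, just as with the web UI.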
For this install I used the same 4K TV for the initial setup as with the Virtual Reality VM. I didn’t need to jump through any of the previous hoops, as I had a pretty good idea of how to complete this. I still needed to set
hypervisor.cpuid.v0 = FALSE
before installing the NVidia drivers. I definitely needed to set the TV as the primary display once the drivers were installed. After that it was pretty simple, and the NVidia 2080 Ti was booting correctly. It took a lot less effort this time. I then plugged it into the Just-Add-Power system and it worked fine from the start. The back of my server is starting to look pretty complicated, but everything is working.
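For anyone repeating this, that setting lives in the VM's .vmx file (it can also be added through the VM's advanced configuration parameters in the web UI). A minimal sketch of the relevant entries; the hypervisor.cpuid.v0 line is the one from the article, while the two 64-bit MMIO lines are an assumption on my part, commonly recommended for passing through GPUs with large memory BARs and not something the article itself mentions.

```
# .vmx configuration fragment (the .vmx format requires the quotes)

# Hide the hypervisor from the guest so the NVidia driver will load
hypervisor.cpuid.v0 = "FALSE"

# Assumption, not from the article: often needed for modern GPUs
pciPassthru.use64bitMMIO = "TRUE"
pciPassthru.64bitMMIOSizeGB = "64"
```

The VM must be powered off when editing the .vmx file, or the changes will be overwritten.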
There is one question I suspect people might still be asking: how exactly do I control my streaming server? There are no keyboard or mouse inputs! I spent a lot of time looking into this a few years ago, and I found a great solution. Silex makes a set of USB servers that just plug into the network. I use Silex DS-510s.
These devices have a pair of USB ports and an Ethernet socket. Plug one into the network and it presents its attached USB devices to a piece of software that runs on Windows 10 called SX Virtual Link.
In SX Virtual Link on the Streaming VM itself, I just select these devices, or let it auto-connect them, and voila, I have a keyboard and mouse wherever I have network access. I think they even make a wireless version. In any event, I use four of these so each room that needs to control the Streaming VM can. The screenshot shows what this looks like from the workstation. In a pinch I used this setup for primary work while I was deducing what happened to my Thunderbolt 2 cable.
This is a success on the streaming front. I didn’t even have any issues to really work through. It was just one of the primary servers being replaced.
Now, way back in the server components section I mentioned that the old streaming server was having some blinking issues. Since installing the NVidia 2080 Ti, I am happy to report that the blinking issues are completely gone. However, since I have not been able to acquire either a new Radeon or an NVidia 3000-series card, I moved the old streaming GTX 1080 to the workstation temporarily. The workstation now has a blinking issue, and the color is off on one of the monitors. It seems clear the issue is with this specific card. Hopefully I can get my hands on one of the new cards sometime soon, as this blinking is driving me crazy.
On the Home Lab front though, I think I can put to rest all of the Windows VMs at this point. They are working well and as I wanted. Victory!