So it has been a few months since I finished my CK-37 button box, and since then I have obviously been thinking about my next project to add an array of nice clicky things to my desk. I use an empty 19" rack as a monitor stand, and what would be a better choice than to turn it into an engine/autopilot control panel for my beloved Swedish plane?
Now, you might wonder, what is the giant empty space in the centre for? That's where the HyperPixel 4.0 fits in. My plan is to stream the radar screen from a virtual second monitor via a webserver to the Raspberry Pi attached to the screen. The switches, buttons and LEDs will be connected to one or more Arduinos that talk to the Raspi either over I2C or USB. Inputs will be sent back to the webserver, which in turn acts as a virtual joystick.
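To make the input half of that plan a bit more concrete, here is a rough sketch of what the forwarding script on the Pi could look like if the Arduinos end up on USB serial. Everything in it is an assumption at this point: the serialport and ws packages, the device path, the server address and the "BTN:&lt;id&gt;:&lt;state&gt;" line format are placeholders, not working code from the project.

```javascript
// Hypothetical input path: the Arduino prints one line per switch event over
// USB serial, the Pi forwards it over a WebSocket to the webserver on the
// gaming PC, which would then drive a virtual joystick. Names, ports and the
// line format are all made up for this sketch.
const { SerialPort, ReadlineParser } = require('serialport'); // serialport v10+
const WebSocket = require('ws');

const arduino = new SerialPort({ path: '/dev/ttyUSB0', baudRate: 115200 });
const lines = arduino.pipe(new ReadlineParser({ delimiter: '\n' }));
const server = new WebSocket('ws://gaming-pc.local:8080/inputs'); // assumed address

server.on('open', () => {
  // Forward every "BTN:<id>:<state>" line as-is; the PC side would map it
  // onto virtual joystick buttons.
  lines.on('data', (line) => server.send(line.trim()));
});
```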
The first step now is to get the screen up and running, followed by the easier but more expensive part of ordering all the other hardware.
Getting the screen and the Raspi running was the easy part; getting a picture onto the screen was where it got hairy. Since I don't have a second monitor to export the radar onto, a virtual solution was needed. I would prefer a purely software-based approach, but that would require some driver magic, so I am currently using an HDMI capture stick.
Now, let's get the video from the fake second monitor to the Pi. My first idea was to use OBS to capture the second monitor, relay the RTMP stream to an nginx server, turn it into an HLS stream and display that as a web page on the Pi. If that sounds unnecessarily complicated, it was, and it produced video with an obviously unacceptable 20-second delay. After that I tried a multitude of things: NVIDIA Shield (only worked with the main monitor), Steam Link (same), RDP (no working client on the Pi). That's when I, in all my intellectual might, finally decided to google “screen share web realtime” and discovered WebRTC.
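The nice thing about WebRTC is that the browser already does most of the heavy lifting. The sender side on the gaming PC boils down to something like the sketch below: grab the (fake) second monitor with getDisplayMedia and feed the track into an RTCPeerConnection. The sendToSignalingServer helper is a placeholder for whatever carries the offer/answer exchange, and getUserMedia could also grab the HDMI capture stick directly as if it were a webcam.

```javascript
// Sender sketch (browser on the gaming PC): share the fake second monitor
// over WebRTC. Signaling is only hinted at via a placeholder helper.
const pc = new RTCPeerConnection();

async function startSharing() {
  // The screen picker lets you choose the capture-stick "monitor".
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToSignalingServer({ type: 'offer', sdp: pc.localDescription }); // placeholder
}
```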
Now it was only a matter of setting up a Node.js webserver, copy-pasting tutorial code, writing a simple web page with the video element embedded, cursing because I was getting no video, reading a different tutorial, rewriting everything from scratch, rewriting everything again, finally discovering a dumb mistake I made in the beginning and writing “donkey” on my forehead before I finally got this beautiful result.
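In case anyone wants to reproduce this: the Node.js part does not need to be much more than a signaling relay that shuffles offers, answers and ICE candidates between the gaming PC and the Pi's browser. The sketch below uses the ws package; the port and the broadcast-to-everyone logic are simplifications, not my exact code.

```javascript
// Minimal WebRTC signaling relay: forward every message to every other client.
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', (socket) => {
  socket.on('message', (message) => {
    for (const client of wss.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(message.toString());
      }
    }
  });
});
```

The page on the Pi then basically just does pc.ontrack = (event) => { video.srcObject = event.streams[0]; } and lets the embedded video element autoplay.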
There is at most a 0.5-second delay, and the resolution is great, making it way more readable than the in-game screen. The next step is to learn some more CSS so the video on the Pi looks better and properly fills the screen, before ordering 170€ worth of electronics and aluminum.
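My current guess is that the "properly fills the screen" part comes down to a handful of styles on the video element; something along these lines (applied from the page script here, with an assumed element id) should cover the HyperPixel's 800x480 panel:

```javascript
// Rough idea: strip the body margins and let the video cover the whole
// 800x480 viewport. "radar" is an assumed element id, not from my page.
const video = document.getElementById('radar');
document.body.style.margin = '0';
document.body.style.background = 'black';
video.style.position = 'fixed';
video.style.top = '0';
video.style.left = '0';
video.style.width = '100vw';
video.style.height = '100vh';
video.style.objectFit = 'cover'; // or 'contain' to avoid cropping the radar
```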
That goes way over my head. But good to see you found a solution.
I have a BMS setup with two extra 8-inch monitors to which it is fairly easy to export the MFDs in BMS. Just a matter of dragging the MFD windows to their right positions. I haven't found a way yet to export the Viggen radar to one of these extra monitors. I did find a Lua script suggestion in a DCS forum but couldn't get it working. Is there another, fairly simple, way to do it or does it indeed need a Raspberry to get it done?
If your MFDs are just screens connected to your computer, you don't need any extra hardware. This is the (German) tutorial I used, and I can show you my specific Lua code when I get home in a week.