IRE – Video & audio transmission (Part 3 of 7)

How can I wirelessly transmit 1080p@60fps with low latency? That was one of my biggest questions. Most FPV systems are analog, which results in poor image quality. That’s why I wanted a digital solution. First I thought of using WiFi, but I soon realized that it would have too much latency; some people reported over 150ms. According to Oculus VR, the motion-to-photons latency (the time from moving your head to seeing the matching image) should be under 20ms. So WiFi wasn’t an option. Perhaps it would be possible to get the latency down with a lot of optimization, but I didn’t have the time for that.

Nyrius Aries Pro

After searching for a while, I found the Wireless HDMI standard. Most of these systems work in the 5.1–5.8 GHz band. I decided on the Nyrius Aries Pro, because it promised reasonable range (~40 meters) with <1ms latency. In addition, the transmitter can be powered over USB. Because the transceiver acts like an HDMI cable, it also transmits audio. The downside is that the final image has to be transmitted directly to the Rift, so the video processing has to be done on the robot.

In order to transmit not only vision but also sound, I added a microphone to the onboard computer, which routes its input to the HDMI output. To get the audio to a pair of headphones, I put an HDMI splitter between the receiver and the Rift. The splitter was then connected to a Samsung TV, into which the headphones were plugged. As a bonus, the audience could also see what the Rift wearer sees.

I am quite happy with the Nyrius Aries Pro; it has enough range to cruise around the office. But for an immersive telepresence robot, it will be necessary to find a solution for transmitting the video stream over the internet.

The next article will cover processing the live video of the stereo camera for the Oculus Rift and the audio from the microphone.

If you have any suggestions or questions, please use the comment form. I am always happy to learn something new.

IRE – Control system (Part 6 of 7)

IRE Electronics

The last post covered how the video signal is transmitted. However, a communication channel back to the robot is also necessary; over this channel the signals for the gimbal and the robot’s motors are sent. The smartest solution would have been to transmit the USB of the Oculus Rift and the gamepad directly to the onboard computer. In fact, there are a few Wireless USB systems out there, but they all have high latency. That’s why I decided to process these inputs on a stationary computer and only send control commands to a microcontroller on the robot. But which wireless technology has low latency and a reasonable range?

Bluetooth

A bit of internet research showed that the latency of WiFi and Zigbee is too high. I asked an electrical engineer at my university, and he recommended using Bluetooth modules by the company u-blox, which are optimized for low latency as well as long range. I took his advice and bought two connectBlue OBS421I-26-0 modules and one connectBlue ACC-34. The ACC-34 is just a serial-to-USB adapter to connect the Bluetooth module to the stationary PC.
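Because the ACC-34 shows up on the stationary PC as an ordinary serial port, talking to the robot boils down to opening that port and writing bytes to it. Here is a minimal sketch of how that could look on Windows; the port name ("COM3") and the 115200 baud rate are assumptions and have to match the configuration of the connectBlue modules.

```cpp
// Minimal sketch: open the virtual COM port of the ACC-34 and write a test byte.
// Port name and baud rate are assumptions -- adapt them to your setup.
#include <windows.h>
#include <cstdio>

HANDLE openSerialPort(const wchar_t* portName)
{
    HANDLE port = CreateFileW(portName, GENERIC_READ | GENERIC_WRITE,
                              0, nullptr, OPEN_EXISTING, 0, nullptr);
    if (port == INVALID_HANDLE_VALUE) {
        std::printf("Could not open %ls\n", portName);
        return port;
    }

    DCB dcb = { 0 };
    dcb.DCBlength = sizeof(dcb);
    GetCommState(port, &dcb);
    dcb.BaudRate = CBR_115200;   // assumed; must match the Bluetooth module configuration
    dcb.ByteSize = 8;
    dcb.Parity   = NOPARITY;
    dcb.StopBits = ONESTOPBIT;
    SetCommState(port, &dcb);

    return port;
}

int main()
{
    HANDLE port = openSerialPort(L"\\\\.\\COM3");   // assumed port name
    if (port == INVALID_HANDLE_VALUE) return 1;

    const unsigned char testByte[] = { 0xAA };      // dummy byte just to test the link
    DWORD written = 0;
    WriteFile(port, testByte, sizeof(testByte), &written, nullptr);

    CloseHandle(port);
    return 0;
}
```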

Maestro Mini

To drive the servos of the gimbal and the robot’s motors, I used the fantastic Pololu Mini Maestro 12-Channel board. It has many configuration options, a simple serial protocol, and can output servo (PWM) signals at up to 333 Hz.
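The Maestro’s compact serial protocol really is simple: a Set Target command is just four bytes, with the target pulse width given in quarter-microseconds. The sketch below shows how such a command could be sent over the serial handle opened in the previous sketch; the channel assignment in the usage comment is an assumption.

```cpp
// Minimal sketch: Pololu Maestro "Set Target" command (compact protocol).
// The target is given in quarter-microseconds, so 1500 us = 6000.
#include <windows.h>

void maestroSetTarget(HANDLE port, unsigned char channel, unsigned short quarterMicros)
{
    unsigned char cmd[4] = {
        0x84,                                                     // Set Target (compact protocol)
        channel,                                                  // servo channel (0-11 on the Mini Maestro 12)
        static_cast<unsigned char>(quarterMicros & 0x7F),         // low 7 bits of the target
        static_cast<unsigned char>((quarterMicros >> 7) & 0x7F)   // high 7 bits of the target
    };
    DWORD written = 0;
    WriteFile(port, cmd, sizeof(cmd), &written, nullptr);
}

// Usage: center a gimbal servo on channel 0 (the channel number is an assumption).
// maestroSetTarget(port, 0, 1500 * 4);
```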

Code

The application on the stationary computer is written in C++. With DirectX’s XInput interface it reads the analog stick of the gamepad and converts it into control commands for the robot.
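For illustration, here is a small sketch of how the left analog stick can be read with XInput; the dead-zone handling and scaling are simplified compared to whatever IREController actually does.

```cpp
// Minimal sketch: read the left analog stick of gamepad 0 with XInput.
#include <windows.h>
#include <Xinput.h>
#include <cstdlib>
#pragma comment(lib, "Xinput.lib")

// Returns the left stick X/Y in the range [-1, 1], with the standard dead zone applied.
// Returns false if no controller is connected.
bool readLeftStick(float& x, float& y)
{
    XINPUT_STATE state = { 0 };
    if (XInputGetState(0, &state) != ERROR_SUCCESS)
        return false;

    auto normalize = [](SHORT value) -> float {
        if (std::abs(value) < XINPUT_GAMEPAD_LEFT_THUMB_DEADZONE)
            return 0.0f;                 // ignore small stick drift
        return value / 32767.0f;
    };
    x = normalize(state.Gamepad.sThumbLX);
    y = normalize(state.Gamepad.sThumbLY);
    return true;
}
```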

The Oculus Rift SDK is used to get the head orientation. It returns a quaternion, which can easily be converted to Euler angles. After some correction and tuning, these are used to control the gimbal.
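For reference, the generic quaternion-to-Euler conversion looks roughly like the sketch below. The Oculus SDK offers its own helper for this, and which angle counts as yaw, pitch, or roll depends on the coordinate frame, so treat this only as an illustration.

```cpp
// Minimal sketch: convert a unit quaternion (w, x, y, z) into three Euler angles.
// Which angle maps to yaw/pitch/roll depends on the axis convention of the
// coordinate frame; the labels below follow the common Z-vertical convention.
#include <cmath>

struct Euler { float yaw, pitch, roll; };   // radians

Euler quaternionToEuler(float w, float x, float y, float z)
{
    Euler e;
    // rotation about the Z axis (yaw in a Z-up frame)
    e.yaw   = std::atan2(2.0f * (w * z + x * y), 1.0f - 2.0f * (y * y + z * z));
    // rotation about the Y axis, clamped to avoid NaN near the poles
    float s = 2.0f * (w * y - z * x);
    s       = s > 1.0f ? 1.0f : (s < -1.0f ? -1.0f : s);
    e.pitch = std::asin(s);
    // rotation about the X axis
    e.roll  = std::atan2(2.0f * (w * x + y * z), 1.0f - 2.0f * (x * x + y * y));
    return e;
}
```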

You will find the whole code on GitHub. The control system’s code is in the project IREController.

Goodie: Commander Control

Because we also wanted to use the system with a broad audience, I needed a handy control to pause the system before someone damages it. Since I didn’t want to be bound to a keyboard, I used a standard wireless presenter. The presenter just sends normal key commands, which I only had to map to the corresponding functions.
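In code this can be as simple as polling for the presenter’s key codes and toggling a pause flag. Typical presenters send Page Up/Page Down; that these were the keys used here is an assumption.

```cpp
// Minimal sketch: poll for the presenter's keys and toggle a pause flag.
// Page Down / Page Up as the mapped keys is an assumption.
#include <windows.h>

bool g_paused = false;

void pollPresenterKeys()
{
    // GetAsyncKeyState's most significant bit is set while the key is held down.
    if (GetAsyncKeyState(VK_NEXT) & 0x8000)    // "next slide"     -> pause the robot
        g_paused = true;
    if (GetAsyncKeyState(VK_PRIOR) & 0x8000)   // "previous slide" -> resume
        g_paused = false;
}
```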

The next article will be the last one. It contains my conclusions on the Open Day and the project.

If you have any suggestions or questions, please use the comment form. I am always happy to learn something new.