Low-latency multi-camera video system for teleoperation

Teleoperation is a vital technology that makes it possible to operate machines in conditions unsuitable for humans. Some tasks remain impossible or carry great risk to people: demining, working on steep slopes, diving to great depths, working in airless spaces or in aggressive environments. The obvious way to address such problems is to physically remove the operator from the machine's work site. An engineering system that allows the operator to control remote machinery is called a teleoperation system.

It is often said that humans receive more than 90% of their information through vision. Delays, low resolution, low frame rate, and poor exposure and white balance are the major problems of visual feedback in remote-controlled systems.

For effective teleoperation, the remote operator must feel as if they were working directly in the cockpit. The operator must see everything that happens around the remote cabin in real time and in a way that is comfortable for a human.

MRTech has developed a unique low-latency image-processing solution that is integrated with the Robotic Systems Lab's (ETH Zurich) teleoperation platform for HEAP (Hydraulic Excavator for an Autonomous Purpose), built in collaboration with Menzi Muck AG, the Swiss excavator manufacturer.

HEAP side view

HEAP top-down view

Operator at the workplace

MRTech's low-latency system for a wide range of unmanned vehicles runs on the NVIDIA Jetson TX2 platform and is built around XIMEA industrial cameras and the XIMEA carrier board for the NVIDIA Jetson TX2 (a minimal capture sketch follows the list):

  • 2x XIMEA xiX PCIe cameras, 3.1 Mpix (Sony IMX252, 1/1.8”, 2064×1544), 122 fps
  • NVIDIA Jetson TX2 + XIMEA Phoxy (xEC2) carrier board
  • Theia MY125M ultra-wide-angle lenses
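To make the capture path more concrete, below is a minimal sketch of how two such cameras could be opened with XIMEA's xiAPI. It is an illustration only, not MRTech's production code; the device indices, RAW8 format and 8 ms exposure are assumed placeholder values.

```cpp
// Minimal xiAPI capture sketch (illustrative, not the HEAP production code).
#include <m3api/xiApi.h>   // <xiApi.h> on Windows
#include <cstring>
#include <cstdio>

int main() {
    HANDLE cam[2] = {nullptr, nullptr};
    for (int i = 0; i < 2; ++i) {
        if (xiOpenDevice(i, &cam[i]) != XI_OK) {
            std::printf("camera %d not found\n", i);
            return 1;
        }
        // RAW output keeps the full sensor data for GPU-side debayering.
        xiSetParamInt(cam[i], XI_PRM_IMAGE_DATA_FORMAT, XI_RAW8);
        xiSetParamInt(cam[i], XI_PRM_EXPOSURE, 8000);   // 8 ms, placeholder value
        xiStartAcquisition(cam[i]);
    }

    XI_IMG img;
    std::memset(&img, 0, sizeof(img));
    img.size = sizeof(XI_IMG);

    // Grab one frame from each camera; a real pipeline would loop and hand
    // the buffers straight to the GPU for processing.
    for (int i = 0; i < 2; ++i) {
        if (xiGetImage(cam[i], 1000 /* ms timeout */, &img) == XI_OK)
            std::printf("camera %d: %dx%d frame\n", i, (int)img.width, (int)img.height);
    }

    for (int i = 0; i < 2; ++i) {
        xiStopAcquisition(cam[i]);
        xiCloseDevice(cam[i]);
    }
    return 0;
}
```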

The MRTech low-latency solution is developed in CUDA and runs on any NVIDIA GPU. This makes it extremely flexible for building a wide range of machine vision solutions, from very lightweight embedded systems (limited in space and power consumption) to extremely powerful systems with high-resolution cameras (60 Mpix and more) processed on high-performance graphics processors.
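As an illustration of what one GPU stage of such a pipeline can look like, the sketch below applies per-channel white-balance gains to an RGB frame with a small CUDA kernel. The kernel, launch configuration, frame size and gain values are assumptions for illustration; the actual MRTech pipeline is not published here.

```cpp
// Illustrative CUDA stage (simplified, not the MRTech implementation).
// Applies per-channel white-balance gains to an interleaved RGB8 image
// that a debayering stage would have produced on the GPU.
#include <cuda_runtime.h>
#include <cstdint>

__global__ void whiteBalance(uint8_t* rgb, int numPixels,
                             float gainR, float gainG, float gainB) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numPixels) return;
    float r = rgb[3 * i + 0] * gainR;
    float g = rgb[3 * i + 1] * gainG;
    float b = rgb[3 * i + 2] * gainB;
    rgb[3 * i + 0] = static_cast<uint8_t>(fminf(r, 255.0f));
    rgb[3 * i + 1] = static_cast<uint8_t>(fminf(g, 255.0f));
    rgb[3 * i + 2] = static_cast<uint8_t>(fminf(b, 255.0f));
}

// Launch helper: processes one 2064x1544 frame already resident in GPU
// memory, keeping data on-device to avoid copies that would add latency.
// The gain values are placeholders, not calibrated settings.
void runWhiteBalance(uint8_t* devRgb, cudaStream_t stream) {
    const int numPixels = 2064 * 1544;
    const int block = 256;
    const int grid = (numPixels + block - 1) / block;
    whiteBalance<<<grid, block, 0, stream>>>(devRgb, numPixels, 1.8f, 1.0f, 1.5f);
}
```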

The visual feedback is provided via four 2D monitors that display live camera streams from three different perspectives: the front window and the two side windows. As an extension, the monitors can be replaced with a head-mounted display (HMD) to completely immerse the excavator operator in a remote visual projection of the excavator's actual site. The glass-to-glass latency is about 50 ms, which gives the operator the feeling of being in the cabin.
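One way to keep an eye on that latency budget in software is to timestamp every frame at capture and log the elapsed time when the frame is handed off for display. The helper below is a hypothetical sketch of such bookkeeping; it is not part of the described system, and it covers only the software portion of the glass-to-glass path.

```cpp
// Hypothetical per-frame latency bookkeeping (not part of the shipped system).
#include <chrono>
#include <cstdio>

struct FrameTiming {
    std::chrono::steady_clock::time_point captured;
};

// Call when the frame arrives from the camera driver.
FrameTiming onCapture() {
    return FrameTiming{std::chrono::steady_clock::now()};
}

// Call right after the processed frame is handed to the display output.
void onDisplay(const FrameTiming& t) {
    using namespace std::chrono;
    auto elapsed = duration_cast<microseconds>(steady_clock::now() - t.captured);
    // Note: this measures capture-to-display hand-off only; true glass-to-glass
    // latency also includes sensor exposure/readout and monitor response,
    // which need an external measurement (e.g. filming an LED and the screen).
    std::printf("pipeline latency: %.1f ms\n", elapsed.count() / 1000.0);
}
```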

SRF video (in German): "Remote-controlled excavator in use at Axenstrasse UR"

28.08.2020, SRF Schweizer Radio und Fernsehen (Zweigniederlassung der Schweizerischen Radio- und Fernsehgesellschaft)

ETH Zurich has developed an excavator that can remove rocks from dangerous areas by remote control. It is currently in use on the Axenstrasse between Brunnen SZ and Flüelen UR.

The Robotic Systems Lab designs machines, develops actuation principles, and builds control technologies for autonomous operation in challenging environments.
