RSL teleoperation system

IFF SDK-based image processing pipeline in teleoperation


The MRTech IFF SDK is a software development kit that allows developers to create low-latency image and video processing applications on a GPU. The IFF SDK is built around a pipeline concept that enables the development of sophisticated, high-performance image processing systems for machine vision.

A unique computer vision application for a teleoperation system of a customized Menzi Muck M545 walking excavator has been developed at the Robotic Systems Lab (RSL) at ETH Zurich.

The unmanned legged machine has been designed to perform dangerous operations without putting an operator in harm's way. The operator sits in a separate cabin at a safe distance and controls the machine while receiving visual feedback on what is happening at the machine.

Low latency is essential for remote control systems, as it makes controlling a remote vehicle easier and faster. Overall latency is the sum of transmission delays and processing delays. Four camera streams (about 50 megabits per second on average in total) are transmitted from the excavator to the operator station over a 5G network, which provides high bandwidth and minimizes network latency. To minimize image processing delays, the researchers at the RSL built the front end of their computer vision system using the IFF SDK.
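A quick back-of-the-envelope calculation puts the stated figures in perspective. The sketch below assumes the ~50 Mbit/s average is shared evenly across the four streams and assumes a 30 fps frame rate (the frame rate is not stated in the article):

```python
# Rough latency/bandwidth budget for the four camera streams.
# Assumptions (not from the article): 30 fps, even bitrate split.

TOTAL_BITRATE_MBPS = 50   # average for all four streams combined (from the article)
STREAMS = 4
FPS = 30                  # assumed frame rate

per_stream_mbps = TOTAL_BITRATE_MBPS / STREAMS
bits_per_frame = per_stream_mbps * 1e6 / FPS
frame_interval_ms = 1000 / FPS

print(f"per-stream bitrate: {per_stream_mbps:.1f} Mbit/s")
print(f"average compressed frame: {bits_per_frame / 8 / 1024:.0f} KiB")
print(f"frame interval: {frame_interval_ms:.1f} ms")
```

Under these assumptions each compressed frame is only ~50 KiB, so on a 5G link the transmission delay per frame is small compared with the 33 ms frame interval; processing delay on the GPU side then becomes the dominant term the pipeline has to minimize.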

Four XIMEA industrial cameras are mounted on the remote excavator: 1x 12 MP PCIe front-facing camera, 2x 3 MP USB3 side-view cameras and 1x 3 MP USB3 arm-view camera. The cameras are connected to an industrial PC with an Nvidia RTX A400 GPU. An IFF SDK-based pipeline processes the raw data received from the cameras and streams four RGB8/12 video feeds to the teleoperation side. The teleoperation module reads images directly from GPU memory and performs the specialized operations required for further processing in teleoperation vision systems.
In addition, the IFF SDK provides programmers with a mechanism for online camera configuration and manual adjustment (if necessary) of parameters such as exposure, white balance, gain, and FPS.

The following figure illustrates how the teleoperation system processes camera streams.


The teleoperation setup runs on an Nvidia RTX A400 GPU and is based on four XIMEA industrial cameras:

  • 3x USB3 XIMEA xiC 3.1MP Side-Arm view cameras with Kowa lenses, focal length: 3.5mm
  • 1x PCIe XIMEA xiX 12 MP Front facing camera with a Kowa lens, focal length: 6.5mm

The live camera streams are displayed on three monitors on the operation platform, giving the remote operator the feeling of working directly in the cockpit.

SDK basic functional requirements:

  • Very low glass-to-glass latency
  • Reliability
  • Clear and simple development tools

The IFF SDK enables building any type of image pipeline. Simple and common pipelines such as Image Acquisition -> Color Preprocessing -> Operation Algorithms -> Encoding -> Streaming -> Receiving Images -> Rendering do not require much software development, and the application built by the RSL researchers is a good example of that.

The IFF SDK pipeline description language is human-readable text, so it did not take much effort to write a small program and edit the JSON configuration file to build the desired image processing pipeline.
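To illustrate the idea, a pipeline description of this kind might look roughly like the JSON fragment below. Note that the element types and parameter names here are illustrative placeholders, not the IFF SDK's actual identifiers; consult the SDK documentation for the real schema:

```json
{
  "chains": [
    {
      "id": "front_camera_chain",
      "elements": [
        { "id": "cam",    "type": "camera_acquisition",  "exposure_us": 10000, "fps": 30 },
        { "id": "proc",   "type": "color_preprocessing", "white_balance": "auto" },
        { "id": "enc",    "type": "encoder",             "codec": "h264", "bitrate_mbps": 12 },
        { "id": "stream", "type": "network_stream",      "port": 5000 }
      ]
    }
  ]
}
```

The appeal of such a text-based description is that the pipeline topology and per-element parameters (exposure, white balance, bitrate, and so on) can be changed without recompiling the application.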

The teleoperation module is a substantial application with its own processing, streaming, encoding, rendering, image post-processing, artificial intelligence and other subsystems, each running in its own thread, so that individual parts can be upgraded and additional blocks can be added in a modular way. The IFF SDK-based application is a fundamental element of the project, as it provides the entire system with its input data, that is, high-resolution images processed in near real time.
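The modular threading design described above can be sketched generically: each subsystem runs in its own thread and exchanges frames through queues, so a stage can be replaced or a new one inserted without touching the rest. This is an illustrative sketch, not RSL's actual code, and the stage functions here are toy placeholders:

```python
# Minimal sketch of a modular, threaded frame pipeline:
# each stage is a thread reading from an input queue and
# writing to an output queue; a None sentinel shuts it down.
import threading
import queue

def stage(inbox, outbox, transform):
    """Generic pipeline stage: read a frame, process it, pass it on."""
    while True:
        frame = inbox.get()
        if frame is None:          # sentinel: stop and propagate shutdown
            if outbox is not None:
                outbox.put(None)
            break
        if outbox is not None:
            outbox.put(transform(frame))

# Two toy stages standing in for e.g. post-processing and rendering.
q_in, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
threads = [
    threading.Thread(target=stage, args=(q_in, q_mid, lambda f: f * 2)),
    threading.Thread(target=stage, args=(q_mid, q_out, lambda f: f + 1)),
]
for t in threads:
    t.start()

for frame in (1, 2, 3):           # stand-ins for camera frames
    q_in.put(frame)
q_in.put(None)

results = []
while (item := q_out.get()) is not None:
    results.append(item)
for t in threads:
    t.join()
print(results)
```

Because stages only agree on the queue interface, swapping one implementation for another (or adding a new block between two existing ones) is a local change, which is the upgradeability the modular design is after.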


The Robotic Systems Lab investigates the development of machines and their intelligence to operate in rough and challenging environments. With a large focus on robots with arms and legs, our research includes novel actuation methods for advanced dynamic interaction, innovative designs for increased system mobility and versatility, and new control and optimization algorithms for locomotion and manipulation. In search of clever solutions, we take inspiration from humans and animals with the goal of improving the skills and autonomy of complex robotic systems to make them applicable in various real-world scenarios.