BRL Researchers Demonstrate Cyber Security Risks Against Teleoperated Robots

Picture credit: Matt Hagen

A team of BRL researchers recently demonstrated that next-generation teleoperated surgical robots are vulnerable to cyber attacks. Their research, described in a recently published arXiv paper, comes at a time when medical robot sales are increasing by 20% per year, and it underscores the need to consider cyber security risks and defenses not just for laptops, desktops, and web servers, but for any form of computing technology, including cyber-physical systems like robots.

The research was conducted with the Raven II, a next-generation teleoperated robotic system designed to support research in advanced techniques of robotic-assisted surgery. Raven is an open-source surgical robotic system that was originally developed by Blake Hannaford (UW – Dept. EE) and Jacob Rosen (formerly at UW, now at UCLA – Dept. MAE) along with their students. The development of the original system was funded in part by grants from the US Army (MRMC) and the National Science Foundation (NSF). Raven II is currently manufactured and sold by Applied Dexterity, Inc. It is the first experimental platform in surgical robotics capable of supporting software development, experimental testing, and medical (surgical) training. As an open-source research platform, it is used by researchers to explore the boundaries of this technology as part of an effort to extend its capabilities. Raven II, in its current configuration, is not used clinically and is not FDA approved.

The current generation of surgical robots typically communicates over dedicated channels rather than publicly available networks, which makes some of the presented attacks harder to mount. That said, we as a society should always strive toward better, safer, more secure, and more privacy-protecting systems, including those used in safety-critical medical applications.

As summarized in a response article by Sophos: “The paper does go on to make a number of recommendations for minimum safety features for remotely operated robots, whether they’re surgery bots making incisions or robots remotely controlled as they carry out military operations, as do drones doing surveillance work or dropping bombs, or remote-controlled mobile land robots that carry equipment, shoot weapons, and defuse bombs.

Those recommendations should sound familiar; so familiar that we’ll repeat them as requirements:

  • An eavesdropper should not be able to work out what the robot is up to.
  • Only authorized operators should be able to command the robot.
  • Interlopers should not be able to modify commands sent to the robot.”
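Those three requirements map onto standard security properties: confidentiality of the command stream, authentication of the operator, and integrity of commands in transit. As a rough illustration only (this is not the mechanism proposed in the paper, nor the protocol the Raven II actually uses, and the packet layout and names below are invented for the example), the following Python sketch shows how a pre-shared key, authenticated encryption, and sequence numbers could provide all three for teleoperation command packets:

import os
import struct

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Shared secret, provisioned out of band to the operator console and the robot.
# (Illustrative only; real key management is a separate problem.)
KEY = AESGCM.generate_key(bit_length=256)

def seal_command(seq: int, command: bytes, key: bytes = KEY) -> bytes:
    """Encrypt and authenticate one command packet (confidentiality + integrity)."""
    nonce = os.urandom(12)                 # unique per packet
    header = struct.pack(">Q", seq)        # sequence number: authenticated, not secret
    ciphertext = AESGCM(key).encrypt(nonce, command, header)
    return header + nonce + ciphertext

def open_command(packet: bytes, last_seq: int, key: bytes = KEY) -> tuple[int, bytes]:
    """Verify and decrypt a packet; tampering raises InvalidTag, replays are rejected."""
    header, nonce, ciphertext = packet[:8], packet[8:20], packet[20:]
    command = AESGCM(key).decrypt(nonce, ciphertext, header)  # fails on any modification
    seq = struct.unpack(">Q", header)[0]
    if seq <= last_seq:                    # stale or replayed packet
        raise ValueError("replayed or out-of-order command")
    return seq, command

# Example: a single (made-up) joint command crossing an untrusted network.
packet = seal_command(seq=1, command=b"MOVE joint=3 vel=0.02")
seq, cmd = open_command(packet, last_seq=0)

In practice, key provisioning, sequence management across reconnects, and the latency constraints of real-time teleoperation would all need careful treatment; the sketch shows only the cryptographic core.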

The paper’s authors are Ph.D. students Tamara Bonaci, Jeffrey Herron, and Junjie Yan; Electrical Engineering Professor Howard Chizeck; Computer Science & Engineering Professor Tadayoshi Kohno; and recent CSE graduate Tariq Yusuf. Their work is funded by an NSF grant.
