Using Kinect and a Haptic Interface for Implementation of Real-Time Virtual Fixture

Ryden, F. and Chizeck, H.J. and Kosari, S.N. and King, H.H. and Hannaford, B. (2011) Using Kinect and a Haptic Interface for Implementation of Real-Time Virtual Fixture. In: Robotics: Science and Systems, Workshop on RGB-D: Advanced Reasoning with Depth Cameras, Los Angeles, USA.

Haptic virtual fixtures are a potential tool for improving the safety of robotic and telerobotic surgery. They can “push back” on the surgeon to prevent unintended surgical tool movements into protected zones. Previous work has suggested generating virtual fixtures from preoperative images such as CT scans; however, these are difficult to establish and register in dynamic environments. This paper demonstrates automatic generation of real-time haptic virtual fixtures using a low-cost Xbox Kinect™ depth camera connected to a virtual environment. This allows generation of virtual fixtures and calculation of haptic forces, which are then passed on to a haptic device. The paper demonstrates that haptic forces can be successfully rendered from real-time environments containing both non-moving and moving objects. This approach has the potential to generate virtual fixtures from the patient in real time during robotic surgery.
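The core idea in the abstract — a protected zone derived from depth-camera points that "pushes back" on the tool — can be illustrated with a minimal sketch. This is not the authors' implementation; the function name, the spring-based force law, and all parameter values are assumptions chosen to show the general technique of rendering a repulsive force when the haptic tool tip penetrates a standoff radius around the nearest point of a point cloud.

```python
import math

def virtual_fixture_force(tool, cloud, radius=0.02, stiffness=200.0):
    """Hypothetical sketch of forbidden-region virtual fixture rendering.

    tool  : (x, y, z) haptic tool-tip position in metres
    cloud : iterable of (x, y, z) points from a depth camera
    radius: standoff distance of the protected zone, in metres (assumed)
    stiffness: spring constant in N/m (assumed)
    Returns a force vector (fx, fy, fz) in newtons.
    """
    # Find the point in the cloud closest to the tool tip.
    nearest, d_min = None, float("inf")
    for p in cloud:
        d = math.dist(tool, p)
        if d < d_min:
            nearest, d_min = p, d
    # Outside the protected zone (or degenerate case): no force.
    if nearest is None or d_min >= radius or d_min == 0.0:
        return (0.0, 0.0, 0.0)
    # Linear spring pushing the tool back out, directed along the
    # tool-to-point vector (a crude stand-in for the surface normal).
    scale = stiffness * (radius - d_min) / d_min
    return tuple(scale * (t - p) for t, p in zip(tool, nearest))
```

In a real system this would run inside the haptic servo loop (typically around 1 kHz) against each new depth frame; the sketch only conveys the geometric idea of a proximity-triggered repulsive force.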

Item Type: Conference or Workshop Item (Paper)
Subjects: D Haptics
Divisions: Department of Electrical Engineering
Depositing User: Brady Houston
Date Deposited: 14 Jul 2015 18:40
Last Modified: 14 Jul 2015 18:40
