
A novel clinical application for haptics

12 September 2011

A group of postgraduate students from the School of Mechanical Engineering at the University of Leeds has developed a system to measure and simulate the forces perceived by a surgeon during palpation (examination through touch) in robot-assisted surgery. The technique has considerable potential in the area of tumour detection.

Recent years have seen a transfer of surgical procedures from traditional open surgery to minimally invasive surgery (MIS), and more recently, to robot-assisted laparoscopic surgery. These have shown significant benefits over open surgery, but the lack of direct physical contact has resulted in the loss of haptic (force and touch) feedback, which is required for assessing tissue features through palpation.

A team of postgraduate students at the University of Leeds has developed a simulation system that delivers haptic feedback to a user during a virtual MIS palpation exercise. Potential applications for the system include surgical training and further development into a master/slave palpation device. The long-term goal is to overcome the drawbacks of new technology in surgery, particularly for the detection and improved resection accuracy of tumours through palpation.

Essential to this project was the ready availability of off-the-shelf hardware I/O, third-party hardware interfacing, virtual graphics, and custom data handling and processing. The team decided it could achieve all of this functionality using National Instruments (NI) LabVIEW and NI CompactDAQ, delivering inherent compatibility between the various project functions.

To simulate the palpation of human tissue, LabVIEW was used to create a virtual environment in which the user is presented with a probe and tissue sample within a patient’s abdomen. Haptic interaction with the virtual environment is provided through the use of a haptic device. LabVIEW was also used to control a custom built physical testing environment, where silicone tissue models were palpated with a force sensing probe.

The physical tests were primarily performed to validate the data obtained from finite element analysis (FEA); in addition, establishing communication between the physical testing environment and the haptic device offered an opportunity to explore the system's remote palpation capabilities. The response forces that are provided to the user in the LabVIEW virtual environment were determined using FEA.

In order to measure response forces from silicone tissue models during palpation, the team developed a tri-axial Cartesian robotic system capable of moving an instrumented palpation probe relative to the tissue models. Using LabVIEW and CompactDAQ, progressing from concept to solution was accomplished in a matter of weeks. The system produces response surfaces of tissue models by recording force measurements during palpation at specified in-plane positions.
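The scan described above can be sketched as a simple raster over a grid of in-plane positions. This is an illustrative Python analogue of the procedure, not the team's LabVIEW code; `measure_force` is a hypothetical stand-in for the instrumented probe's reading, and the stiffness value and grid spacing are assumptions.

```python
def measure_force(x, y, depth):
    # Placeholder for the force-sensor reading at a given indentation.
    # A stiffer inclusion (e.g. a simulated tumour) would raise this value.
    base_stiffness = 0.5  # N/mm, assumed nominal tissue stiffness
    return base_stiffness * depth

def palpation_scan(x_positions, y_positions, depth):
    """Return a response surface: (x, y) -> peak response force in newtons."""
    surface = {}
    for x in x_positions:
        for y in y_positions:
            # Move probe to (x, y), indent to 'depth', record the force.
            surface[(x, y)] = measure_force(x, y, depth)
    return surface

grid = [0.0, 5.0, 10.0]  # mm, assumed palpation resolution
surface = palpation_scan(grid, grid, depth=3.0)
```

Plotting the recorded forces against the in-plane positions yields the response surface of the tissue model.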

CompactDAQ offered a quick and elegant method of sending signals to the motor controllers and allowed the position and force measurements to be recorded. The system was programmed to run autonomously using a LabVIEW state machine architecture and allowed parameters such as indentation depth and palpation resolution to be adjusted from the front panel.
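The autonomous behaviour rests on a state machine that alternates between moving the stage and palpating, a pattern LabVIEW expresses as a case structure inside a while loop. The Python sketch below re-expresses that pattern; the state names and transitions are illustrative, not the team's actual design.

```python
def run_palpation_cycle(positions):
    """Visit each in-plane position, palpating at each one."""
    log = []
    state = "MOVE"
    idx = 0
    while state != "DONE":
        if state == "MOVE":
            # Command the Cartesian stage to the next in-plane position.
            log.append(("move", positions[idx]))
            state = "PALPATE"
        elif state == "PALPATE":
            # Indent the probe and record the force (placeholder action).
            log.append(("record", positions[idx]))
            idx += 1
            state = "MOVE" if idx < len(positions) else "DONE"
    return log
```

Parameters such as indentation depth and palpation resolution would simply be inputs to this loop, which is what makes them adjustable from the LabVIEW front panel.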

Haptic surgical system
In order to simulate the visual and haptic aspects of palpation during surgery, the team created a bespoke Dynamic Link Library (DLL) to interface with the haptic device (a PHANToM Omni from SensAble Technologies). This allows two-way communication between LabVIEW and the OpenHaptics API, for example to measure the device end-effector position and to implement forcing through the device.

The ‘Call Library Function Node’ is used to export and import data to and from the DLL, enabling the system parameters to be set up. This allowed the team to access the device’s functions and build ready-made subVIs, enabling the rapid and easy creation of flexible haptic scenes without having to access the low-level device functions.
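For readers outside LabVIEW, calling into such a DLL looks much like Python's `ctypes` foreign-function interface. The sketch below is a rough analogue of what the Call Library Function Node does; the library name and the exported function signature are hypothetical stand-ins for the team's bespoke DLL, not its real exports.

```python
import ctypes

def load_haptics(path="omni_haptics.dll"):
    """Load the haptics DLL and declare its (assumed) exports.

    The 'get_position' signature below is hypothetical: three pointers
    to doubles for x, y, z, returning an integer status code.
    """
    lib = ctypes.CDLL(path)
    lib.get_position.argtypes = [ctypes.POINTER(ctypes.c_double)] * 3
    lib.get_position.restype = ctypes.c_int
    return lib
```

Declaring the argument and return types up front is the `ctypes` equivalent of configuring the Call Library Function Node's parameters, after which the exported functions can be wrapped as reusable helpers, much like the team's ready-made subVIs.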

Forces are generated by sending pre-determined forcing variables to the DLL from LabVIEW, which are then implemented dynamically using a Gaussian function to generate a force in a haptic control loop operating at 1 kHz. A stiffness function is then used to adjust the force as a function of the indentation depth. This results in the generation of high-fidelity haptic feedback giving smooth forcing during tissue interaction.
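An illustrative reconstruction of that forcing scheme (not the team's exact formulation) is shown below: a Gaussian profile centred on the tissue feature gives smooth spatial forcing, scaled by a stiffness term that grows with indentation depth. The peak force, width, and stiffness constants are assumed values; in the real system this calculation runs inside the 1 kHz haptic loop.

```python
import math

def haptic_force(r, depth, peak=2.0, sigma=4.0, k=0.3):
    """Force (N) at lateral distance r (mm) from the feature centre,
    for a given indentation depth (mm). peak, sigma, k are assumed."""
    # Gaussian spatial profile: smooth fall-off away from the feature.
    spatial = peak * math.exp(-(r ** 2) / (2 * sigma ** 2))
    # Stiffness term: force rises with indentation depth.
    stiffness = k * depth
    return spatial * stiffness

# At the feature centre with 2 mm indentation:
f = haptic_force(0.0, 2.0)
```

Because both terms vary continuously with position and depth, the rendered force has no abrupt steps, which is what gives the smooth forcing the team describes during tissue interaction.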

This short article is based on a paper written by James Chandler, Matthew Dickson, Earle Jamieson, Thomas Mueller, Thomas Reid, Dr Peter Culmer and Dr Rob Hewson, School of Mechanical Engineering, University of Leeds
 

