May 8, 2008 - The day may be getting a little closer when robots will perform surgery on patients in dangerous situations or in remote locations, such as on the battlefield or in space, with minimal human guidance, according to engineers at Duke University.
These engineers believe that the results of feasibility studies conducted in their laboratory represent the first concrete steps toward achieving autonomous robotic surgeries. More immediately, they said, the technology could make certain contemporary medical procedures safer for patients.
For their experiments, the engineers started with a rudimentary tabletop robot whose “eyes” were a 3D ultrasound technology developed in the Duke laboratories. An artificial intelligence program served as the robot’s “brain,” taking in real-time 3D images, processing them and giving the robot specific commands to perform.
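The sense-process-command loop the article describes can be sketched in a few lines. This is a hypothetical illustration, not Duke's actual software: the function names (`locate_target`, `move_toward`, `control_loop`) and the simple proportional-step motion rule are assumptions made for the example.

```python
# Hypothetical sketch of an image-guided control loop: the "brain" reads a
# 3D image, finds the target, and commands the robot to move toward it.
# All names and the motion rule are illustrative, not Duke's actual system.

def locate_target(volume):
    """Stand-in for image processing: extract the target's (x, y, z)
    position from a 3D ultrasound volume (here, a simple dict)."""
    return volume["target"]

def move_toward(tip, target, step=0.5):
    """Command a move: advance the needle tip a fixed fraction of the
    remaining distance to the target."""
    return tuple(t + step * (g - t) for t, g in zip(tip, target))

def control_loop(tip, volume, tolerance=0.01, max_iters=100):
    """Repeat: image -> locate target -> issue a small motion command,
    until the tip is within `tolerance` of the target."""
    for _ in range(max_iters):
        target = locate_target(volume)   # "brain" interprets the 3D image
        error = sum((g - t) ** 2 for t, g in zip(tip, target)) ** 0.5
        if error < tolerance:            # close enough: stop moving
            return tip
        tip = move_toward(tip, target)   # send command to the robot arm
    return tip

# Example run: needle tip starts at the origin, target at (10, 5, 2) mm.
final = control_loop((0.0, 0.0, 0.0), {"target": (10.0, 5.0, 2.0)})
```

In a real system, `locate_target` would be the hard part (segmenting a needle tip or lesion out of noisy ultrasound data), and the motion command would go through the robot's kinematics rather than a direct position update.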
“In a number of tasks, the computer was able to direct the robot’s actions,” said Stephen Smith, director of the Duke University Ultrasound Transducer Group and senior member of the research team. “We believe that this is the first proof-of-concept for this approach. Given that we achieved these early results with a rudimentary robot and a basic artificial intelligence program, the technology will advance to the point where robots – without the guidance of the doctor – can someday operate on people.”
The results of a series of experiments in which the robot system directed catheters inside synthetic blood vessels were published online in the journal IEEE Transactions on Ultrasonics, Ferroelectrics and Frequency Control. A second study, published in April in Ultrasonic Imaging, demonstrated that the autonomous robot system could successfully perform a simulated needle biopsy.
Advances in ultrasound technology have made these latest experiments possible, the researchers said, by generating detailed, moving 3D images in real time.
In the latest experiment, the robot successfully performed its main task of directing a needle on the end of the robotic arm to touch the tip of another needle within a blood vessel graft. The robot’s needle was guided by a tiny 3D ultrasound transducer, the “wand” that collects the 3D images, attached to a catheter commonly used in angioplasty procedures.
“The robot was able to accurately direct needle probes to target needles based on the information sent by the catheter transducer,” said John Whitman, a senior engineering student in Smith’s laboratory and first author on both papers. “The ability of the robot to guide a probe within a vascular graft is a first step toward further testing the system in animal models.”
For more information: www.duke.edu
© Copyright Wainscot Media. All Rights Reserved.