Robot Surgeries One Step Closer
For those of us fearing eventual domination by our robotic masters (I’m getting in early with the sucking up), the latest research out of Duke University does not bode well. Engineers who have just completed a feasibility study in their laboratory have concluded that they have taken the first real steps towards creating robots that will perform surgery on patients.
Such advances could one day provide robots that perform surgery in dangerous or remote locations, such as battlefields, or perhaps even in space. However, there are also more immediate gains from the Duke engineering studies.
Their experiments focused on a rudimentary tabletop robot, which used a 3-D ultrasound technology developed in the Duke laboratories as its eyes. An AI program acted as the robot’s brain by taking real-time 3-D information, processing it, and then giving the robot specific commands to perform.
“In a number of tasks, the computer was able to direct the robot’s actions,” said Stephen Smith, director of the Duke University Ultrasound Transducer Group and senior member of the research team. “We believe that this is the first proof-of-concept for this approach. Given that we achieved these early results with a rudimentary robot and a basic artificial intelligence program, the technology will advance to the point where robots – without the guidance of the doctor – can someday operate on people.”
The Duke laboratory is a leader in ultrasound technology. It has a long track record of modifying the familiar 2-D ultrasounds we are used to seeing, such as those used to view a fetus, so that they can take 3-D scans. The technique was originally invented in 1991, and since then the team has demonstrated its capabilities by developing specialized catheters and endoscopes that provide real-time imaging of blood vessels in the heart and brain.
It is these advances in ultrasound technology that have made these new robotic advances possible.
In the latest experiment, the robot successfully directed a needle on the end of its robotic arm to touch the tip of another needle within a blood vessel graft. The guiding images came from a tiny 3-D ultrasound transducer attached to a catheter.
“The robot was able to accurately direct needle probes to target needles based on the information sent by the catheter transducer,” said John Whitman, a senior engineering student in Smith’s laboratory and first author on both papers. “The ability of the robot to guide a probe within a vascular graft is a first step toward further testing the system in animal models.”
I mentioned more immediate applications, though: applications that could make modern medical procedures safer for patients. “Currently, cardiologists doing catheter-based procedures use fluoroscopy, which employs radiation, to guide their actions,” Smith said. “Putting a 3-D ultrasound transducer on the end of the catheter could provide clearer images to the physician and greatly reduce the need for patients to be exposed to radiation.”
Earlier experiments conducted by the Duke team saw the tabletop robot successfully touch a needle to another needle in a water bath. It then performed a simulated biopsy of a cyst. “These experiments demonstrated the feasibility of autonomous robots accomplishing simulated tasks under the guidance of 3-D ultrasound, and we believe that it warrants additional study,” Whitman said.
The researchers believe that adding their 3-D capabilities to more powerful and sophisticated surgical robots already employed in hospitals could speed up the day we see ourselves being operated on by autonomous robots.
All hail our robot overlords!
http://www.eurekalert.org/pub_releases/2008-05/du-fst050608.php