
SIEMENS

Research & Development
Technology Press and Innovation Communications

Dr. Ulrich Eberl
  • Wittelsbacherplatz 2
  • 80333 Munich
  • Germany
Florian Martini
  • Wittelsbacherplatz 2
  • 80333 Munich
  • Germany

Thanks to modified game console technology,
surgeons will soon be able to manipulate images by just moving their hands.

Tests at Siemens Healthcare in Forchheim, Germany


In Good Hands

Surgeons will soon be using hand gestures to scroll through or even modify digital images during surgery. This will enable them to access images and files faster without compromising sterile procedures. What makes it all possible is the latest technology from video game consoles.

New technology uses infrared light to convert a surgeon’s hand movements into commands.

When Dr. Thomas Friese of Siemens Healthcare stands in front of a monitor in his laboratory in Erlangen and makes rotary movements with his extended hands, the scene brings to mind the character played by Tom Cruise in the movie Minority Report. On the display in front of him, a 3D model of a thorax starts to rotate. As soon as Friese stops moving his hands, the image stops rotating as well. Although he is standing two meters from the monitor, he changes the field size of the image by carefully moving his right hand from left to right in the air. Dr. Friese scrolls through a series of images, much as he would on his smartphone, with a swiping movement of his right hand. Touch-free gesture control of this kind is intended to enable surgeons in the future to select radiological images in the operating room, or to change the way they are displayed, without touching a monitor or having to leave the operating table.

“Some of the greatest changes in medical technology these days are happening in surgical practice,” says Michael Martens, who is in charge of business development in surgery at Siemens Healthcare syngo. Martens is referring in particular to the substantial increase in minimally invasive procedures, which are characterized by small incisions and are easy on the patient. While the surgeon performing a “conventional” operation actually sees the relevant organs or bones once he or she has made the incision, the minimally invasive approach results in a loss of information, which medical imaging helps to offset. To prepare for an operation, surgeons therefore not only review the radiologist’s findings but also start by viewing and discussing the actual medical scan images from a prior examination.

Surgeons also like to refer to these images during an operation. Convenient access to this information improves common surgical procedures as well as management of possible complications. As a result, the number of operating rooms providing this type of display in the immediate vicinity of the surgeon’s position is steadily growing. But this also creates a problem: The surgeon has to refrain as much as possible from touching any objects or devices other than the patient and the surgical instruments, in order to eliminate conceivable risks of infection. To rule out every risk of infection, however, the surgeon would have to change all of his or her garments after contact with any surgically irrelevant device, such as an out-of-the-way touch screen, and that would greatly prolong the patient’s time in surgery and under anesthesia.

Xbox Technology in the OR. Touch-free manipulation of displays by means of voice control isn’t practical for most of the complex interactions that are required with medical images. Such an approach would require another team member to be on hand to speak the voice commands for the surgeon. This person would have to be in the operating room, thus potentially increasing the presence of germs, while also driving up costs. What’s more, in this scenario the surgeon wouldn’t be able to manipulate the images intuitively, but only by proxy.

Now, however, experts at Siemens Healthcare have found a solution in an unlikely place: the video games industry. “When Microsoft introduced its Kinect technology, we immediately recognized the potential of gesture recognition for surgery,” says Friese. Kinect is a motion-sensing add-on for Microsoft’s Xbox 360 game console; its technology can recognize and interpret the motions of players.

At the heart of Kinect is PrimeSensor technology from PrimeSense, a company based in Tel Aviv, Israel. In this system, an infrared light source projects an invisible infrared point pattern into the room. Any persons or objects in this space distort the point pattern. An infrared sensor measures the distorted point pattern; software then compares the measured values with an internal reference pattern and computes each point’s distance from the light source, known as its “depth value.” Kinect also includes a video camera that records a color image of the room. A depth value is assigned to each relevant point in the video image. This enables the system to compute a three-dimensional point cloud, which represents the spatial structure of the imaged space. The system uses probability models to distinguish individual persons within this point cloud. Immobile points don’t represent people and are ignored by the system.
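The last step of this pipeline, turning per-pixel depth values into a 3D point cloud, can be sketched in a few lines. The snippet below is an illustrative back-projection under the standard pinhole camera model; the focal length and principal point used here are made-up values, not PrimeSense's actual calibration or algorithm.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into a 3D point cloud.

    fx, fy: focal lengths in pixels; cx, cy: principal point.
    Returns an (N, 3) array of XYZ points in camera coordinates,
    with pixels that have no depth reading (z == 0) dropped.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx  # pinhole model: X = (u - cx) * Z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]

# Toy 2x2 depth image with one missing reading, hypothetical intrinsics
depth = np.array([[1.0, 2.0],
                  [0.0, 1.5]])
cloud = depth_to_point_cloud(depth, fx=580.0, fy=580.0, cx=0.5, cy=0.5)
print(cloud.shape)  # (3, 3): the zero-depth pixel is discarded
```

In a real system the point cloud would then be segmented and tracked frame by frame, as the article describes.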

The Xbox 360’s motion recognition software is designed to identify rapid body movements by players — not the slow but precise hand movements that a surgeon would use to manipulate an image on a display. “With this in mind, we reprogrammed the measuring technique and enhanced the system’s precision,” reports Dr. Georg von Wichert, who researches the control of intelligent systems at Siemens Corporate Technology.
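Siemens has not published how the measuring technique was reprogrammed, but a common way to favor slow, precise motion over fast gaming swings is to low-pass filter the tracked hand positions. The snippet below is a generic exponential-smoothing sketch under that assumption, not the actual modified Kinect pipeline.

```python
def smooth_track(positions, alpha=0.3):
    """Exponentially smooth a sequence of noisy 2D hand positions.

    A small alpha damps sensor jitter, which matters more for the slow,
    precise motions of a surgeon than for the fast swings of a gamer.
    """
    smoothed = [positions[0]]
    for p in positions[1:]:
        prev = smoothed[-1]
        smoothed.append(tuple(alpha * c + (1 - alpha) * q
                              for c, q in zip(p, prev)))
    return smoothed

# Jittery raw track (meters); the smoothed track moves far less
track = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1)]
print(smooth_track(track)[-1])
```

The trade-off is latency: the lower the alpha, the steadier the cursor but the more it lags the hand.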

In order to recognize gestures, the system must first identify a user’s hand as a part of the point cloud. The system selects a sensitive zone between the monitor and the user. “We start with the assumption that if something reaches out of the point cloud into this zone it’s an extremity, such as an arm,” explains von Wichert. The software then computes where the hand at the end of that extremity is located, and to what place on the display it is pointing. Next, the system measures the width of the hand. During a gesture such as spreading the fingers, the moving object is virtually “grasped” on the display. Then a simple hand movement is all that’s needed to scroll through a stack of images. To change the size of the imaged field, the surgeon moves both hands away from each other.
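The zone-and-gesture logic described above can be sketched as follows. This is a minimal illustration with hypothetical thresholds and coordinates in meters; the real prototype's classification logic is not public.

```python
def points_in_zone(points, z_min=0.5, z_max=1.5):
    """Keep only cloud points whose depth lies in the sensitive zone
    between the monitor and the user; anything reaching into this zone
    is assumed to be an extremity such as an arm."""
    return [p for p in points if z_min <= p[2] <= z_max]

def classify_swipe(hand_xs, threshold=0.15):
    """Classify the net horizontal travel of a tracked hand as a swipe."""
    dx = hand_xs[-1] - hand_xs[0]
    if dx > threshold:
        return "swipe_right"  # e.g. scroll forward through the image stack
    if dx < -threshold:
        return "swipe_left"
    return "none"

def zoom_factor(left_x0, right_x0, left_x1, right_x1):
    """Two-handed zoom: ratio of hand separation after vs. before."""
    return (right_x1 - left_x1) / (right_x0 - left_x0)

cloud = [(0.0, 0.0, 0.3), (0.1, 0.0, 1.0), (0.2, 0.1, 2.0)]
print(len(points_in_zone(cloud)))          # only the point at depth 1.0 survives
print(classify_swipe([0.00, 0.08, 0.22]))  # net travel 0.22 m
print(zoom_factor(-0.2, 0.2, -0.4, 0.4))   # hands moved apart
```

Restricting classification to the sensitive zone is also what lets the system ignore a nurse moving around just outside it.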

It took only four months for Friese and von Wichert to complete their initial prototype. They received support from Microsoft, which had made the interfaces from Kinect to its Windows operating system available to them. This made it possible for the researchers to make use of the data from the software’s person-recognition function for their calculations. “We are really delighted with how predictably and reliably the resulting system operates,” Martens says.

The system focuses on the zone in which it recognizes hand gestures, and ignores movements that take place outside of that space. Thus the system is not confused by a nurse moving around in the immediate vicinity of a surgeon while handing him or her an instrument.

Plans call for the prototype to migrate from the lab into the operating room in the near future. In fact, surgeons at two European hospitals, one in Spain and one in Amsterdam, are planning to test the system in late 2011 under semi-realistic conditions, though of course not yet on patients.

As a next step, the Siemens engineers are planning to program a gesture control system that would enable the surgeon to virtually grasp and move an object on the monitor, and then release it again. This would enable the user to manipulate 3D images much more intuitively than would be possible with a conventional mouse control. The completed system is also expected to interface with other hospital systems, including image archives and electronic patient records. While performing an operation, a surgeon could look up the patient’s blood values quickly, for instance. “The system gives the surgeon ready access to a whole universe of useful information, and that means it can help to improve outcome,” says Martens.

Michael Lang