
Physicians at Munich's Ludwig Maximilian University Surgical Clinic are guided by a view that combines visible and X-ray images on a monitor while they operate on a fracture.


SCR researcher Kalpit Gajera with a 3D image of the heart. Recently developed software from Siemens fuses electrophysiological data with CT or MR data, helping cardiologists to locate and treat arrhythmias.

Fused Realities

Whether it’s a fracture, an arrhythmia or a tumor, the problem is, how do you treat it if you can’t see it? Now, thanks to new techniques from Siemens that fuse information from different diagnostic modalities, many medical conditions are becoming transparent.


"Without AR, many patients would have to be X-rayed several times before an operation could begin."

Augmented Reality in Medical Systems


Before most operations, doctors examine the affected parts of the body with conventional medical imaging systems. During treatment, they can view images acquired by computed tomography (CT) scanners, magnetic resonance (MR) imaging systems, or other modalities on a monitor. Such systems are now also used to precisely measure and visualize the position of surgical instruments, whose locations within the body can be followed on screen.

This arrangement is not always ideal, however, especially when difficult and complex treatments are required. Images recorded before an operation do not always fully correspond to the actual situation at the time of the procedure. Moreover, such computer-generated images are spatially disconnected from the surgeon's view: when the doctor looks at the monitor, he or she can't see the patient, and looking at the patient means losing the view of the monitor. Augmented reality combines these two levels into a single image, often in 3D, that doctors can view through a stereo-optical device or a semi-transparent mirror, for example. The key is that the virtual image be properly aligned with the patient's anatomy; otherwise the doctor might misplace surgical instruments within the body. Virtual images must therefore be produced in real time during an operation.

Physicians and medical device developers expect to reap many benefits from AR systems. Doctors will be able to focus more closely on surgery instead of dividing their attention between the patient and a monitor. AR also simplifies hand-eye coordination, because doctors view real and virtual instruments from the same angle. And 3D AR images support orientation within the patient's body, enabling doctors to work with greater precision and more easily avoid errors.
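At its core, the alignment problem described above is a matter of mapping pre-operative 3D data into the live camera view, frame after frame. The following Python fragment is a purely illustrative sketch of the basic pinhole-camera step; real AR systems add calibration, tracking, and rendering pipelines far beyond this, and all matrices here are assumed to be known.

```python
import numpy as np

def project_anatomy(points_3d, R, t, K):
    """Map pre-operative 3D anatomy points into the live camera image.

    points_3d : (N, 3) points in patient coordinates
    R, t      : patient-to-camera rotation and translation (from tracking/registration)
    K         : 3x3 camera intrinsics matrix
    Returns (N, 2) pixel coordinates; an AR display would redraw these
    on top of the video frame in real time as R and t change.
    """
    cam = points_3d @ R.T + t        # patient -> camera coordinates
    uv = cam @ K.T                   # apply camera intrinsics
    return uv[:, :2] / uv[:, 2:3]    # perspective division

# Toy example: identity pose, simple made-up intrinsics
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
pts = np.array([[0.0, 0.0, 0.5], [0.01, 0.02, 0.5]])
print(project_anatomy(pts, np.eye(3), np.zeros(3), K))
```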


Although medical imaging devices such as ultrasound scanners and computed tomography and magnetic resonance scanners have become more and more effective in recent years, one hurdle remains to be overcome: technologies are needed that can fuse the data these systems provide, so that doctors and patients can make better sense of huge quantities of information. Dr. Jürgen Simon, head of the Innovation and Cooperation unit at Siemens Healthcare's Strategy department, believes that the development of image fusion (IF) and augmented reality (AR) techniques offers a promising way to achieve this. "With AR, different data sources are merged into a single image, thereby combining the strengths of each system," he says. "We can thus provide doctors and device operators with more information to support their work." Simon describes IF and AR as a "hugely interesting topic" that opens new possibilities in fields such as trauma surgery and the early diagnosis and treatment of cancer and heart disease.
A case in point is a groundbreaking series of orthopedic and trauma procedures taking place at the Ludwig Maximilian University Surgical Clinic in Munich, Germany. There, using technology patented by Siemens Corporate Research (SCR) in Princeton, New Jersey, a modified Siemens C-arm X-ray system suitable for use in operating rooms has been outfitted with a camera-mirror module that produces an optical image of precisely the same area being imaged by X-rays. In practice, the surgeon takes a single X-ray of the area in question, after which the optical image is superimposed on the X-ray image. The surgeon thus not only sees the area of interest on a nearby monitor exactly as it appears to the eye, but also sees the underlying anatomy in perfect registration.
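Because the camera-mirror module guarantees that the optical and X-ray images share the same viewpoint, the superimposition itself can be as simple as blending two already-registered images. Here is a minimal sketch, assuming both images have been rectified to the same pixel grid; the clinical system's rendering is of course more sophisticated than this.

```python
import numpy as np

def superimpose(optical_rgb, xray, alpha=0.5):
    """Blend a grayscale X-ray onto a co-registered color video frame.

    optical_rgb : (H, W, 3) uint8 video frame
    xray        : (H, W) grayscale X-ray on the same pixel grid
    alpha       : opacity of the X-ray layer
    """
    xray_rgb = np.repeat(xray[..., None], 3, axis=2).astype(float)
    fused = (1.0 - alpha) * optical_rgb + alpha * xray_rgb
    return np.clip(fused, 0, 255).astype(np.uint8)
```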
"When a scalpel is placed on the skin in preparation for the initial incision, the underlying bones and fracture lines are clearly visible, making it perfectly clear where the incision should be made," says Prof. Nassir Navab, one of the inventors of the system and Chair for Computer-Aided Medical Procedures & Augmented Reality at the Technical University of Munich (TUM). "Without this innovative technology, many patients would have to be X-rayed several times before an operation could begin," Navab adds. The system also provides support during the operation itself, for example by helping surgeons locate the exact position where a drill should be placed in relation to the anatomical structure below. When screws must be inserted at exactly the right angle, this guidance spares patients and medical personnel unnecessary additional radiation.
Augmented reality is also providing hope for more precise surgical treatment of breast tumors. At Beth Israel Deaconess Medical Center in Boston, Massachusetts, for example, Dr. John V. Frangioni has developed a new optical examination procedure known as FLARE (Fluorescence-Assisted Resection and Exploration) that employs unique medical image fusion and visualization software developed by Siemens. In combination with a near-infrared fluorescent dye, which was recently tested successfully in a clinical study and is now being further optimized, the system can make visible the sentinel lymph nodes that may harbor cancer cells.
The core elements of FLARE, developed by the Frangioni Lab, are two near-infrared (NIR) light sources and cameras that detect fluorescent substances in the body. With two separate NIR fluorescence channels, one could show tissues to be resected, such as tumors, while the other shows tissues to be avoided, such as nerves or blood vessels, thereby helping to prevent damage. These NIR images are merged in false color with images acquired by a color video camera, enabling visualization of the NIR fluorescent dye in an anatomical context.
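The false-color merge can be pictured as adding each invisible NIR channel to the visible video in a contrasting color. The sketch below is an illustration of the principle only; the color assignments (green for resect, red for avoid) are hypothetical choices, not the FLARE system's actual display convention.

```python
import numpy as np

def false_color_merge(video_rgb, nir_resect, nir_avoid, gain=1.0):
    """Overlay two NIR fluorescence channels on a color video frame.

    video_rgb  : (H, W, 3) visible-light frame, values 0..255
    nir_resect : (H, W) channel highlighting tissue to remove (drawn green here)
    nir_avoid  : (H, W) channel highlighting tissue to spare (drawn red here)
    All inputs are assumed co-registered on one pixel grid.
    """
    out = video_rgb.astype(float)
    out[..., 1] += gain * nir_resect   # green overlay
    out[..., 0] += gain * nir_avoid    # red overlay
    return np.clip(out, 0, 255).astype(np.uint8)
```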
Meanwhile, at the National Institutes of Health (NIH) in Bethesda, Maryland, researchers, with support from colleagues at SCR, have developed an image fusion procedure designed to locate life-threatening arrhythmias. Based on X-ray images and magnetic resonance (MR) scans, the system has been used extensively at NIH to superimpose views of the heart's soft-tissue structures derived from MRI onto live X-ray images. Using MR images that depict an arrhythmia-causing scar in the left ventricle, for example, a catheter can be steered to the regions adjacent to the scar that are most likely to benefit from treatment.
Working along similar lines, this time in collaboration with Johns Hopkins University, NIH researchers used an image fusion system in an animal study to provide precise real-time images of the path instruments should take through the jugular vein during minimally invasive procedures targeting the portal vein, which transports blood from the abdominal cavity to the liver. The researchers used a conventional Espree MRI scanner from Siemens to produce high-resolution 3D images of vessels in order to create a "road map" of the portal venous system. They then combined these images with X-ray images from an Axiom Artis system. To ensure that the image data sets from both devices would fuse precisely into a single picture, the two systems were calibrated via markers on a pig's abdomen that were visible in both MR and X-ray images. Physicians used the MR images to segment the portal vein and surrounding structures and superimposed them on the X-ray images while navigating a catheter in the portal vein. Initial results showed that the superimposed MR images allowed faster and more precise entry into the portal vein.
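The marker-based calibration step described here amounts to a classic point-set registration problem: find the rigid transform that maps the marker positions measured in MR space onto the same markers in X-ray space. A standard least-squares solution is the Kabsch (Procrustes) method, sketched below; whether the NIH system uses exactly this algorithm is an assumption on our part.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (Kabsch method) mapping src markers onto dst.

    src, dst : (N, 3) corresponding marker positions, e.g. in MR vs. X-ray space
    Returns rotation R and translation t with dst ≈ src @ R.T + t.
    """
    sc, dc = src.mean(0), dst.mean(0)
    H = (src - sc).T @ (dst - dc)           # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dc - R @ sc
    return R, t
```

Once R and t are known, every segmented MR structure can be transformed into X-ray coordinates and overlaid on the live fluoroscopy image.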
The researchers still have more than enough challenges to overcome before their X-ray-MR fusion technique can be utilized in clinical applications. Says Christine Lorenz from SCR: "The technique functions well when the organs being examined don’t move too much. However, what we really need to make this kind of procedure successful in the abdomen or heart is to develop techniques to compensate for cardiac and respiratory motion — we’re working on it."
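Motion compensation of the kind Lorenz describes is an open research problem, but one elementary ingredient is estimating how far an organ has shifted between successive frames. The sketch below uses phase correlation, a textbook building block, and is a deliberate oversimplification: real cardiac and respiratory compensation is non-rigid, three-dimensional, and gated to ECG and breathing signals.

```python
import numpy as np

def frame_shift(ref, cur):
    """Estimate the integer 2D translation between two frames by phase correlation.

    ref, cur : (H, W) grayscale frames; returns (dy, dx) such that
    cur is approximately ref shifted by (dy, dx). Only recovers a
    global rigid shift, not the deformations real organs undergo.
    """
    F = np.conj(np.fft.fft2(ref)) * np.fft.fft2(cur)
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    # wrap peak coordinates to signed shifts
    return (dy + h // 2) % h - h // 2, (dx + w // 2) % w - w // 2
```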

Ultrasound and CT. Another widely used diagnostic method is the ultrasound echo technique, which provides immediate results with little effort. Its resolution is low, however, and a great deal of experience is required to properly interpret its 2D images. Researchers Ali Kamen, PhD, and Wolfgang Wein from SCR want to overcome these drawbacks by combining ultrasound with CT imaging. "Ultrasound delivers a lot of data, but no 3D information," says Kamen. "We can get that with CT images." He describes how the two modalities can work together: a patient with a liver tumor, for example, would first be scanned using CT. A doctor can use the resulting image to determine how large the tumor is and how much blood is flowing through it, an indicator of how dangerous it is. If this information is not sufficient for a diagnosis, the doctor can inject an ultrasound contrast agent containing microbubbles into the blood vessels. These tiny bubbles reflect sound waves and clearly show how much blood the tumor is being supplied with, thereby simplifying the diagnosis. "The stronger the blood flow in the tumor, the more active and threatening it is," Kamen explains.
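Aligning ultrasound with CT is harder than aligning two X-ray-like images, because the two modalities render the same tissue with unrelated intensities. A widely used similarity measure for such multimodal registration is mutual information, which a registration algorithm maximizes over candidate alignments; the article does not specify SCR's actual method, so the compact version below is for illustration only.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Mutual information between two co-sampled images.

    High values mean the intensity patterns are statistically dependent,
    even if the gray levels themselves are unrelated, which is why this
    metric works across modalities such as ultrasound and CT.
    """
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p = joint / joint.sum()                     # joint intensity distribution
    px = p.sum(axis=1, keepdims=True)           # marginal of image A
    py = p.sum(axis=0, keepdims=True)           # marginal of image B
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px * py)[nz])).sum())
```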
The goal is to allow doctors to merge both applications to produce a single image. This will not only help with diagnoses but also with treatments, such as tumor removal. "Our research project is designed to ensure that equipment operators receive all information at a glance regarding how and where a tumor is, and from which direction it can be resected without damaging important blood vessels," says Kamen. "We’re now testing the system at the Mayo Clinic in Rochester, Minnesota."

Mapping the Heart. Meanwhile, a team at SCR headed by Dr. Frank Sauer is making progress in the treatment of cardiac arrhythmias with the help of AR technology. "We now have realistic 3D depictions of the heart, which enable us to view catheter movements in real time during intra-operative treatment much better than ever before," says Sauer. This development was made possible by an integrated system that combines CT or MR images and electrophysiological heart mapping and visualization (CARTO mapping). The 3D heart images produced with the tomography system are exact and precisely contoured. After these images have been obtained, a doctor navigates an electrophysiological mapping catheter into the heart chamber. Three alternating magnetic fields underneath the operating table are used to pinpoint the position of the catheter sensor tips. These sensors record the spatial anatomy, and their electrical signals create an electro-anatomical map of the heart. The electrophysiological images are then merged with the CT or MR images with the help of software.
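Conceptually, the final merge assigns each point of the CT or MR heart surface the electrical measurement recorded nearest to it by the mapping catheter, once both data sets sit in one coordinate frame. The toy version below illustrates only that last step; the commercial Cartomerge pipeline is certainly more elaborate.

```python
import numpy as np

def paint_activation(mesh_vertices, catheter_points, activation_ms):
    """Color a CT/MR heart surface with electro-anatomical measurements.

    mesh_vertices   : (V, 3) surface points from CT/MR, already registered
    catheter_points : (M, 3) positions where the catheter recorded signals
    activation_ms   : (M,) activation times measured at those positions
    Each vertex inherits the value of its nearest measurement point.
    """
    d = np.linalg.norm(mesh_vertices[:, None, :] - catheter_points[None, :, :], axis=2)
    return activation_ms[d.argmin(axis=1)]
```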
The catheter operator in the electrophysiology lab sees the heart chamber on a monitor in real time, and can thus use the 3D image to maneuver the catheter safely and quickly. SCR and Siemens Healthcare are working closely on this technology with Biosense Webster Inc. (BWI), a subsidiary of Johnson & Johnson. The navigation software has already been marketed by BWI under the name Cartomerge.
Siemens Healthcare in Erlangen, Germany, is also developing AR systems. Recently, for instance, it presented a navigation system targeted at minimally invasive spine operations and trauma surgery. With Cappa C-Nav, doctors compile a 3D X-ray data set of the area to be operated on, allowing them to optimize the planning of implants, for example. During the operation, the doctor navigates using an infrared stereo camera that sees the system's small reflective marker spheres mounted on the surgical instruments. Software processes these signals, allowing the doctor to continuously view the position of the instruments in the patient's 3D image. If necessary, the doctor can, for example, virtually check whether the implant screws are ideally positioned and the right size before inserting them.
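An infrared stereo camera recovers each marker sphere's 3D position by triangulating its two image projections. Below is a minimal linear triangulation (the standard DLT method); the projection matrices P1 and P2 would come from the stereo camera's calibration and are assumed known here, and this is a generic sketch rather than Cappa C-Nav's actual algorithm.

```python
import numpy as np

def triangulate_marker(P1, P2, uv1, uv2):
    """Recover a marker's 3D position from its pixel coordinates in two views.

    P1, P2   : 3x4 projection matrices of the calibrated stereo cameras
    uv1, uv2 : (u, v) pixel coordinates of the same marker in each image
    Uses linear (DLT) triangulation: the true point lies in the null
    space of the stacked constraint matrix A.
    """
    A = np.stack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]   # homogeneous -> Cartesian coordinates
```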
"How quickly such AR systems make their way into hospitals depends on how well image fusion systems can be integrated into the clinical workflow," says Navab. "The key is to deliver the images in real time." Navab, who worked at SCR in Princeton until a few years ago, is convinced that "augmented reality will become widespread in medical technology over the next ten years." It is crucial, however, he emphasizes, that "this development occurs in close cooperation with doctors and hospitals; otherwise they won't accept the new technology."

Rolf Sterbak