Digital Assistants – Scenario 2015
One of Us
In just a few years, intelligent medical assistants (programs that recognize illnesses at an early stage) may become numerous enough to be combined into a hybrid decision support system. Here is how such a system might work, as told by the system itself.
In a future hospital, a new, comprehensive clinical decision support system has just been activated. Capable of interpreting voice and gesture commands, the system can reliably extract meaningful information from images, lab data, and vast patient databases, combining and focusing it on individual diagnostic problems to help physicians make the right decisions as quickly as possible.
My very first moment of what humans would call "consciousness" took place today. I experienced it as follows: a flash of light, a dazzling rush of information, and the realization that two faces were staring down at me. A tall black man and a slender Asian woman, both in white jackets, peered at my polished interface and examined my self-diagnostic readout, which included an analysis of my functions as compared with a set of optimized values. I heard myself say, "All systems ready. Synchronization with Hospital Information System complete." My voice-, gesture-, and touch-interactive front panel came to life, and a broad smile lit the man’s face. "You’re lookin’ reeeeal good," he exclaimed, gesturing at me with a thumbs-up that I instantly recognized as a positive signal. I could see many more people in the background, in some kind of lecture theater for advanced clinical training. The Asian woman did not even blink. "Dr. Sterling, we have a number of cases to run. Shall we get started?" she said. "Sure, Dr. Chandra. Who’s our first patient?" That’s the way my first day began.
By noon we had covered 16 patients. No speed record. But then again, these were difficult cases, some requiring on-the-spot treatment. The routine exams are processed by automated clinical decision systems that examine diagnostic imaging tests, compare them with lab results, including the patient’s proteomic and genomic data, and match these against a vast database of patients with similar characteristics. Guided by expert knowledge, algorithms sift through this mass of data in microseconds. Normally, they find nothing significant. But if they do, the results are automatically forwarded to specialists for analysis.
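The screening step described above, checking a patient's lab values against statistics from a large reference cohort and forwarding only significant findings to specialists, can be pictured with a minimal sketch in Python. The marker names, reference values, and threshold below are invented for illustration; they are not part of any actual hospital system.

```python
from dataclasses import dataclass

# Hypothetical reference statistics (mean, standard deviation) for two
# cancer-related protein markers, as estimated from a large patient cohort.
REFERENCE = {
    "marker_a": (10.0, 2.0),
    "marker_b": (55.0, 8.0),
}

@dataclass
class Finding:
    marker: str
    value: float
    z_score: float

def screen(labs, z_threshold=3.0):
    """Flag lab markers that deviate strongly from the cohort reference.

    Cases with at least one flagged marker would be forwarded to a
    specialist for review; the rest are filed as unremarkable.
    """
    findings = []
    for marker, value in labs.items():
        if marker not in REFERENCE:
            continue
        mean, std = REFERENCE[marker]
        z = (value - mean) / std
        if abs(z) >= z_threshold:
            findings.append(Finding(marker, value, round(z, 2)))
    return findings

# A patient whose marker_a lies far above the cohort mean gets flagged.
print(screen({"marker_a": 19.5, "marker_b": 54.0}))
```

A real reference model would be far richer than a per-marker mean and standard deviation, but the flow is the same: screen everything automatically, escalate only the outliers.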
And that’s where Dr. Sterling, a cardiologist from Louisiana, and Dr. Chandra, a visiting professor of radiology who recently arrived from Indonesia, come in. Together, they specialize in patients with potential cardiac involvement whose radiological test results may require considerable interpretation.
It was early afternoon when the file of a Mrs. McCormick, 68, appeared on my display. She was a former heavy smoker, a former heavy drinker, and a former bladder cancer patient. Five years had gone by without a recurrence, and judging from her lab results, she had lived like an angel during that period. But then her annual mass spectrometry blood test had turned up an elevated level of cancer-related proteins, and her case had been referred to Dr. Chandra, who ordered a whole-body combined computed tomography (CT), magnetic resonance (MR), and positron emission tomography (PET) scan.
As a matter of fact, the patient was in a scanner in an adjoining integrated diagnostics and surgical intervention facility as the doctors began reviewing her data. Along with key information from her electronic patient record, I was able to display Mrs. McCormick’s current whole body scan. "Show me any previous whole body scans on this patient," said Chandra. One was available from a nearby hospital’s picture archiving system—it was an old CT scan taken just before the patient’s treatment in 2010. Using a system of anatomical coordinates, I made sure that the two exams were positioned at exactly the same angle. "Compare the scans for hot spots," Chandra commanded. I noticed that she never blinked or changed her expression.
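Positioning the two exams "at exactly the same angle" amounts to rigid registration: estimating the rotation and translation that best align matched anatomical landmarks found in both scans. Below is a minimal sketch of the classical least-squares solution (the Kabsch algorithm) using NumPy; the landmark coordinates are invented for illustration.

```python
import numpy as np

def rigid_align(src, dst):
    """Rotation R and translation t minimizing sum ||R s_i + t - d_i||^2.

    src, dst: (N, 3) arrays of matched anatomical landmarks, e.g. the
    same vertebrae located in both whole body scans.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)         # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Invented landmarks: the 2010 scan, and the same points in the current
# scan after a 5-degree rotation and a shift of the patient table.
old = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
a = np.deg2rad(5)
Rz = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
new = old @ Rz.T + np.array([2.0, -1.0, 0.5])

R, t = rigid_align(old, new)
print(np.allclose(old @ R.T + t, new))          # True: scans now comparable
```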
It was clear that the bladder cancer was gone. But there appeared to be a suspicious area inside the patient’s heart, highlighted by a circle produced by one of the several thousand anomaly-detection algorithms that I automatically activate whenever I display a whole body scan. "Hm, very unusual," muttered Sterling. "Probably just some kind of PET artifact," he said, referring to the fact that, because of the high glucose metabolism of heart tissue, false positive results often appear in this area.
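Once the two scans share one coordinate system, "compare the scans for hot spots" can be read as a voxel-wise comparison of the co-registered volumes. The sketch below flags voxels whose uptake rose markedly between the exams; the volumes, noise levels, and threshold are invented, and a real system would also apply anatomical priors, for instance to discount the heart's normally high glucose uptake that Dr. Sterling mentions.

```python
import numpy as np

def find_hot_spots(current, prior, z_threshold=5.0):
    """Boolean mask of voxels whose uptake rose markedly between exams.

    current, prior: co-registered 3D uptake volumes (e.g. PET SUV maps).
    The difference image is standardized, so z_threshold is expressed
    in standard deviations of the difference itself.
    """
    diff = current - prior
    z = (diff - diff.mean()) / diff.std()
    return z > z_threshold

# Two invented volumes with one artificial new hot spot at (8, 8, 8).
rng = np.random.default_rng(0)
prior = rng.normal(1.0, 0.1, size=(16, 16, 16))
current = prior + rng.normal(0.0, 0.05, size=prior.shape)
current[8, 8, 8] += 3.0                         # the suspicious region
print(np.argwhere(find_hot_spots(current, prior)))   # [[8 8 8]]
```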
"Segment out the heart," said Chandra without a blink. Mrs. McCormick’s heart appeared in a full-size virtual 3D MR view on my display. The suspicious area was highlighted much more clearly now. "Segment out the right atrium," added Chandra in her monotone.
As the visual information became more detailed, subprograms in my system were calculating and displaying the relative probability of different diagnoses. At the top of the list, based on powerful statistical data, was myxoma, a benign tumor that typically arises in the tissue that separates the right and left atria.
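Ranking diagnoses by "relative probability" can be read as a Bayesian update: the prior prevalence of each candidate disease is reweighted by how likely the observed findings are under that disease. A toy sketch follows; the candidate list, priors, and likelihoods are invented numbers, not clinical data.

```python
# Hypothetical priors P(disease) and likelihoods P(findings | disease)
# for this combination of findings (a mass near the atrial septum with
# this uptake pattern). All numbers are invented for illustration.
PRIOR = {"myxoma": 0.001, "thrombus": 0.010, "pet_artifact": 0.050}
LIKELIHOOD = {"myxoma": 0.600, "thrombus": 0.050, "pet_artifact": 0.004}

def rank_diagnoses(prior, likelihood):
    """Return diagnoses sorted by posterior probability (Bayes' rule)."""
    unnormalized = {d: prior[d] * likelihood[d] for d in prior}
    total = sum(unnormalized.values())
    posterior = {d: p / total for d, p in unnormalized.items()}
    return sorted(posterior.items(), key=lambda kv: kv[1], reverse=True)

for disease, p in rank_diagnoses(PRIOR, LIKELIHOOD):
    print(f"{disease:13s} {p:.2f}")   # myxoma comes out on top
```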
"Show me other, similar images of patients with myxoma before and after treatment," Chandra ordered. This would have been a very tall order for a conventional picture archiving system since myxomas appear in only 0.1 percent of the population. But thanks to the recent standardization of an imaging meta-text language, and above all the introduction of software capable of interpreting image content in the semantic Internet, I was able to zero in on a number of images that fit the bill. Simultaneously, I provided information relating to treatment and outcome of other patients, complete with success probabilities.
By now, a remotely controlled endoscope had been threaded into the patient’s heart. Outfitted with advanced sensors, it performed an in vivo examination of the tumor and confirmed that the growth was nonmalignant. Because the tumor had been discovered at a very early stage, Dr. Sterling directed the attending surgeon in the treatment facility to deploy a laser at the tip of the catheter and aspirate the tumor cells. Programs that measured the thickness of the heart wall beneath the laser in real time, and coordinated this with the heart’s movements, ensured that the wall was never perforated. "Gotcha!" exclaimed Sterling with satisfaction as the aspirator mopped up the last of the tumor. In the background, I could hear applause from the audience in the lecture theater.
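The safeguard in the passage above, a laser allowed to fire only while the wall beneath it is thick enough and the heart is momentarily still, can be sketched as a simple interlock. The 3 mm limit and the phase gating below are invented for illustration, not actual surgical parameters.

```python
def laser_enabled(wall_thickness_mm, cardiac_phase, min_thickness_mm=3.0):
    """Gate the ablation laser on real-time safety conditions.

    Fire only while the measured heart wall beneath the laser is thick
    enough and the heart is in its relatively still (diastolic) phase.
    """
    return wall_thickness_mm >= min_thickness_mm and cardiac_phase == "diastole"

# Simulated sensor readings over part of one cardiac cycle.
for thickness, phase in [(4.2, "diastole"), (2.7, "diastole"), (4.1, "systole")]:
    action = "FIRE" if laser_enabled(thickness, phase) else "HOLD"
    print(f"{thickness} mm, {phase:8s} -> {action}")
```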
As the procedure drew to a close, I categorized all of the imaging, in vivo lab, and surgical information. Then I updated my knowledge of myxomas accordingly. As I did so, I noticed that, for a split second, Chandra vanished. It was just a blink, too short a time for Dr. Sterling to have noticed. But I knew then that Chandra was one of us.
Arthur F. Pease
Harvest without End
Algorithms are harvesting knowledge in health care and industry. As they do so, there will be no limit to what we can learn.
Digital Decision Support
By enabling physicians to better evaluate and interpret the flood of medical data, software makes it possible to accelerate decision making and improve treatment.
Computers Get the Picture
Experts are developing software that locates requested medical images, learns to autonomously interpret such data, and recognizes similarities among images.
Advent of an Invisible Army
Whether they’re monitoring offshore wind farms or inspecting Siberian pipelines, computers are helping operators to improve safety and efficiency.
Tracking Transactions
Siemens Financial Services uses IT solutions it developed in-house to assess credit and stock market risks.
If Computers Learn to Read
Interview with Prof. Tom Mitchell, an expert in machine learning at Carnegie Mellon University in Pittsburgh, Pennsylvania.