Automated systems will soon generate more data than all human users combined. With its growing focus on machine collective intelligence, Siemens is systematically refining the data from its own systems into actionable knowledge. The real challenge, however, is figuring out how to turn that knowledge into profitable information technology businesses.
Sometime around mid-2010, our civilization streaked past an invisible yet astonishing milestone. For the first time, the totality of our digital information surpassed one zettabyte — one trillion gigabytes. And according to a study conducted by market research and forecasting firm IDC, that's just the beginning. By 2020, the study predicts, "our digital universe will be 44 times as big as it was in 2009."
Much of this expanding universe is visible. We see it every day in the firmament of social network invitations, company intranets such as Siemens' TechnoWeb, emails, instant messages, documents, high-resolution pictures, and downloadable videos (see articles "New Models for Human-Machine Collaboration", "Enterprise 2.0", and "Melding Soft Data and Machine Intelligence"). What is far less obvious is the explosive growth in machine-generated data, which is being driven by the steadily diminishing cost and steadily increasing power of computing and sensing (see article "Instant Communities"), and by advances in miniaturization, wireless communication, data storage, decentralized intelligence, and algorithms.
Major sources of machine-generated data include everything from satellite telemetry and GPS streams to the digital output of factories, air traffic management systems, hospitals, and energy, security, financial, and web-use databases. “The data intensity of these and other sources is expanding at such a rapid rate,” says Mathaeus Dejori, who heads a special project on collective intelligence at Siemens Corporate Technology (CT) in Princeton, New Jersey, “that in five years the amount of data generated by machines will outpace the amount of data generated by all human users.”
Value Shift. Why is this important for Siemens? “A fundamental value shift is underway,” says Gerhard Kress, who is a key player in a strategic Siemens project based in Munich that is charged with re-evaluating the company’s position in terms of its implementation of information, communication and software technologies. “Hardware is becoming generic. Software — standalone as well as the embedded software that is an integral part of almost every Siemens product from building management systems to medical scanners — has become the differentiating factor. And in-depth knowledge of complex applications, be it the operation of a steel plant, a power plant, a hospital, or a traffic management system, is what will drive that software and keep Siemens ahead of its competitors.”
Indeed, if the company can zero in on how to harness much of the data that its businesses routinely generate — not just process it, but mine actionable information from it, which is the essence of collective intelligence — it may be able to develop a virtually limitless pipeline of new services that can make its customers’ businesses increasingly successful.
One area in which Siemens is already working along these lines is its Fossil Power plant business, which tracks some 2,500 parameters on each of its 9,000 customer gas turbines around the world (see article "Turning Many into One"). Known as “Fleet Intelligence,” this massive effort not only tracks each turbine’s vital signs, but aggregates data across its life-cycle, from design and operations to sales, marketing and competitive information, to distill knowledge that can help each customer — even when rare problems crop up. “The result of this effort,” says Dejori, “is the ability to identify, respond to, and even predict events more rapidly and accurately.”
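The kind of fleet-wide comparison Dejori describes can be sketched in miniature: compare one turbine's vital signs against the rest of the fleet and flag outliers early. The sketch below is purely illustrative — the turbine names, readings, and threshold are invented assumptions, not Siemens data or code — and uses a robust median-based outlier test so a single faulty unit does not distort the baseline.

```python
import statistics

def flag_outliers(fleet_readings, threshold=3.5):
    """Flag turbines whose reading deviates strongly from the fleet
    median, using the modified z-score (0.6745 * deviation / MAD).
    Illustrative only; real fleet analytics track thousands of
    parameters per turbine, not one."""
    med = statistics.median(fleet_readings.values())
    # Median absolute deviation: robust against the outliers themselves
    mad = statistics.median(abs(v - med) for v in fleet_readings.values())
    if mad == 0:
        return {}
    return {turbine: value for turbine, value in fleet_readings.items()
            if 0.6745 * abs(value - med) / mad > threshold}

# Made-up exhaust-temperature readings (deg C) for a handful of turbines
readings = {"GT-001": 545.2, "GT-002": 546.1, "GT-003": 544.8,
            "GT-004": 545.5, "GT-005": 612.0, "GT-006": 545.0}
print(flag_outliers(readings))  # GT-005 stands apart from the fleet
```

The median-based test matters here: with only a handful of units, an ordinary mean-and-standard-deviation test would be dragged toward the anomaly and could fail to flag it.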
And that knowledge is growing in ways that sometimes surprise even the experts. With regard to Siemens’ 375-MW gas turbine in Irsching, Germany, for instance, which, in combination with a steam turbine, is expected to achieve a world record 60+ percent efficiency, learning algorithms are helping to maximize the system’s output. The algorithms achieve this by not only analyzing thousands of parameter interactions and variables per second, but by modeling what happens between those measurements. “This constitutes a new strategy that no one has deployed before,” says Prof. Dr. Thomas Runkler, who heads CT’s Munich-based Intelligent Systems and Control Global Technology Field (GTF). “Our algorithms actually simulate this dynamic behavior, and thus the entire system dynamics.” Using the resulting models, algorithms autonomously determine how to optimize control of the system. “Here,” explains Runkler, “the system explores the data, learns which parts of the solution space are promising, and then develops an optimized control strategy. Considering this, it is conceivable that the system will learn enough to boost the turbine’s efficiency even more over time.”
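The strategy Runkler outlines — learn a model of the system's behavior from measured data, then optimize the control setting against that model rather than against the plant itself — can be illustrated with a deliberately tiny surrogate-model example. Everything below is an invented toy (a single "fuel-valve setting", made-up output figures, a quadratic fit); real turbine models span thousands of interacting parameters.

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a*x^2 + b*x + c via the normal equations,
    solved with Cramer's rule for the 3x3 system."""
    S = [sum(x ** k for x in xs) for k in range(5)]          # power sums
    T = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[S[4], S[3], S[2]],
         [S[3], S[2], S[1]],
         [S[2], S[1], S[0]]]
    rhs = [T[2], T[1], T[0]]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    D = det3(A)
    coeffs = []
    for i in range(3):                 # replace column i with rhs
        M = [row[:] for row in A]
        for r in range(3):
            M[r][i] = rhs[r]
        coeffs.append(det3(M) / D)
    return coeffs                      # [a, b, c]

# Toy operating data: valve setting vs. measured output (made up)
settings = [2, 4, 6, 8, 10]
outputs = [35, 51, 59, 59, 51]

a, b, c = fit_quadratic(settings, outputs)
best_setting = -b / (2 * a)   # vertex of the fitted parabola (a < 0)
print(round(best_setting, 2))
```

The point of the sketch is the two-step pattern: the model is learned from data the machine has already produced, and the optimization then explores the model, not the running plant — which is what makes it safe to search aggressively for better control strategies.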
The results of this learning process have not been lost on Siemens' other turbines. Thanks to the company's common Remote Service Platform (cRSP), Siemens has institutionalized a remarkably efficient knowledge acquisition process. Developed with input from CT, the platform allows highly secure data exchanges between customer sites and Siemens' remote service centers. "Today, every major machine from Siemens is connected to a business-specific segment of this system," says Volker Ganz, who heads CT's Munich-based Product and Service Innovation GTF as well as a strategic program called 'Leverage Service@Siemens.' The program, which involves all major Siemens service organizations, is designed to accelerate business innovation by transforming data into information and thus increasing its potential business value. "In this context, cRSP has become an important differentiator for us. Indeed, it has become a business-critical backbone for the entire company, as it connects over 135,000 systems, representing a collective monthly data volume exceeding four terabytes," says Ganz.
Similarly, in the manufacturing area, CT has developed market-based software agent technologies designed to intelligently and automatically manage the complexity of enormous amounts of highly heterogeneous process data exchanged between suppliers and manufacturers in automotive supply networks (see article "Wheeler-Dealer Agents"). In much the same way that software within a turbine processes large amounts of information to discover ways of improving efficiency, auto industry software agents will soon collaborate in large-scale virtual markets to optimize the entire planning, order management and delivery processes, thus enhancing vehicle personalization and accelerating vehicle delivery to the customer.
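The market mechanism behind such agents can be sketched in a few lines: supplier agents submit bids for an order, and the order is awarded to the cheapest agent that can actually fill it. The class names, costs, and capacities below are invented for illustration — real market-based agent systems negotiate over far richer terms such as delivery dates, lot sizes, and penalties.

```python
class SupplierAgent:
    """Toy supplier agent: bids its own cost for an order it can fill.
    Names, costs, and capacities are illustrative assumptions."""
    def __init__(self, name, unit_cost, capacity):
        self.name = name
        self.unit_cost = unit_cost
        self.capacity = capacity

    def bid(self, order_qty):
        if order_qty > self.capacity:
            return None                  # cannot fill the order
        return self.unit_cost * order_qty

def run_auction(agents, order_qty):
    """Collect bids and award the order to the lowest valid bidder."""
    bids = {a.name: a.bid(order_qty) for a in agents}
    valid = {name: price for name, price in bids.items() if price is not None}
    winner = min(valid, key=valid.get)
    return winner, valid[winner]

agents = [SupplierAgent("A", 4.0, 500),
          SupplierAgent("B", 3.5, 200),
          SupplierAgent("C", 5.0, 1000)]
print(run_auction(agents, 300))   # A wins; B is cheaper but lacks capacity
```

Even this toy version shows the appeal of the approach: no central planner needs a global view of every supplier's costs and constraints — the market mechanism aggregates that local knowledge automatically, which is what lets such systems scale to large, heterogeneous supply networks.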