
Pictures of the Future
The Magazine for Research and Innovation
 

Autonomous Systems

A Glove that's Worth a Thousand Words

Siemens researcher Dr. Yue Zhuo uses a data glove to help a robot grasp an apple. In the future, robots will not only be able to manufacture complex products, but will also be flexibly adaptable to individual requirements for small batches, and above all will be able to work in direct cooperation with humans.

In China, a team of Siemens researchers is exploring the fundamental functions that will enable tomorrow’s human-robot interactions in industrial environments. A key part of this effort is the development of a so-called data glove that can be used to capture and transfer the movements, gestures and pressure levels of human hands, and thus describe complex commands and the safe handling of a wide variety of objects to robots.

Ever since they first appeared back in 1952, industrial robots have been portrayed as being large, heavy, static, dangerous to humans, and capable of performing only repetitive, pre-programmed tasks. Although these characteristics generally still hold true, ongoing developments in microelectronics, sensors, networking and algorithms are setting the stage for a new era of semiautonomous robots.

Major changes are also being propelled by the convergence of two seemingly contradictory trends: growing product complexity and increasing demand for individualized solutions. The result has been a new focus on the concept of flexible manufacturing characterized by lightweight industrial robots that can be easily and economically redeployed and which, above all, will be capable of safely working side-by-side with humans. As these new features evolve, they will open the door to the kinds of short-cycle, small-batch production envisioned in Germany’s concept of “Industrie 4.0” – otherwise known as the Industrial Internet – as well as marking the first steps on the evolutionary path toward autonomous assistants.

When robotic arms and hands are trained to perform certain tasks, they must learn how much pressure they can exert on objects to ensure an accurate grasp and optimum movement sequences.

“Talking” with a Glove

That path is now the subject of considerable research at Siemens. In China, for instance, as part of Siemens Corporate Technology’s (CT) “Autonomous Systems Revolution” project, a team of researchers is exploring the fundamental functions that will enable tomorrow’s human-robot interactions in industrial environments. At first glance, natural language processing may appear to be a promising candidate for facilitating such interactions since it is not only highly efficient, but also a mature technology (e.g., Siri from Apple Inc.). However, it is by no means ideal when implemented in noisy factory settings. In view of this, CT researchers are developing a so-called data glove that can be used to capture the movements and gestures of human hands and thus describe complex commands to a robot.

The data glove is now in Version 2.0. Outfitted with flex sensors and inertial sensors and benefitting from biomechanical modeling, it can sense the movements of a user’s individual fingers and arm, then transfer the associated position and attitude data to a “trainee” robotic arm and hand for remote control or movement optimization. Leveraging the power of wireless technology, inertial sensors can also be mounted on the legs, chest and other parts of the body to capture a user’s related motions, thus making it easy to configure a biomechanical model.
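The core of such a glove-to-robot pipeline is mapping raw sensor readings to joint targets. The following is a minimal illustrative sketch, not the Siemens implementation: the sensor range, maximum flexion angle, and function names are all assumptions chosen for the example.

```python
import math

# Hypothetical calibration: a flex-sensor reading (raw ADC counts)
# is mapped linearly onto a finger joint angle in radians.
FLEX_MIN, FLEX_MAX = 120, 900          # assumed raw sensor range
ANGLE_MAX = math.radians(90)           # assumed maximum finger flexion

def flex_to_angle(raw):
    """Map one flex-sensor reading to a joint angle, clamped to [0, ANGLE_MAX]."""
    ratio = (raw - FLEX_MIN) / (FLEX_MAX - FLEX_MIN)
    return max(0.0, min(1.0, ratio)) * ANGLE_MAX

def glove_frame_to_command(flex_readings):
    """Convert one frame of glove data into per-finger joint-angle targets
    that could then be streamed to the 'trainee' robotic hand."""
    return [flex_to_angle(r) for r in flex_readings]

# One simulated frame for five fingers:
print(glove_frame_to_command([120, 510, 900, 300, 700]))
```

In a real system the same idea extends to the inertial sensors: each frame of orientation data is fused into the biomechanical model before being forwarded to the robot controller.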

Voice recognition has proven to be poor in loud production halls. CT is therefore developing a data glove, which helps to train robots by transferring hand movements and gestures to them.

As robotic arms and hands are trained to perform tasks, they will need to learn how much pressure to apply to a wide variety of objects in order to optimally grasp and move them. Here too, data gloves are set to perform essential tasks. Our prototype glove, for instance, is outfitted with a force feedback function that closes the loop of interaction between human and robot. Thanks to microcontroller-driven piezoelectric ceramic components on its fingertips, the glove can generate vibrations at different frequencies and amplitudes that correspond to the correct level of pressure to apply when handling a given object. Once exposed to this data, a robot hand can apply the correct pressure thanks to the real-time feedback provided by its own pressure sensors.
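Both sides of that loop can be sketched in a few lines. The frequency band, amplitude scale, and control gain below are illustrative assumptions, not the prototype's actual parameters: one function encodes a target grip pressure as a vibration signal for the glove's fingertips, the other shows a simple proportional step the robot could take toward that pressure using its own sensor reading as feedback.

```python
def pressure_to_vibration(target_pressure, max_pressure=10.0):
    """Encode a target grip pressure (N) as a (frequency, amplitude) pair
    for the fingertip actuators. Band and scale are assumed values."""
    level = min(target_pressure, max_pressure) / max_pressure
    freq_hz = 50 + 250 * level        # assumed 50-300 Hz band
    amplitude = level                 # normalized 0..1
    return freq_hz, amplitude

def grip_step(current_pressure, target_pressure, gain=0.5):
    """One proportional control step toward the target pressure,
    driven by the gripper's own pressure-sensor reading."""
    return current_pressure + gain * (target_pressure - current_pressure)

# Simulate the gripper converging on a 4 N target:
p = 0.0
for _ in range(10):
    p = grip_step(p, 4.0)
print(p)
```

After a handful of steps the simulated grip pressure settles close to the target, which is the behavior the real-time sensor feedback is meant to guarantee.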

Data gloves can also be applied to record the movements of human workers, to instruct new workers, and to detect and correct erroneous operations. As this technology evolves, workers in digital factories will be able to learn more about the use of data gloves by using them in  augmented reality scenes.

Pass the Wrench Please!

In tomorrow’s highly flexible manufacturing environments robots will have no time for extended training sessions. Like human workers, they will be expected to learn new tasks on the spot. As a first step in this direction, CT China researchers have developed algorithms that allow robot assistants to perform one indispensable task: grasping unknown objects. Although self-evident for humans, this is a challenging task for a robot, which must autonomously determine the correct grasping gesture and points of contact with the target object. Perhaps surprisingly, a solution has been developed based on Kinect, a Microsoft device used in the Xbox game console.

Kinect can simultaneously capture depth and red-green-blue (RGB) data from an image. The former is used to determine the grasping gesture, while the latter is used to calculate grasping points. Based on this combination of data, the target object is separated from its surrounding imagery and fed into a previously acquired model based on extreme learning. This process in turn generates a group of optimized grasping points. As mentioned above, when grasping an object, careful attention must be paid to the level of force applied. On the one hand, the force should be sufficient to ensure a firm grasp; on the other, it should be small enough to prevent damage. Today, this is done by setting a predefined force level, which is monitored by pressure sensors on the robot’s gripper. This is, however, set to change, since a self-adaptive force control mechanism is under development.
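One plausible shape for such a self-adaptive mechanism is a slip-driven search: grip lightly, then increase force in small steps until the object stops slipping or a safety limit is reached. The sketch below is an assumption about how this could work, not a description of the mechanism under development at Siemens; `slip_detected` stands in for a real slip sensor.

```python
def adaptive_grip_force(slip_detected, f_start=0.5, f_step=0.2, f_max=8.0):
    """Self-adaptive force strategy (illustrative): begin with a light grip
    and raise the force stepwise until slipping stops, never exceeding
    the safety limit f_max. All force values are in newtons (assumed)."""
    force = f_start
    while slip_detected(force) and force + f_step <= f_max:
        force += f_step
    return force

# Toy slip model: this object needs at least 2.0 N to hold securely.
print(adaptive_grip_force(lambda f: f < 2.0))
```

The appeal of this kind of scheme is that no predefined force level is needed: the gripper discovers the minimum sufficient force for each object, satisfying both the firm-grasp and the no-damage requirement at once.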

Impedance-controlled robot assistants will open a new horizon in training – one in which robot programming and learning take place simply via demonstrations.

Safety First

If, as expected, robot assistants are to work side-by-side with humans, safety will be a critical concern. With this in mind, CT China researchers are now developing a new control mechanism that will simultaneously manage the position and the force of a robot arm. Given a maximum force limit clearly defined by its human associate, such a robot will extrapolate optimized movements, thus making it a safe and autonomous assistant while maintaining a high level of accuracy. A typical application will be polishing workpieces with a constant level of force.
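The essential safety idea – a spring-like pull toward the target position whose commanded force can never exceed a human-set cap – can be illustrated in one function. The gain and limit below are invented values for the sketch; the actual Siemens controller is not public.

```python
def hybrid_control_step(x, x_target, k_p=200.0, f_max=15.0):
    """One step of a simplified position controller with a hard force cap.
    The commanded force is proportional to the position error (a spring-like
    pull toward x_target), then clamped to [-f_max, f_max] so the arm can
    never push harder than the human-defined limit. Units are assumed
    (meters, N/m, newtons)."""
    f_cmd = k_p * (x_target - x)
    return max(-f_max, min(f_max, f_cmd))

# Far from the target the commanded force saturates at the safety limit;
# close to it, only a small corrective force remains:
print(hybrid_control_step(x=0.0, x_target=0.5))
print(hybrid_control_step(x=0.49, x_target=0.5))
```

This saturation is what makes contact tasks like constant-force polishing safe: however large the position error, the force the arm applies to the workpiece (or to a nearby human) stays bounded.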

One of the most exciting applications of this new feature is robot programming. Traditionally, robot programming has been a specialized, time-consuming and expensive process. It requires professional training and plenty of practice in the field. But these new capabilities open up a new horizon in training – one in which robot programming and learning take place simply via demonstrations. In essence, all that’s necessary is a kind of drag-and-drop process in which a robot arm is maneuvered through a series of “teach points” at which it learns specific positions and gestures. This process allows a robot assistant to learn an entire procedure and repeat it accurately. All in all, this amounts to an easy and flexible solution to rapidly changing orders and demands.
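At its core, programming by demonstration is record-and-replay: each guided pose becomes a stored teach point, and playback reproduces the sequence. The class below is a hypothetical minimal sketch of that idea; the names and data layout are assumptions, not any vendor's API.

```python
class TeachPointRecorder:
    """Minimal sketch of programming-by-demonstration: as the human guides
    the arm, each pose is stored as a teach point; replaying them later
    yields the same sequence for the robot controller to execute."""

    def __init__(self):
        self.points = []

    def record(self, joint_angles, gripper_closed):
        """Store one teach point: a joint-angle tuple plus gripper state."""
        self.points.append((tuple(joint_angles), gripper_closed))

    def replay(self):
        """Yield the recorded poses in order (would be streamed to the robot)."""
        for pose, grip in self.points:
            yield pose, grip

rec = TeachPointRecorder()
rec.record([0.0, 0.5, 1.0], gripper_closed=False)  # approach pose
rec.record([0.1, 0.6, 1.1], gripper_closed=True)   # grasp pose
print(list(rec.replay()))
```

A production system would interpolate smoothly between teach points rather than jumping from one to the next, but the drag-and-drop workflow the researchers describe reduces to exactly this record-then-replay structure.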

Dr. Yue Zhuo