By Jeannette M. Wing, Corporate Vice President, Microsoft Research
“Self-driving cars. Robots at work, at home, at play. Drones for package delivery and pathogen surveillance. Energy-efficient, earthquake-proof buildings. Jet engines monitored and controlled by networks of sensors and actuators. Embedded medical devices. Unobtrusive assistive technology. What is common to these systems? They have a computational core that interacts with the physical world. These cyber-physical systems require tight conjoining and coordination between the computational (discrete) and the physical (continuous) worlds. Cyber-physical systems are penetrating every aspect of our lives, with profound impact on sectors including aerospace, agriculture, automotive, chemical production, civil infrastructure, energy, entertainment, finance, healthcare, manufacturing, materials, retail, and transportation.
In the future, cyber-physical systems will rely less and less on human control and more and more on the intelligence embodied in the computational core. In some cases, such as an automated brake system in a car, this computational core may be able to detect and respond faster than a human; in some cases, such as robotic surgery, this computational core can be more precise than a human and not prone to fatigue; and in some cases, such as a minefield, an icefield, or a volcano, we would rather risk the expense of a machine over the life of a human. In all cases, it will be the software that provides much of the intelligence of the computational core.
Our daily lives will depend more and more on these systems. Our lives, our money, our welfare. How can we design cyber-physical systems that we can bet our lives on?”
—From “Cyber-Physical Systems,” by Jeannette M. Wing, Computing Research News, Vol. 20, No. 1, January 2008.
To ensure cyber-physical systems are safe, we need to address two fundamental scientific challenges. First, we need to reason about the discrete and continuous at the same time. Fortunately, much progress in formal verification has been made in the past 20 years on this front. One approach is to model a cyber-physical system as a hybrid automaton, which is a finite state machine where each state’s behavior is defined by a set of differential equations over continuous variables. Model checking technology can be applied to hybrid automata, making it feasible to prove properties about and find bugs in models of cyber-physical systems.
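To make this concrete, here is a minimal, purely illustrative sketch (not from the article) of a hybrid automaton: a two-mode thermostat whose continuous temperature variable evolves under a different differential equation in each discrete mode, with guards triggering the mode switches. The names, dynamics, and thresholds are invented for illustration; a model checker would explore such behaviors exhaustively and symbolically rather than simulating a single run as this sketch does.

```python
# Illustrative sketch only: a two-mode thermostat modeled as a hybrid
# automaton. Each discrete mode has its own differential equation for the
# continuous temperature variable; guards on that variable switch the mode.

def simulate_thermostat(temp=20.0, mode="HEAT", dt=0.01, horizon=10.0):
    """Euler-integrate dT/dt per mode; switch modes at guard thresholds."""
    trace = []
    t = 0.0
    while t < horizon:
        # Continuous flow: each mode defines its own dynamics.
        if mode == "HEAT":
            dtemp = 5.0 - 0.1 * temp      # heater on: temperature rises
        else:  # mode == "COOL"
            dtemp = -0.1 * temp           # heater off: temperature decays
        temp += dtemp * dt
        t += dt
        # Discrete transitions: guards on the continuous state.
        if mode == "HEAT" and temp >= 22.0:
            mode = "COOL"
        elif mode == "COOL" and temp <= 18.0:
            mode = "HEAT"
        trace.append((round(t, 2), mode, round(temp, 3)))
    return trace

# A model checker would verify a property over all runs; here we just
# sample one trajectory and check a simple safety property on it.
assert all(17.0 < temp < 23.0 for _, _, temp in simulate_thermostat())
```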
Another approach is to write logical formulae describing the behavior of a hybrid system and then use theorem proving technology to prove properties from the formulae. An example of an appropriate logic in which to write such formulae is differential dynamic logic, developed within the last decade along with rich tool support. Active research addresses the scalability of these techniques, since currently they support only tens of state variables, whereas an operational cyber-physical system typically has orders of magnitude more.
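For a flavor of what such a formula looks like, here is a textbook-style example in the spirit of differential dynamic logic (not drawn from the article): a car at position x with velocity v and braking rate b never passes the stopping point m, provided it starts with enough distance to brake.

```latex
% Textbook-style differential dynamic logic (dL) sketch, for flavor only:
% if the car starts with enough distance to brake, then along the braking
% dynamics it always remains at or before the stopping point m.
\[
  v \ge 0 \;\wedge\; b > 0 \;\wedge\; x + \tfrac{v^2}{2b} \le m
  \;\rightarrow\;
  \bigl[\, \{\, x' = v,\ v' = -b \ \&\ v \ge 0 \,\} \,\bigr]\; x \le m
\]
```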
Second, cyber-physical systems operate in the presence of uncertainty. This uncertainty is due to external conditions not under the system's control: Mother Nature, e.g., earthquakes, hurricanes and snowstorms; and The Human, acting mistakenly, surprisingly or maliciously.
Uncertainty is also due to system failures or approximations, such as faulty sensors and actuators, and noisy or inaccurate data. Again, the research community is exploring many approaches for modeling and reasoning about uncertainty. Common to many of these approaches is the use of probabilities, which can characterize the likelihood of a certain event happening. Probabilistic models and probabilistic logics are well studied, and their application to cyber-physical systems is an active area of research. An emerging area of relevant research is probabilistic programming, in which values of program variables are drawn at random from probability distributions and probabilistic inference computes the implicit distribution specified by the program.
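Here is a minimal, illustrative probabilistic program (not from the article), written with only the Python standard library: a latent distance is drawn from a prior, a noisy sensor reading is modeled by a Gaussian likelihood, and simple importance sampling recovers the posterior over the latent distance given an observation. All names, distributions, and numbers are invented for illustration.

```python
# Illustrative probabilistic-programming sketch: the "program" draws a latent
# true distance from a prior and models a noisy sensor reading with a Gaussian
# likelihood; importance sampling recovers the implied posterior.
import math
import random

def prior():
    """Latent true distance to an obstacle, in meters (uniform prior)."""
    return random.uniform(0.0, 10.0)

def likelihood(reading, true_distance, noise_sd=0.5):
    """Gaussian sensor-noise model: density of the observed reading."""
    z = (reading - true_distance) / noise_sd
    return math.exp(-0.5 * z * z) / (noise_sd * math.sqrt(2 * math.pi))

def posterior_mean(observed_reading, num_samples=100_000):
    """Weight prior draws by how well they explain the observation,
    then return the weighted (posterior) mean of the latent distance."""
    total_w = total_wx = 0.0
    for _ in range(num_samples):
        d = prior()
        w = likelihood(observed_reading, d)
        total_w += w
        total_wx += w * d
    return total_wx / total_w

# With a reading of 2.3 m and a uniform prior, the posterior mean lands
# near 2.3 m.
print(posterior_mean(2.3))
```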
Intelligent cyber-physical systems get much of their intelligence from machine learning, which also introduces approximation and requires probabilistic and statistical reasoning. For example, a camera acting as a sensor on a drone might use a classifier trained to determine whether the drone is too close to a wall. It is in this context that Dorsa Sadigh and Ashish Kapoor are presenting their recent paper, “Safe Control under Uncertainty with Probabilistic Signal Temporal Logic,” at the 2016 Robotics: Science and Systems conference. Their work marries formal verification and machine learning through a logic that embeds Bayesian graphical models. In their Probabilistic Signal Temporal Logic (PrSTL), we can express safety properties of control systems that take into account whether a learned predictor is likely to succeed or fail. Microsoft researchers and their academic colleagues are now investigating how to expand PrSTL into full-fledged probabilistic programs, including efficient Bayesian inference, which would allow real-time decision making. This work is part of the larger Microsoft Research Safe CPS Expedition.
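As a rough illustration of the kind of check such a logic enables (a simplified sketch, not the authors' PrSTL implementation), a single-time-step safety property might require that, under a learned Gaussian belief over the drone's distance to a wall, the probability of staying beyond a minimum clearance is at least 1 − ε. The function names, thresholds, and the Gaussian belief are assumptions for illustration.

```python
# Illustrative sketch only (not the authors' PrSTL implementation): a
# chance-constrained safety check at a single time step. A learned model
# gives a Gaussian belief over the drone's distance to a wall; the check
# passes only if P(distance > d_min) >= 1 - epsilon.
import math

def prob_distance_exceeds(mean, sd, d_min):
    """P(X > d_min) for X ~ Normal(mean, sd), via the Gaussian CDF."""
    return 0.5 * (1.0 - math.erf((d_min - mean) / (sd * math.sqrt(2.0))))

def is_safe(mean, sd, d_min=1.0, epsilon=0.05):
    """PrSTL-style probabilistic predicate for one time step."""
    return prob_distance_exceeds(mean, sd, d_min) >= 1.0 - epsilon

# A confident prediction of 2.0 m +/- 0.3 m passes; an uncertain prediction
# of 1.2 m +/- 0.8 m does not.
print(is_safe(mean=2.0, sd=0.3))   # True
print(is_safe(mean=1.2, sd=0.8))   # False
```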
Now, here is the irony. To prove the safety of cyber-physical systems, we need to deal with uncertainty. To reduce this uncertainty, cyber-physical systems are outfitted with more and more sensors to “see” their environment and to monitor their internal state. To interpret signals from a multitude of these sensors, we rely on statistical machine learning and optimization techniques. But these techniques give us only an approximation of the system’s external world that then determines its next state. And so these approximations inherently add another dimension of uncertainty. Eek!
The meeting of formal methods, programming languages, control theory, real-time systems, machine learning, optimization and robotics is exactly what we need to ensure that future cyber-physical systems are safe: They should do no harm. Much exciting research remains for the scientific and engineering communities to tackle the fundamental and practical challenges to achieve this goal.
We are moving toward a world where self-driving cars, commercial drones and mobile robots share the same physical space as humans, where embedded medical devices improve through software updates, and where machine learning routinely supports smarter decision-making by these systems. This world requires that we, as scientists and engineers, take responsibility for building cyber-physical systems that you can bet your life on.