What is it about?

Autonomous cars, smart grids, robotic systems, and medical devices are just a few examples of modern safety-critical systems that operate in uncertain environments. This poster gives an overview of how control software can be designed so that such systems behave in a desired way even in previously unseen situations. To this end, we combine formal methods and learning to provide quantified guarantees that strengthen as more learning data becomes available.
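To illustrate what "guarantees that strengthen with more data" can look like, the minimal sketch below computes a standard Hoeffding confidence bound on an unknown satisfaction probability estimated from sampled runs. This is only an illustration of the general idea, not the method of the poster; the function name, the 90% success rate, and the sample sizes are hypothetical.

```python
import math

def hoeffding_interval(successes, n, confidence=0.95):
    """Two-sided Hoeffding confidence interval for a Bernoulli mean.

    The returned (lower, upper) bounds hold with the given confidence,
    and the interval shrinks at rate O(1/sqrt(n)) as more data is used.
    """
    p_hat = successes / n
    eps = math.sqrt(math.log(2.0 / (1.0 - confidence)) / (2.0 * n))
    return max(0.0, p_hat - eps), min(1.0, p_hat + eps)

# Hypothetical illustration: with a fixed empirical success rate of 90%,
# the guaranteed lower bound improves as more runs are observed.
for n in (100, 1_000, 10_000):
    successes = int(0.9 * n)
    low, high = hoeffding_interval(successes, n)
    print(f"n={n:>6}: estimate=0.90, guaranteed lower bound={low:.3f}")
```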


Why is it important?

The rapid adoption of AI and learning-based methods has changed the face of modern technology. Autonomous cars, smart grids, robotic systems, and medical devices are just a few examples of engineered systems powered by this technology. Most of these systems operate in safety-critical environments whose operational scenarios are uncertain. Despite the undisputed impact of data-driven methods, their premature adoption can lead to severe incidents. Ensuring safe operation, or more generally, designing safety-critical systems that behave in a desired manner even when the environment is uncertain, calls for robust controllers under which the controlled systems exhibit the desired behavior with formal guarantees. There is therefore an ever-growing demand for so-called correct-by-design approaches, which give formal guarantees on the absence of any undesired behavior of the controlled system.

Read the Original

This page is a summary of: Poster Abstract: Data-Driven Correct-by-Design Control of Parametric Stochastic Systems, May 2023, ACM (Association for Computing Machinery), DOI: 10.1145/3575870.3589547.
