Resilient and safe control of cyber-physical systems under uncertainties and adversaries
The recent growth of cyber-physical systems, with a wide range of applications such as smart grids, healthcare, search and rescue, and traffic monitoring, brings new challenges to control systems due to the presence of significant uncertainties and undesired signals (i.e., disturbances and cyber-physical attacks). It is therefore of vital importance to design resilient and safe control approaches that can adapt to the situation and mitigate adversaries, ensuring an acceptable level of functionality and autonomy despite uncertainties and cyber-physical attacks.

This dissertation begins with the analysis of adversaries and the design of resilient distributed control mechanisms for multi-agent cyber-physical systems, with guaranteed performance and consensus under mild assumptions. More specifically, the adverse effects of cyber-physical attacks on the synchronization of multi-agent cyber-physical systems are first analyzed. Information-theoretic detection and mitigation methods are then presented by equipping each agent with a self-belief about the trustworthiness of its own information and trust values for its neighbors. The effectiveness of the developed approach is certified by applying it to distributed frequency and voltage synchronization of AC microgrids under data-manipulation attacks.

In the next step, to relax some of the network connectivity assumptions required for resilient control design, a distributed adaptive attack compensator is developed by estimating the normal expected behavior of agents. The compensator is augmented with the controller, and it is shown that the proposed controller achieves resilient synchronization in the presence of attacks on sensors and actuators. Moreover, this approach recovers compromised agents under actuator attacks and prevents the propagation of sensor attacks without discarding information from the compromised agents.

The problem of secure state estimation for distributed sensor networks is considered next. The adverse effects of cyber-physical attacks on distributed sensor networks are analyzed, and an attack mitigation mechanism for the event-triggered distributed Kalman filter is presented. It is shown that although event-triggered mechanisms are highly desirable, an attacker can leverage them to cause triggering misbehavior that significantly harms network connectivity and performance. An entropy-estimation-based attack detection and mitigation mechanism is then designed.

Finally, a safe reinforcement learning framework for autonomous control systems under constraints is developed. Reinforcement learning agents with pre-specified reward functions cannot provide guaranteed safety across the variety of circumstances that an uncertain system might encounter. To guarantee performance while assuring satisfaction of safety constraints across these circumstances, an assured autonomous control framework is designed by empowering reinforcement learning algorithms with meta-cognitive learning capabilities. More specifically, the reward function parameters of the reinforcement learning agent are adapted in a meta-cognitive decision-making layer to assure the feasibility of the reinforcement learning agent.
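As a rough illustration of the trust-based mitigation idea summarized above, the following Python sketch shows a resilient consensus update in which each agent down-weights neighbors whose reported values deviate sharply from its own. This is only a minimal, hypothetical example under a complete communication graph; the exponential trust update, its gain, and the constant false-data attack model are illustrative assumptions, not the dissertation's specific algorithm.

```python
import numpy as np

# Minimal sketch: resilient average consensus where each agent down-weights
# neighbors whose values deviate strongly from its own (a simple stand-in for
# the trust idea; the trust update and its gain are illustrative assumptions).

np.random.seed(0)
n = 5                                   # number of agents
A = np.ones((n, n)) - np.eye(n)         # complete graph adjacency (assumption)
x = np.random.uniform(0.0, 10.0, n)     # initial agent states
attacked = 4                            # index of the compromised agent
step = 0.1                              # consensus step size
gain = 2.0                              # trust-decay gain (illustrative)

for k in range(200):
    x[attacked] = 50.0                  # attacker broadcasts a constant false value
    x_new = x.copy()
    for i in range(n):
        if i == attacked:
            continue
        # Trust each neighbor according to how far its value is from agent i's.
        dev = np.abs(x - x[i])
        trust = np.exp(-gain * dev / (dev.mean() + 1e-9))
        w = A[i] * trust
        if w.sum() > 0:
            w = w / w.sum()
        # Trust-weighted consensus update.
        x_new[i] = x[i] + step * np.sum(w * (x - x[i]))
    x = x_new

print("healthy agents:", np.round(np.delete(x, attacked), 2))
```

With the trust weighting in place, the healthy agents converge to a common value while the constant false data injected by the compromised agent is effectively ignored; removing the trust term would let the attacker drag all agents toward the injected value.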
- In Collections: Electronic Theses & Dissertations
- Copyright Status: In Copyright
- Material Type: Theses
- Authors: Mustafa, Aquib
- Thesis Advisors: Modares, Hamidreza
- Committee Members: Mukherjee, Ranjan; Zhu, George; Li, Zhaojian
- Date Published: 2020
- Program of Study: Mechanical Engineering - Doctor of Philosophy
- Degree Level: Doctoral
- Language: English
- Pages: xiii, 173 pages
- ISBN: 9798664738759
- Permalink: https://doi.org/doi:10.25335/jt3g-wh98