Search results
(1 - 11 of 11)
- Title
- Role of flexibility in robotic fish
- Creator
- Bazaz Behbahani, Sanaz
- Date
- 2016
- Collection
- Electronic Theses & Dissertations
- Description
-
"Underwater creatures, especially fish, have received significant attention over the past several decades because of their fascinating swimming abilities and behaviors, which have inspired engineers to develop robots that propel and maneuver like real fish. This dissertation is focused on the role of flexibility in robotic fish performance, including the design, dynamic modeling, and experimental validation of flexible pectoral fins, flexible passive joints for pectoral fins, and fins with...
Show more"Underwater creatures, especially fish, have received significant attention over the past several decades because of their fascinating swimming abilities and behaviors, which have inspired engineers to develop robots that propel and maneuver like real fish. This dissertation is focused on the role of flexibility in robotic fish performance, including the design, dynamic modeling, and experimental validation of flexible pectoral fins, flexible passive joints for pectoral fins, and fins with actively controlled stiffness. First, the swimming performance and mechanical efficiency of flexible pectoral fins, connected to actuator shafts via rigid links, are studied, where it is found that flexible fins demonstrate advantages over rigid fins in speed and efficiency at relatively low fin-beat frequencies, while the rigid fins outperform the flexible fins at higher frequencies. The presented model offers a promising tool for the design of fin flexibility and swimming gait, to achieve speed and efficiency objectives for the robotic fish. The traditional rigid joint for pectoral fins requires different speeds for power and recovery strokes in order to produce net thrust and consequently results in control complexity and low speed performance. To address this issue, a novel flexible passive joint is presented where the fin is restricted to rowing motion during both power and recovery strokes. This joint allows the pectoral fin to sweep back passively during the recovery stroke while it follows the prescribed motion of the actuator during the power stroke, which results in net thrust even under symmetric actuation for power and recovery strokes. The dynamic model of a robotic fish equipped with such joints is developed and validated through extensive experiments. Motivated by the need for design optimization, the model is further utilized to investigate the influences of the joint length and stiffness on the robot locomotion performance and efficiency. 
An alternative flexible joint for pectoral fins is also proposed, which enables the pectoral fin to operate primarily in the rowing mode, while undergoing passive feathering during the recovery stroke to reduce hydrodynamic drag on the fin. A dynamic model, verified experimentally, is developed to examine the trade-off between swimming speed and mechanical efficiency in the fin design. Finally, we investigate flexible fins with actively tunable stiffness, enabled by electrorheological (ER) fluids. The tunable stiffness can be used in optimizing the robotic fish speed or maneuverability in different operating regimes. Fins with tunable stiffness are prototyped with ER fluids enclosed between layers of liquid urethane rubber (Vytaflex 10). Free oscillation and base-excited oscillation behaviors of the fins are measured underwater when different electric fields are applied for the ER fluid, which are subsequently used to develop a dynamic model for the stiffness-tunable fins."--Pages ii-iii.
- Title
- Host-symbiont coevolution in digital and microbial systems
- Creator
- Zaman, Luis
- Date
- 2014
- Collection
- Electronic Theses & Dissertations
- Description
-
Darwin's image of the entangled bank captures foremost the pervasiveness of life as it clothes the earth, but it also captures how intimately species interact and often depend on one another. This interaction is particularly pronounced for obligate parasites, whose livelihoods depend on interactions with their hosts and whose hosts often pay severely. In my thesis, I first demonstrate how antagonistic coevolution in Avida leads to a diverse set of interacting host and parasite phenotypes: a digital entangled bank. Second, I show how further evolution is embedded within this community context by studying the coevolution of complexity driven by parasites' population genetic memory -- where the diversifying community of parasites "remembers" previously evolved hosts. Continuing to study the intersection of coevolution and community ecology, I investigate the structure of communities produced by the coevolutionary process in Avida. I show that a nested structure of interactions is common in our experiments, which is the same structure often found in natural host-parasite and plant-pollinator communities as well as many phage-bacteria interaction networks. In addition, I show that "growing" networks are nested by virtue of the process of incrementally adding nodes and edges. Thus, coevolution is expected to produce significantly nested communities when compared to random networks. However, the coevolved digital host-parasite networks are significantly more nested than expected from this neutral growth process. The interactions between hosts and their intimately interacting partners are not just parasitic; instead they span a broad range and include many mutualistic interactions. In the last section of my thesis, I study evolution and coevolution along the parasitism-mutualism continuum using a temperate λ phage system that provides its host with access to an otherwise unavailable metabolic pathway.
Instead of evolving more mutualistic phage as I predicted, both the phage and bacteria evolved cheating strategies.
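The notion of nestedness invoked in this abstract has a simple all-or-nothing form that can be sketched in a few lines. This is an illustrative toy check, not the dissertation's analysis pipeline; empirical studies of host-parasite networks typically use graded nestedness metrics such as NODF rather than a strict test.

```python
def is_perfectly_nested(matrix):
    """Toy check for perfect nestedness of a binary interaction
    matrix (rows: parasites, columns: hosts they can infect).

    After sorting rows from most to fewest interactions, every
    sparser row's partner set must be contained in the denser
    row above it.
    """
    rows = sorted(
        (frozenset(j for j, v in enumerate(r) if v) for r in matrix),
        key=len, reverse=True,
    )
    # Subset containment between consecutive rows implies, by
    # transitivity, containment across the whole chain.
    return all(small <= big for big, small in zip(rows, rows[1:]))
```

For example, a community with one generalist parasite and specialists infecting nested subsets of hosts passes the check, while two specialists with disjoint host ranges do not.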
- Title
- Applying evolutionary computation techniques to address environmental uncertainty in dynamically adaptive systems
- Creator
- Ramirez, Andres J.
- Date
- 2013
- Collection
- Electronic Theses & Dissertations
- Description
-
A dynamically adaptive system (DAS) observes itself and its execution environment at run time to detect conditions that warrant adaptation. If an adaptation is necessary, then a DAS changes its structure and/or behavior to continuously satisfy its requirements, even as its environment changes. It is challenging, however, to systematically and rigorously develop a DAS due to environmental uncertainty. In particular, it is often infeasible for a human to identify all possible combinations of system and environmental conditions that a DAS might encounter throughout its lifetime. Nevertheless, a DAS must continuously satisfy its requirements despite the threat that this uncertainty poses to its adaptation capabilities. This dissertation proposes a model-based framework that supports the specification, monitoring, and dynamic reconfiguration of a DAS to explicitly address uncertainty. The proposed framework uses goal-oriented requirements models and evolutionary computation techniques to derive and fine-tune utility functions for requirements monitoring in a DAS, identify combinations of system and environmental conditions that adversely affect the behavior of a DAS, and generate adaptations on-demand to transition the DAS to a target system configuration while preserving system consistency. We demonstrate the capabilities of our model-based framework by applying it to an industrial case study involving a remote data mirroring network that efficiently distributes data even as network links fail and messages are dropped, corrupted, and delayed.
- Title
- Using Eventual Consistency to Improve the Performance of Distributed Graph Computation In Key-Value Stores
- Creator
- Nguyen, Duong Ngoc
- Date
- 2021
- Collection
- Electronic Theses & Dissertations
- Description
-
Key-value stores have gained increasing popularity due to their fast performance and simple data model. A key-value store usually consists of multiple replicas located in different geographical regions to provide higher availability and fault tolerance. Consequently, a protocol is employed to ensure that data are consistent across the replicas. The CAP theorem states the impossibility of simultaneously achieving three desirable properties in a distributed system, namely consistency, availability, and network partition tolerance. Since failures are a norm in distributed systems and the capability to maintain the service at an acceptable level in the presence of failures is a critical dependability and business requirement of any system, the partition tolerance property is a necessity. Consequently, the trade-off between consistency and availability (performance) is inevitable. Strong consistency is attained at the cost of slow performance and fast performance is attained at the cost of weak consistency, resulting in a spectrum of consistency models suitable for different needs. Among the consistency models, sequential consistency and eventual consistency are two common ones. The former is easier to program with but suffers from poor performance, whereas the latter suffers from potential data anomalies while providing higher performance. In this dissertation, we focus on the problem of what a designer should do if he/she is asked to solve a problem on a key-value store that provides eventual consistency. Specifically, we are interested in the approaches that allow the designer to run his/her applications on an eventually consistent key-value store and handle data anomalies if they occur during the computation. To that end, we investigate two options: (1) using the detect-rollback approach, and (2) using the stabilization approach.
In the first option, the designer identifies a correctness predicate, say $\Phi$, and continues to run the application as if it were running on sequential consistency, while our system monitors $\Phi$. If $\Phi$ is violated (because the underlying key-value store provides eventual consistency), the system rolls back to a state where $\Phi$ holds and the computation is resumed from there. In the second option, the data anomalies are treated as state perturbations and handled by the convergence property of stabilizing algorithms. We choose LinkedIn's Voldemort key-value store as the example key-value store for our study. We run experiments with several graph-based applications on the Amazon AWS platform to evaluate the benefits of the two approaches. From the experiment results, we observe that overall, both approaches provide benefits to the applications when compared to running the applications on sequential consistency. However, stabilization provides higher benefits, especially in the aggressive stabilization mode, which trades more perturbations for no locking overhead. The results suggest that while there is some cost associated with making an algorithm stabilizing, there may be a substantial benefit in revising an existing algorithm for the problem at hand to make it stabilizing and reduce the overall runtime under eventual consistency. There are several directions of extension. For the detect-rollback approach, we are working to develop a more general rollback mechanism for the applications and improve the efficiency and accuracy of the monitors. For the stabilization approach, we are working to develop an analytical model for the benefits of eventual consistency in stabilizing programs. Our current work focuses on silent stabilization and we plan to extend our approach to other variations of stabilization.
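The detect-rollback idea can be sketched in a few lines. The names below (`run_with_detect_rollback`, `step`, `phi`) are illustrative only, and a real monitor checks a predicate over distributed replica state rather than a local value; this sketch just shows the checkpoint/detect/rollback loop.

```python
import copy

def run_with_detect_rollback(state, step, phi, max_steps):
    """Run `step` repeatedly; whenever the correctness predicate
    `phi` is violated (e.g., by a stale read under eventual
    consistency), restore the last checkpoint where `phi` held
    and resume the computation from there."""
    checkpoint = copy.deepcopy(state)      # last state known to satisfy phi
    for _ in range(max_steps):
        state = step(state)
        if phi(state):
            checkpoint = copy.deepcopy(state)   # commit progress
        else:
            state = copy.deepcopy(checkpoint)   # roll back, then resume
    return state
```

A toy run: with a step function that injects an anomaly (a state violating `phi`) every third call, the computation still ends in a valid state because each anomaly is detected and rolled back.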
- Title
- ASSURING THE ROBUSTNESS AND RESILIENCY OF LEARNING-ENABLED AUTONOMOUS SYSTEMS
- Creator
- Langford, Michael Austin
- Date
- 2022
- Collection
- Electronic Theses & Dissertations
- Description
-
As Learning-Enabled Systems (LESs) have become more prevalent in safety-critical applications, addressing the assurance of LESs has become increasingly important. Because machine learning models in LESs are not explicitly programmed like traditional software, developers typically have less direct control over the inferences learned by LESs, relying instead on semantically valid and complete patterns to be extracted from the system’s exposure to the environment. As such, the behavior of an LES is strongly dependent on the quality of its training experience. However, run-time environments are often noisy or not well-defined. Uncertainty in the behavior of an LES can arise when there is inadequate coverage of relevant training/test cases (e.g., corner cases). It is challenging to assure safety-critical LESs will perform as expected when exposed to run-time conditions that have never been experienced during training or validation. This doctoral research contributes automated methods to improve the robustness and resilience of an LES. For this work, a robust LES is less sensitive to noise in the environment, and a resilient LES is able to self-adapt to adverse run-time contexts in order to mitigate system failure. The proposed methods harness diversity-driven evolution-based methods, machine learning, and software assurance cases to train robust LESs, uncover robust system configurations, and foster resiliency through self-adaptation and predictive behavior modeling. This doctoral work demonstrates these capabilities by applying the proposed framework to deep learning and autonomous cyber-physical systems.
- Title
- Achieving reliable distributed systems : through efficient run-time monitoring and predicate detection
- Creator
- Tekken Valapil, Vidhya
- Date
- 2020
- Collection
- Electronic Theses & Dissertations
- Description
-
Runtime monitoring of distributed systems to perform predicate detection is a critical as well as challenging task. It is critical because it ensures the reliability of the system by detecting all possible violations of system requirements. It is challenging because, to guarantee the absence of violations, one has to analyze every possible ordering of system events, and this is an expensive task. In this report, we focus on ordering events in a system run using HLC (Hybrid Logical Clocks) timestamps, which are O(1)-sized timestamps, and present some efficient algorithms to perform predicate detection using HLC. Since, with HLC, the runtime monitor cannot find all possible orderings of system events, we present a new type of clock called Biased Hybrid Logical Clocks (BHLC) that are capable of finding more possible orderings than HLC. Thus we show that BHLC-based predicate detection can find more violations than HLC-based predicate detection. Since predicate detection based on both HLC and BHLC does not guarantee detection of all possible violations in a system run, we present an SMT (Satisfiability Modulo Theories) solver-based predicate detection approach that guarantees the detection of all possible violations in a system run. While a runtime monitor that performs predicate detection using SMT solvers is accurate, the time taken by the solver to detect the presence or absence of a violation can be high. To reduce the time taken by the runtime monitor, we propose the use of an efficient two-layered monitoring approach, where the first layer of the monitor is efficient but less accurate and the second layer is accurate but less efficient. Together they drastically reduce the overall time taken to perform predicate detection and also guarantee detection of all possible violations.
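The O(1)-sized HLC timestamps mentioned above pair a physical-clock reading `l` with a logical counter `c` that breaks ties. A minimal sketch of the standard send/receive update rules follows; the class name is illustrative, and the physical clock is injected as a callable so the example is deterministic (a real deployment would use the node's system clock).

```python
class HLC:
    """Minimal Hybrid Logical Clock: a (l, c) timestamp where `l`
    tracks the largest physical time seen and `c` counts logical
    events that share the same `l`."""

    def __init__(self, now):
        self.now = now   # physical-clock source, e.g. lambda: time_ms()
        self.l = 0
        self.c = 0

    def send(self):
        """Timestamp a local or send event."""
        l_old = self.l
        self.l = max(l_old, self.now())
        self.c = self.c + 1 if self.l == l_old else 0
        return (self.l, self.c)

    def recv(self, lm, cm):
        """Merge an incoming message timestamp (lm, cm)."""
        l_old = self.l
        self.l = max(l_old, lm, self.now())
        if self.l == l_old == lm:
            self.c = max(self.c, cm) + 1
        elif self.l == l_old:
            self.c += 1
        elif self.l == lm:
            self.c = cm + 1
        else:
            self.c = 0
        return (self.l, self.c)
```

Comparing timestamps lexicographically by `(l, c)` respects causality: a message's receive event is always ordered after its send event, even when both nodes report the same physical time.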
- Title
- The evolution of digital communities under limited resources
- Creator
- Walker, Bess Linden
- Date
- 2012
- Collection
- Electronic Theses & Dissertations
- Description
-
Schluter (1996) describes adaptive radiation as "the diversification of a lineage into species that exploit a variety of different resource types and that differ in the morphological or physiological traits used to exploit those resources". My research focuses on adaptive radiation in the context of limited resources, where frequency-dependence is an important driver of selection (Futuyma & Moreno, 1988; Dieckmann & Doebeli, 1999; Friesen et al., 2004). Adaptive radiation yields a community composed of distinct organism types adapted to specific niches. I study simple communities of digital organisms, the result of adaptive radiation in environments with limited resources. I ask (and address) the questions: How does diversity, driven by resource limitation, affect the frequency with which complex traits arise? What other aspects of the evolutionary pressures in this limited resource environment might account for the increase in frequency with which complex traits arise? Can we predict community stability when it encounters another community, and is our prediction different for communities resulting from adaptive radiation versus those that are artificially assembled? Community diversity is higher in environments with limited resources than in those with unlimited resources. The evolution of an example complex feature (in this case, Boolean EQU) is also more common in limited-resource environments, and shows a strong correlation with diversity over a range of resource inflow rates. I show that populations evolving in intermediate inflow rates explore areas of the fitness landscape in which EQU is common, and that those in unlimited resource environments do not. Another feature of the limited-resource environments is the reduced cost of trading off the execution of building block tasks for higher-complexity tasks.
I find strong causal evidence that this reduced cost is a factor in the more common evolution of EQU in limited-resource environments. When two communities meet in competition, the fraction of each community's descendants making up the final post-competition community is strongly consistent across replicates. I find that three community-level factors, ecotypic diversity, community composition, and resource use efficiency, can be used to predict this fractional community success, explaining up to 35% of the variation. In summary, I demonstrate the value of digital communities as a tractable experimental system for studying general community properties. They sit at the bridge between ecology, evolutionary biology, and evolutionary computation, and offer comprehensible ways to translate ideas across these fields.
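For readers unfamiliar with Avida-style tasks, the Boolean EQU feature mentioned above is bitwise equivalence (NOT XOR) of two inputs, the most complex task in the standard logic-task environment. A one-function sketch (the 32-bit default width is an assumption of typical Avida configurations):

```python
def equ(a, b, bits=32):
    """Boolean EQU as in Avida-style logic tasks: bitwise
    equivalence (NOT XOR) of two inputs, masked to `bits` bits.
    Each output bit is 1 exactly where a and b agree."""
    return ~(a ^ b) & ((1 << bits) - 1)
```

An organism is rewarded for EQU only when its output matches this function of its inputs, which is why EQU serves as a benchmark "complex trait": it cannot be produced without composing simpler logic operations.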
- Title
- Harnessing evolutionary computation for the design and generation of adaptive embedded controllers within the context of uncertainty
- Creator
- Byers, Chad Michael
- Date
- 2015
- Collection
- Electronic Theses & Dissertations
- Description
-
A critical challenge for the design of embedded controllers is incorporating desirable qualities such as robustness, fault tolerance, and adaptability into the control process in order to respond to dynamic environmental conditions. An embedded controller governs the execution of a task-specific system by monitoring information from its environment via sensors and producing an appropriate response through the system's actuators, often independent of any supervisory control. For a human developer, identifying the set of all possible combinations of conditions a system might experience and designing a solution to accommodate this set is burdensome, costly, and often, infeasible. To alleviate this burden, a variety of techniques have been explored to automate the generation of embedded controller solutions. In this dissertation, we focus on the bio-inspired technique referred to as evolutionary computation where we harness evolution's power as a population-based, global search technique to build up good behavioral components. In this way, evolution naturally selects for these desirable qualities in order for a solution to remain competitive over time in the population. Often, these search techniques operate in the context of uncertainty where aspects of the (1) problem domain, (2) solution space, and (3) search process itself are subject to variation and change. To mitigate issues associated with uncertainty in the problem domain, we propose the digital enzyme, a biologically-inspired model that maps the complexity of both the environment and the system into the space of values rather than instructions. To address uncertainty in the solution space, we remove constraints in our initial digital enzyme model to allow the genome structure to be dynamic and open-ended, accommodating a wider range of evolved solution designs.
Finally, to help explore the inherent uncertainty that exists in the search process itself, we uncover a hidden feature interaction involving the diversity-preserving search operator of a popular evolutionary algorithm, and propose a new way to use niching as a means to mitigate its unwanted effects and bias on search.
- Title
- Mitigating uncertainty at design time and run time to address assurance for dynamically adaptive systems
- Creator
- Fredericks, Erik M.
- Date
- 2015
- Collection
- Electronic Theses & Dissertations
- Description
-
A dynamically adaptive system (DAS) is a software system that monitors itself and its environment at run time to identify conditions that require self-reconfiguration to ensure that the DAS continually satisfies its requirements. Self-reconfiguration enables a DAS to change its configuration while executing to mitigate unexpected changes. While it is infeasible for an engineer to enumerate all possible conditions that a DAS may experience, the DAS must still deliver acceptable behavior in all situations. This dissertation introduces a suite of techniques that addresses assurance for a DAS in the face of both system and environmental uncertainty at different levels of abstraction. We first present a technique for automatically incorporating flexibility into system requirements for different configurations of environmental conditions. Second, we describe a technique for exploring the code-level impact of uncertainty on a DAS. Third, we discuss a run-time testing feedback loop to continually assess DAS behavior. Lastly, we present two techniques for introducing adaptation into run-time testing activities. We demonstrate these techniques with applications from two different domains: an intelligent robotic vacuuming system that must clean a room safely and efficiently and a remote data mirroring network that must efficiently and effectively disseminate data throughout the network. We also provide an end-to-end example demonstrating the effectiveness of each assurance technique as applied to the remote data mirroring application.
- Title
- The evolution of division of labor in digital organisms
- Creator
- Goldsby, Heather J.
- Date
- 2011
- Collection
- Electronic Theses & Dissertations
- Description
-
Division of labor is a hallmark strategy employed by a wide variety of groups ranging in complexity from bacteria to human economies. Within these groups, some individuals, such as worker ants, sacrifice their ability to reproduce and instead dedicate their lives to the maintenance of the colony and success of their kin. A worker ant may spend its entire life performing a single task, such as defending the colony or tending to the brood. The complexity of the strategies employed by these groups, combined with their rampant success, gives rise to questions regarding why division of labor exists. While extensive research has been done to better understand the patterns and mechanisms of division of labor, exploring this topic in an evolutionary context remains challenging to study due to the slow pace of evolution and imperfect historical data. Understanding how and why division of labor arises is pertinent not just for understanding biological phenomena, but also as a means to enable evolutionary computation techniques to address complex problems using problem decomposition. The objective of problem-decomposition approaches is to have a group of individuals cooperatively solve a complex task by breaking it into pieces, having specialist individuals solve the pieces, and reassembling the solution. Essentially, problem-decomposition approaches use division of labor to enable groups to solve more challenging problems than any individual could alone. Unfortunately, human engineers have struggled with creating effective, automated problem-decomposition approaches. In this dissertation, I use digital evolution (i.e., populations of self-replicating computer programs that undergo open-ended evolution) to investigate questions related to the evolution of division of labor and to apply these insights to problem decomposition techniques.
This dissertation has three primary components: First, we provide experimental evidence that evolutionary computation techniques can evolve groups of individuals that exhibit division of labor. Second, we explore two hypotheses for the evolution of division of labor. Specifically, we find support for the hypothesis that temporal polyethism (i.e., where a worker's age is related to the task it performs within the colony) may result from the evolutionary pressures of aging and risks associated with tasks. Additionally, we find support for a hypothesis initially proposed by the economist Adam Smith: that the presence of task-switching costs results in an increase in the amount of division of labor exhibited by groups. Third, we describe how our analyses revealed that groups of organisms evolved as part of our task-switching work exhibit complex problem decomposition strategies that can potentially be applied to other evolutionary computation challenges. This work both informs biological studies of division of labor and also provides insights that can enable the development of new mechanisms for using evolutionary computation to solve increasingly complex engineering problems.
- Title
- Using formal analysis and search-based techniques to address the assurance of cyber-physical systems at the requirements level
- Creator
- DeVries, Byron
- Date
- 2017
- Collection
- Electronic Theses & Dissertations
- Description
-
For high-assurance cyber-physical systems (CPS), such as the onboard features in modern transportation systems (e.g., automobiles, trains, and flight systems), ensuring acceptable and safe behavior is of paramount importance. Furthermore, the increasing complexity and number of onboard features for autonomous vehicles further exacerbate the challenge of guaranteeing safe behavior. The operation of these high-assurance cyber-physical systems depends on the specification, implementation, and verification of those systems. Obstacles to assessing and ensuring assurance for cyber-physical system requirements may occur in many forms, but two significant sources of specification errors are incomplete requirements specifications and undesired feature interactions. In the case of incomplete requirements, it can be challenging to enumerate all the decomposed requirements necessary to satisfy a requirement (i.e., ensuring completeness), especially when considering different combinations of environmental conditions. A feature interaction occurs when two or more features satisfy specific properties in isolation, but no longer satisfy those properties when they are composed together. It may be necessary to analyze an exponential number of feature combinations to detect all possible interactions, resulting in a potentially exponential number of feature interaction results presented to the system developer. Furthermore, the uncertainty created by unexpected system and environmental scenarios exacerbates already difficult requirements specification problems, many of which involve an exhaustive search for errors and their causes. That is, the exponential number of possibilities represents not only computational growth but also growth in the effort it takes the system designer to assess the results. This doctoral research tackles two key requirements assurance problems that exhibit these characteristics: requirements incompleteness and undesired feature interactions.
The work explores how formal analysis and search-based techniques can be used in a complementary and synergistic fashion to address the assurance of cyber-physical systems facing environmental and system uncertainty, both at design time and run time. Industrial applications are used to demonstrate the respective techniques.
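The feature-interaction definition in this abstract — features that satisfy their properties alone but not when composed — has a direct brute-force form. The sketch below is illustrative only: the function names are hypothetical, the `satisfies` oracle stands in for a formal analysis (e.g., a model checker), and the pairwise loop shows why exhaustive k-way search blows up exponentially.

```python
from itertools import combinations

def find_pairwise_interactions(features, satisfies):
    """Naive pairwise feature-interaction detection.

    `satisfies(subset)` is an assumed oracle reporting whether the
    given combination of features still meets its required
    properties. A pair is flagged as an interaction when each
    feature passes in isolation but their composition fails.
    Checking all k-way subsets instead of pairs requires
    exponentially many oracle calls.
    """
    alone_ok = [f for f in features if satisfies((f,))]
    return [pair for pair in combinations(alone_ok, 2)
            if not satisfies(pair)]
```

For instance, with a toy oracle in which cruise control and collision braking each pass alone but conflict when composed, only that pair is reported.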