In an increasingly interconnected and cyber-physical world, the ability to coherently measure and manage complexity is vital for the engineering design and systems engineering community. While numerous complexity measures (CMs) have been promulgated over the years, these measures disagree substantially about how complexity should be measured, and to date there has been no systematic comparison across them. In this paper, we propose a framework for benchmarking CMs in terms of their alignment with commonly-held beliefs in the literature: that a measure of complexity should increase with system size or the level of interconnections, and should decrease with structuring of the architecture. We adopt a design of experiments approach and synthetically create system architectures that systematically vary across these three dimensions. We use this framework as a shared test-bed to document the response of six CMs that are representative of the predominant perspectives in the literature. We find that none of the measures fully satisfies the commonly-held beliefs of the literature. We also find a dichotomy in the literature regarding the archetype of systems considered complex: physics-based (e.g., aircraft) or flow-based (e.g., the power grid), and the intellectual origin of a CM often determines which system characteristics it treats as more complex. Our findings show that we are far from convergence. Our framework provides a path toward better cross-validation as the community progresses to a more complete understanding of the complexity phenomenon.
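The abstract does not specify how the synthetic architectures or the CMs are implemented, so the following is only a minimal sketch of the kind of experiment described, assuming architectures are represented as random binary design structure matrices (DSMs); the generator synth_architecture and the measure toy_complexity are hypothetical illustrations, not the paper's actual choices.

```python
import numpy as np

def synth_architecture(n, density, modules=1, cross_keep=0.2, rng=None):
    """Hypothetical generator: random binary DSM with n components.
    With modules > 1, most cross-module links are removed, mimicking
    architectural structuring."""
    rng = rng if rng is not None else np.random.default_rng(0)
    a = (rng.random((n, n)) < density).astype(int)
    np.fill_diagonal(a, 0)
    if modules > 1:
        labels = (np.arange(n) * modules) // n      # module id per component
        cross = labels[:, None] != labels[None, :]  # cross-module positions
        a[cross & (rng.random((n, n)) >= cross_keep)] = 0
    return a

def toy_complexity(a):
    """Placeholder measure: size plus interconnection count scaled by the
    spectral radius of the adjacency matrix (purely illustrative)."""
    n = a.shape[0]
    return n + a.sum() * np.abs(np.linalg.eigvals(a)).max() / n

# Vary one dimension (size) while density and structuring stay fixed.
for n in (8, 16, 32, 64):
    print(n, round(toy_complexity(synth_architecture(n, density=0.3)), 2))
```

Under these assumptions, a measure's response to each swept dimension can be compared against the direction of change the literature expects.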
Decomposition is a dominant design strategy because it enables complex problems to be broken into loosely-coupled modules that are easier to manage and can be designed in parallel. However, contrary to widely held expectations, we show that complexity can increase substantially when natural system modules are fully decoupled from one another to support parallel design. Drawing on detailed empirical evidence from a NASA space robotics field experiment, we explain how new information is introduced into the design space through three complexity-addition mechanisms of the decomposition process: interface creation, functional allocation, and second-order effects. These findings have important implications for how modules are selected early in the design process and how future decomposition approaches should be developed. Although it is well known that complex systems are rarely fully decomposable and that the decoupling process necessitates additional design work, the literature is predominantly focused on reordering, clustering, and grouping-based approaches that define module boundaries within a fixed system representation. Consequently, these approaches cannot account for the (often significant) new information added to the design space through the decomposition process. We contend that the observed mechanisms of complexity growth need to be better accounted for during module selection in order to avoid unexpected downstream costs. With this work, we lay a foundation for valuing these complexity-induced impacts on performance, schedule, and cost earlier in the decomposition process.
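The three mechanisms above are identified empirically rather than algorithmically, but the interface-creation mechanism lends itself to a toy illustration. The sketch below is a hypothetical DSM manipulation (the element counts and the decoupling step are invented for illustration, not the paper's data): cutting a direct cross-module dependency and standing up explicit interface elements grows both the element count and the link count of the design space.

```python
import numpy as np

def decouple(dsm, i, j):
    """Replace the direct dependency dsm[i, j] between two modules with
    two new interface elements (one per module side), so each module can
    be designed against the interface instead of the other module."""
    n = dsm.shape[0]
    out = np.zeros((n + 2, n + 2), dtype=int)
    out[:n, :n] = dsm
    out[i, j] = 0                      # cut the direct cross-module link
    out[i, n] = out[n, i] = 1          # element i <-> interface element A
    out[n, n + 1] = out[n + 1, n] = 1  # interface A <-> interface B
    out[n + 1, j] = out[j, n + 1] = 1  # interface B <-> element j
    return out

# Two 2-element modules with one cross-module dependency (element 1 -> 2).
dsm = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [0, 0, 0, 1],
                [0, 0, 1, 0]])
decoupled = decouple(dsm, 1, 2)
print("elements:", dsm.shape[0], "->", decoupled.shape[0])
print("links:   ", dsm.sum(), "->", decoupled.sum())
```

In this toy case the element count grows from 4 to 6 and the link count from 6 to 11, even though the two modules are now fully decoupled from each other.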
In an increasingly interconnected and cyber-physical world, the ability to coherently measure and manage complexity is vital for the engineering design and systems engineering community. To this end, numerous measures have been promulgated in the literature, yet they differ in their intellectual foundations and perspectives, with limited cross-validation among them. In this paper, we propose a framework for benchmarking the status quo of existing complexity measurement approaches in terms of their alignment with commonly-held beliefs in the literature. We find that the literature broadly suggests an understanding of complexity based on a system's size, number of interconnections, and architectural structure. We adopt a design of experiments approach and synthetically create system architectures to mimic variation across these dimensions. We then use these architectures as a shared test-bed to document the response of four complexity measures that are representative of the predominant perspectives in the literature. We do this by evaluating the change in a complexity measure's output as we incrementally vary the level of one system architecture property believed to affect complexity while holding the others constant. We find that none of the measures fully satisfies the commonly-held beliefs of the literature, and we discuss the underlying factors that lead to these discrepancies. We note that multiple independent discussions coexist in the literature, with little cohesion and communication across the groups, suggesting that further research is required to understand the interactions and influences among these communities. To this end, our rigorous, structured, and literature-grounded benchmarking approach can serve as a testbed for the development and verification of future architectural assessment tools and measures.
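As a purely illustrative companion, the fragment below sketches the one-factor-at-a-time protocol the abstract describes: sweep a single architecture property while holding the others constant, and record whether each measure responds in the direction the literature expects. The generator random_dsm and the two stand-in measures are assumptions made for the sake of a runnable example, not the four measures studied in the paper.

```python
import numpy as np

def random_dsm(n, density, seed=0):
    """Hypothetical generator: random binary DSM with no self-links.
    A fixed seed couples the sweep, so higher density only adds links."""
    a = (np.random.default_rng(seed).random((n, n)) < density).astype(int)
    np.fill_diagonal(a, 0)
    return a

# Stand-ins for measures with different intellectual origins:
# one simply counts structure, the other looks at the spectrum.
measures = {
    "link_count": lambda a: float(a.sum()),
    "spectral_radius": lambda a: float(np.abs(np.linalg.eigvals(a)).max()),
}

def monotone_increasing(vals, tol=1e-9):
    return all(b >= a - tol for a, b in zip(vals, vals[1:]))

# Vary interconnection density only; size (n = 32) and structure held fixed.
densities = (0.1, 0.2, 0.3, 0.4, 0.5)
for name, measure in measures.items():
    response = [measure(random_dsm(32, d)) for d in densities]
    verdict = "matches belief" if monotone_increasing(response) else "violates belief"
    print(f"{name}: {[round(v, 2) for v in response]} -> {verdict}")
```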
The atmosphere of Venus is an exciting destination for both further scientific study and future human exploration. A recent internal NASA study of a High Altitude Venus Operational Concept (HAVOC) led to the development of an evolutionary program for the exploration of Venus, with a focus on the mission architecture and vehicle concept for a 30-day crewed mission into Venus's atmosphere at an altitude of 50 km. Key technical challenges for the mission include performing the aerocapture maneuvers at Venus and Earth, inserting and inflating the airship at Venus during the entry sequence, and protecting the solar panels and structure from the sulfuric acid in the atmosphere. Two proofs of concept were identified that would aid in addressing some of these challenges. To mitigate the threat posed by the ambient sulfuric acid in the atmosphere of Venus, a material was needed that could protect the systems while remaining lightweight and not inhibiting the performance of the solar panels. The first proof of concept identified and evaluated candidate materials, finding that FEP Teflon maintains 90% transmittance over the relevant spectra even after 30 days of immersion in concentrated sulfuric acid. The second proof of concept developed and verified a packaging algorithm for the airship envelope to inform the entry, descent, and inflation analysis.