In an increasingly interconnected and cyber-physical world, the ability to coherently measure and manage complexity is vital for the engineering design and systems engineering community. While numerous complexity measures (CMs) have been promulgated over the years, they disagree greatly about how complexity should be measured, and to date there has been no systematic comparison across these CMs. In this paper, we propose a framework for benchmarking CMs in terms of their alignment with commonly held beliefs in the literature: that a measure of complexity should detect increases in complexity with increasing size or level of interconnection, and should decrease with structuring of the architecture. We adopt a design of experiments approach and synthetically create system architectures that vary systematically across these three dimensions. We use this framework as a shared test-bed to document the response of six CMs that are representative of the predominant perspectives in the literature. We find that none of the measures fully satisfies the commonly held beliefs of the literature. We also find that there is a dichotomy in the literature regarding the archetype of systems that are considered complex: physics-based (e.g., aircraft) or flow-based (e.g., the power grid), and the intellectual origin of a CM often determines which system characteristics are considered more complex. Our findings show that we are far from convergence. Our framework provides a path to enable better cross-validation as the community progresses towards a more complete understanding of the complexity phenomenon.
Prior advances in systems engineering (SE) theory were instrumental in defining the discipline and its tools, but are limited in perspective. The SE community needs new theoretical advances to address its existing and emerging sociotechnical challenges. This communication paper is a product of an NSF/SERC/INCOSE-funded workshop on theory building in SE with a focus on the use of abstraction and elaboration. The overarching goals of the workshop were twofold. The first was to illustrate the nuances and complexities of the theory-building process, with an emphasis on developing theory about the SE discipline rather than about a particular system of interest. The second was to stimulate a new wave of SE theory by providing a set of frames that could be used for formal theory articulation in follow-up research. The workshop focused on abstraction and elaboration due to their richness and pervasiveness in SE, and investigated the concept in two interrelated contexts: (a) the social coordination of design teams and (b) its strategic use during system design. For each research context, we offer a structured guideline for theorizing, and provide an understanding and framing of some of the generalized uses of abstraction and elaboration in SE. The research frames articulated in this workshop could serve as a first step for future research on SE theory, which necessitates hypothesis generation, data collection, validation, and refinement.
Decomposition is a dominant design strategy because it enables complex problems to be broken into loosely coupled modules that are easier to manage and can be designed in parallel. However, contrary to widely held expectations, we show that complexity can increase substantially when natural system modules are fully decoupled from one another to support parallel design. Drawing on detailed empirical evidence from a NASA space robotics field experiment, we explain how new information is introduced into the design space through three complexity-addition mechanisms of the decomposition process: interface creation, functional allocation, and second-order effects. These findings have important implications for how modules are selected early in the design process and how future decomposition approaches should be developed. Although it is well known that complex systems are rarely fully decomposable and that the decoupling process necessitates additional design work, the literature is predominantly focused on reordering, clustering, and/or grouping-based approaches to define module boundaries within a fixed system representation. Consequently, these approaches are unable to account for the (often significant) new information that is added to the design space through the decomposition process. We contend that the observed mechanisms of complexity growth need to be better accounted for during the module selection process in order to avoid unexpected downstream costs. With this work we lay a foundation for valuing these complexity-induced impacts to performance, schedule, and cost earlier in the decomposition process.