Among the tenets of Smart Manufacturing (SM) or Industry 4.0 (I4.0), the digital twin (DT), which represents the capabilities of virtual representations of components and systems, has been cited as the biggest technology trend disrupting engineering and design today. DTs have been in use for years in areas such as model-based process control and predictive maintenance; however, moving forward, a framework is needed that will support the expected pervasiveness of DT technology in the evolution of SM and I4.0. A set of requirements for a DT framework has been derived from analysis of DT definitions, DTs in use today, expected DT applications in the near future, and longer-term DT trends and the DT vision in SM. These requirements include elements of re-usability, interoperability, interchangeability, maintainability, extensibility, and autonomy across the entire DT lifecycle. A baseline framework for DT technology has been developed that addresses many aspects of these requirements and enables them to be addressed more fully through additional specification. The baseline framework includes a definition of a DT and an object-oriented (O-O) architecture for DTs that defines generalization, aggregation, and instantiation of DT classes. Case studies using and extending the baseline framework illustrate its advantages in supporting DT solutions and trends in SM.
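The O-O relationships named in the abstract (generalization, aggregation, and instantiation of DT classes) can be sketched in a few lines. This is a minimal illustration, not the paper's actual architecture; all class, attribute, and asset names here are hypothetical.

```python
# Hypothetical sketch of an O-O DT class architecture: generalization via
# subclassing, aggregation via containment, instantiation via constructing
# twins bound to specific physical assets.
class DigitalTwin:
    """Generalized DT class: shared identity and state-synchronization interface."""
    def __init__(self, asset_id):
        self.asset_id = asset_id
        self.state = {}

    def update(self, telemetry):
        # Synchronize the virtual state with data from the physical asset.
        self.state.update(telemetry)


class PumpTwin(DigitalTwin):
    """Generalization: a PumpTwin is-a DigitalTwin with domain behavior."""
    def health(self):
        return "degraded" if self.state.get("vibration", 0.0) > 1.0 else "ok"


class LineTwin(DigitalTwin):
    """Aggregation: a production-line DT contains component DTs."""
    def __init__(self, asset_id, components):
        super().__init__(asset_id)
        self.components = components


# Instantiation: concrete twins for specific physical assets.
pump = PumpTwin("pump-07")
line = LineTwin("line-A", [pump])
pump.update({"vibration": 1.4})
```

The generalized base class is what gives the framework its re-usability and extensibility properties: new asset types extend `DigitalTwin` rather than re-implementing the synchronization interface.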
Smart manufacturing (SM) is a term generally applied to the improvement of manufacturing operations through the integration of systems, the linking of physical and cyber capabilities, and the leveraging of information, including the big data evolution. SM adoption has been occurring unevenly across industries; thus, there is an opportunity to look to other industries to determine solution and roadmap paths for industries such as biochemistry or biology. The big data evolution affords an opportunity for managing significantly larger amounts of information and acting on it with analytics for improved diagnostics and prognostics. The analytics approaches can be defined in terms of dimensions to understand their requirements and capabilities, and to determine technology gaps. The semiconductor manufacturing industry has been taking advantage of the big data and analytics evolution by improving existing capabilities, such as fault detection, and supporting new capabilities, such as predictive maintenance. For most of these capabilities: (1) data quality is the most important big data factor in delivering high-quality solutions; and (2) incorporating subject matter expertise in analytics is often required for realizing effective on-line manufacturing solutions. In the future, an improved big data environment incorporating smart manufacturing concepts such as the digital twin will further enable analytics; however, it is anticipated that the need for incorporating subject matter expertise in solution design will remain.
In this paper we discuss the modelling and control of networked control systems (NCSs), where sensors, actuators and controllers are distributed and interconnected by a common communication network. Multiple distributed communication delays, as well as multiple inputs and multiple outputs (MIMO), are considered in the modelling algorithm. In addition, the asynchronous sampling mechanisms of distributed sensors are characterized to obtain the actual time delays between sensors and the controller. Due to the characteristics of the network architecture, piecewise-constant plant inputs are assumed, and discrete-time models of plant and controller dynamics are adopted to analyse the stability and performance of a closed-loop NCS. The analysis result is used to verify the stability and performance of an NCS whose controller was designed without considering the impact of multiple time delays. In addition, the proposed NCS model is used as a foundation for optimal controller design. The proposed control algorithm utilizes the information of delayed signals and improves the control performance of a control system encountering distributed communication delays. Several simulation studies are provided to verify the control performance of the proposed controller design.
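The discrete-time, piecewise-constant-input setting described above can be illustrated with a plant whose input arrives after a network-induced delay, i.e. x[k+1] = A x[k] + B u[k−d]. This is a generic sketch of that model class, not the paper's algorithm; the matrices and delay value are hypothetical.

```python
import numpy as np

# Minimal sketch of a discrete-time plant with a network-induced input
# delay of d samples: x[k+1] = A x[k] + B u[k-d], with u = 0 before k = 0.
# A delay-aware controller would augment the state with the d most recent
# inputs so the delayed model becomes an ordinary delay-free one.
def simulate_delayed(A, B, d, u_seq, x0):
    """Return the state trajectory [x[0], x[1], ..., x[len(u_seq)]]."""
    x = np.array(x0, dtype=float)
    history = [x.copy()]
    for k in range(len(u_seq)):
        # The input applied at step k was sent d sample periods ago.
        u = u_seq[k - d] if k >= d else 0.0
        x = A @ x + B.flatten() * u
        history.append(x.copy())
    return history
```

With d = 1, an input sent at step 0 first affects the state at step 2, which is exactly the effect a delay-unaware controller design would miss.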
A prototype hardware/software system has been developed and applied to the control of single-wafer chemical-mechanical polishing (CMP) processes. The control methodology consists of experimental design to build response surface and linearized control models of the process, and the use of feedback control to change recipe parameters (machine settings) on a lot-by-lot basis. Acceptable regression models for a single-wafer polishing tool and process were constructed for average removal rate and nonuniformity, which are calculated based on film thickness measurements at nine points on 8-inch blanket oxide wafers. For control, an exponentially weighted moving average model adaptation strategy was used, coupled to multivariate recipe generation incorporating user weights on the inputs and outputs, bounds on the input ranges, and discrete quantization in the machine settings. We found that this strategy successfully compensated for substantial drift in the uncontrolled tool's removal rate. It was also found that the equipment model generated during the experimental design was surprisingly robust; the same model was effective across more than one CMP tool, and over a several-month period. We believe that the same methodology is applicable to patterned oxide wafers; work is in progress to demonstrate patterned wafer control, to improve the control, communication, and diagnosis components of the system, and to integrate real-time information into the run-by-run control of the process.
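The exponentially weighted moving average (EWMA) adaptation strategy described above can be sketched for a single-input case: after each lot, the model offset is updated from the measurement, and the next recipe is solved from the adapted model. This is a generic one-variable illustration of EWMA run-to-run control, not the paper's multivariate, weighted, bounded implementation; the linear model y = intercept + gain * u and all numeric values are hypothetical.

```python
# Minimal sketch of EWMA run-to-run control against process drift,
# assuming a linear process model y = intercept + gain * u.
def ewma_r2r(target, gain, intercept0, lam, measurements, u0):
    """Return the recipe sequence chosen by EWMA intercept adaptation.

    lam is the EWMA weight (0 < lam <= 1): higher values react faster
    to drift but pass more measurement noise into the recipe.
    """
    intercept = intercept0
    u = u0
    recipes = [u]
    for y in measurements:
        # Blend the newly observed model offset with the prior estimate.
        intercept = lam * (y - gain * u) + (1 - lam) * intercept
        # Solve the adapted model for the next machine setting.
        u = (target - intercept) / gain
        recipes.append(u)
    return recipes
```

For example, if the tool's removal rate comes in below the model's prediction for a lot, the intercept estimate drops and the next recipe is pushed up to compensate, which is how the strategy tracks the drift reported in the abstract.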