Abstract. A good state-time quantized symbolic abstraction of an already input-quantized control system should satisfy three conditions: proximity, soundness and completeness. Existing approaches to symbolic abstraction of unstable systems satisfy proximity and soundness but not completeness. Instability is an impediment to constructing fully complete state-time quantized symbolic models for unstable systems with bounded, quantized inputs, even under supervisory feedback. In this paper, we therefore parametrize the completeness of the symbolic model through the notion of "Trimmed-Input Approximate Bisimulation" introduced here. The degree of completeness is specified by a parameter called the "trimming" of the set of input trajectories. We then present a procedure for constructing state-time quantized symbolic models that are near-complete, in addition to being sound and proximate, with respect to the time-quantized models.
When a user learns to use a new device, her understanding of it evolves. Analysing this learning involves progressively comparing the evolving user models against the target model of the device, which requires determining the behavioral proximity between them. To quantify the gap between a user model and a target model, we introduce an edit distance metric that measures their behavioral proximity using a bisimulation-based equivalence relation. We define the edit distance as the minimum number of edges, and states with incident edges, that must be deleted from and/or added to a user model to make it bisimilar to the target model. We propose a heuristic algorithm to compute the edit distance between two models and apply it to experimental data, computing edit distances between target and user models. The data is organised into two experiments depending on the device the user interacted with: (a) a simple device resembling a vending machine and (b) a close-to-real-world vehicle transmission model. The results validate the proposed metric: the edit distance converges with progressive user learning, increases under erroneous learning, and remains unchanged when no learning occurs.
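The bisimulation-based equivalence underlying the proposed metric can be illustrated for finite, deterministic device models. Below is a minimal sketch (not the paper's algorithm) of a bisimilarity check between two such models, each encoded as a hypothetical `dict` mapping a state to its action-labelled successors; the edit distance would then count the smallest set of edge/state additions and deletions after which this check succeeds.

```python
def bisimilar(m1, s1, m2, s2):
    """Check bisimilarity of two deterministic labeled transition systems.

    m1, m2: dict mapping state -> {action: next_state} (illustrative encoding).
    s1, s2: the initial states to compare.
    """
    seen = set()
    stack = [(s1, s2)]
    while stack:
        a, b = stack.pop()
        if (a, b) in seen:
            continue
        seen.add((a, b))
        # Bisimilar states must enable exactly the same actions...
        if set(m1.get(a, {})) != set(m2.get(b, {})):
            return False
        # ...and their successors under each action must again be bisimilar.
        for act, nxt in m1.get(a, {}).items():
            stack.append((nxt, m2.get(b, {})[act]))
    return True


# Toy vending-machine models (hypothetical states/actions):
target = {"q0": {"coin": "q1"}, "q1": {"coffee": "q0"}}
user_ok = {"p0": {"coin": "p1"}, "p1": {"coffee": "p0"}}
user_bad = {"p0": {"coin": "p1"}, "p1": {}}  # user never learned "coffee"

print(bisimilar(target, "q0", user_ok, "p0"))   # equivalent models
print(bisimilar(target, "q0", user_bad, "p0"))  # one edge short
```

For the `user_bad` model, one edge insertion (`p1 --coffee--> p0`) restores bisimilarity with the target, giving an edit distance of 1 under the definition above.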