The synthesis of four-bar mechanisms is a well-understood, classical design problem. The original systematic work in this field began in the late 1800s, and it remains an active area of research. The classical theory, however, is limited in its application to certain real-world problems by the small number of precision points it admits and by the unspecified order of those points. This paper presents a numerical technique for four-bar mechanism synthesis, based on genetic algorithms, that removes these limitations by relaxing the accuracy required of the precision points.
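The abstract's core idea of relaxing precision-point accuracy can be sketched as a tolerance band in a genetic algorithm's fitness function. The sketch below is a hypothetical, minimal illustration: a real implementation would evaluate the coupler-point trajectory of a candidate four-bar linkage, whereas here a simple surrogate curve stands in so the example stays self-contained. All names, parameters, and the surrogate `trace` function are assumptions, not the paper's method.

```python
import random

random.seed(0)

# Desired precision points and the relaxed accuracy band.
TARGETS = [(0.0, 1.0), (0.5, 1.2), (1.0, 1.0)]
TOL = 0.05

def trace(genes, t):
    """Surrogate 'coupler curve': genes (a, b, c) define a parabola.
    A real four-bar evaluation would replace this function."""
    a, b, c = genes
    return (t, a * t * t + b * t + c)

def fitness(genes):
    # Penalize only deviation beyond the tolerance band: this is the
    # "relaxed precision point" idea, where points need not be hit exactly.
    err = 0.0
    for (tx, ty) in TARGETS:
        _, y = trace(genes, tx)
        err += max(0.0, abs(y - ty) - TOL) ** 2
    return err

def evolve(pop_size=60, gens=200, mut=0.1):
    # Standard generational GA: rank selection, blend crossover,
    # Gaussian mutation.
    pop = [[random.uniform(-2, 2) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            p1, p2 = random.sample(survivors, 2)
            children.append([(g1 + g2) / 2 + random.gauss(0, mut)
                             for g1, g2 in zip(p1, p2)])
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(fitness(best))  # residual error outside the tolerance band
```

Because any candidate whose curve stays within the band at every target point scores a perfect zero, the GA searches a *region* of acceptable designs rather than chasing exact interpolation, which is what makes the relaxation tractable.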
This paper describes a system that provides a full-body, digitally programmed kinesthetic display for virtual reality applications. The design begins with a full six-axis motion platform for each foot of the operator. To this base design, kneeling boards are added to support rolling, kneeling, and prone postures, and a vertical-feature presentation mechanism is appended that allows the operator to interact with realistic walls, windows, doors, and other vertical obstacles in the movement space. The design of this complex device was driven by a variety of requirements, including system modularity, enhanced safety, and the need to emulate all significant physical activities.
Dynamic System Representation of Basic and Non-Linear in Parameters Oscillatory Motion Gestures
Charles J. Cohen, Lynn Conway, Dan Koditschek, and Gerald P. Roston
Cybernet Systems Corporation, 727 Airport Blvd., Ann Arbor, MI 48108, USA

ABSTRACT: We present a system for the generation and recognition of oscillatory gestures. Inspired by gestures used in two representative human-to-human control areas, we consider a set of oscillatory (circular) motions and refine from them a lexicon of 24 gestures. Each gesture is modeled as a dynamic system with added geometric constraints, allowing real-time gesture recognition with a small amount of processing time and memory. The gestures are used to control a pan-tilt camera neck. The gesture lexicon is then enhanced to include non-linear-in-parameters ("come here") gesture representations. An enhancement is suggested that would enable the system to be trained to recognize previously unidentified yet consistent human-generated oscillatory motion gestures.

Copyright 1997 IEEE. Reprinted from Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, 'Computational Cybernetics and Simulation', 1997, Volume 5, pages 4513-4518. NOTE: At the time of publication, author Daniel Koditschek was affiliated with the University of Michigan; he is currently a faculty member in the School of Engineering at the University of Pennsylvania. This conference paper is available at ScholarlyCommons: http://repository.upenn.edu/ese_papers/348
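The abstract's "gesture modeled as a dynamic system" idea can be illustrated with a toy linear-in-parameters oscillator. This is a hypothetical sketch, not the paper's exact model: it assumes each gesture obeys x'' = -θ1·x - θ2·x', estimates (θ1, θ2) from an observed trajectory by least squares, and classifies against a tiny two-entry lexicon. The gesture names, parameter values, and integration scheme are all assumptions for illustration.

```python
import numpy as np

DT = 0.01  # sample period (s)

def simulate(theta1, theta2, steps=500, x0=1.0, v0=0.0):
    """Generate a trajectory of x'' = -theta1*x - theta2*x'
    with semi-implicit Euler integration."""
    xs, vs = [x0], [v0]
    for _ in range(steps - 1):
        a = -theta1 * xs[-1] - theta2 * vs[-1]
        vs.append(vs[-1] + a * DT)
        xs.append(xs[-1] + vs[-1] * DT)
    return np.array(xs), np.array(vs)

def estimate(xs, vs):
    """Least-squares fit of (theta1, theta2): the model is linear in its
    parameters, so acceleration = A @ theta with A = [-x, -x']."""
    acc = np.diff(vs) / DT
    A = np.column_stack([-xs[:-1], -vs[:-1]])
    theta, *_ = np.linalg.lstsq(A, acc, rcond=None)
    return theta

# Hypothetical two-gesture lexicon: (theta1, theta2) per gesture.
LEXICON = {"slow circle": (4.0, 0.1), "fast circle": (25.0, 0.1)}

def recognize(xs, vs):
    theta = estimate(xs, vs)
    return min(LEXICON, key=lambda name: np.linalg.norm(theta - LEXICON[name]))

xs, vs = simulate(25.0, 0.1)  # generate a "fast circle" trajectory
print(recognize(xs, vs))       # prints "fast circle"
```

The linear-in-parameters form is what keeps recognition cheap: parameter estimation is a single least-squares solve, consistent with the abstract's claim of real-time recognition with little processing time and memory. Non-linear-in-parameters gestures, as the paper notes, require richer representations than this sketch.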