Proceedings of the 3rd Annual ACM SIGGRAPH Symposium on User Interface Software and Technology (UIST '90), 1990
DOI: 10.1145/97924.97938

Integrating gesture and snapping into a user interface toolkit

Abstract: This paper describes Artkit (the Arizona Retargetable Toolkit), an extensible object-oriented user interface toolkit. Artkit provides an extensible input model designed to support a wider range of interaction techniques than conventional user interface toolkits. In particular, the system supports the implementation of interaction objects using dragging, snapping (or gravity fields), and gesture (or handwriting) inputs. Because these techniques are supported directly by the toolkit, it is also possible to …
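The "snapping (or gravity fields)" technique the abstract mentions can be illustrated with a minimal sketch, assuming a simple model in which a dragged point is attracted to the nearest snap target within a fixed gravity radius. The function name and radius value are illustrative, not taken from ARTKIT itself.

```python
import math

GRAVITY_RADIUS = 12.0  # pixels within which a target attracts the cursor (assumed value)

def snap(x, y, targets):
    """Return the nearest target within GRAVITY_RADIUS, else the raw point."""
    best, best_dist = None, GRAVITY_RADIUS
    for tx, ty in targets:
        d = math.hypot(tx - x, ty - y)
        if d < best_dist:
            best, best_dist = (tx, ty), d
    return best if best is not None else (x, y)

# A cursor near a grid point is pulled onto it; a distant cursor is unchanged.
print(snap(103, 98, [(100, 100), (200, 100)]))  # → (100, 100)
print(snap(150, 150, [(100, 100)]))             # → (150, 150)
```

A real toolkit would apply such a function continuously during dragging, so feedback follows the snapped position rather than the raw pointer.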

Cited by 53 publications (21 citation statements)
References 19 publications (12 reference statements)
“…In the same years, Henry et al. [56] used FSMs to solve the problem of modelling non-atomic actions on the UI, such as the drag-and-drop technique. Indeed, this kind of interaction is particularly tedious for developers, since they need to track the event sequence in order to describe the temporal relationship of the user's actions, which is close to the definition of a gesture.…”
Section: 2
confidence: 99%
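The FSM-based modelling of a non-atomic interaction that this statement attributes to Henry et al. can be sketched as a small state machine driven by an event sequence; the state names, event labels, and class structure below are invented for illustration, not taken from the cited work.

```python
IDLE, DRAGGING = "idle", "dragging"

class DragFSM:
    """Toy finite-state machine for drag-and-drop (hypothetical names)."""

    def __init__(self):
        self.state = IDLE
        self.drops = []

    def handle(self, event, pos):
        if self.state == IDLE and event == "press":
            self.state = DRAGGING        # a press begins the drag
        elif self.state == DRAGGING and event == "move":
            pass                         # intermediate motion: stay in DRAGGING
        elif self.state == DRAGGING and event == "release":
            self.drops.append(pos)       # a release completes the drop
            self.state = IDLE

fsm = DragFSM()
for ev, pos in [("press", (0, 0)), ("move", (5, 5)), ("release", (9, 9))]:
    fsm.handle(ev, pos)
print(fsm.state, fsm.drops)  # → idle [(9, 9)]
```

The point of the FSM formulation is that the temporal structure (press, then moves, then release) is captured declaratively in the transition table instead of being scattered across ad hoc event handlers.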
“…This functionality allows for "uncluttering" of the display, removing GUI widgets and preserving space for the key focal point of any image review workstation: the images. Gesture recognition itself has been explored in other application contexts, including handwriting recognition, personal digital assistants (PDAs), web browser manipulation, computer drawing programs, and computer games [40]–[44]. The set of gestures is configurable, such that a gesture from a training set can be dynamically assigned to one of a set of common image manipulation functions (e.g., window-level setting, rotation, image layout settings, measurement, annotation), allowing the user to tailor the display and the functionality of the application to his/her preferences.…”
Section: Application Programmer's Interface
confidence: 99%
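The configurable gesture-to-command binding this statement describes can be sketched as a user-editable lookup table that dispatches a recognized gesture label to a manipulation function. All names here are hypothetical and not drawn from the cited system.

```python
def rotate(image):
    return f"rotated({image})"

def window_level(image):
    return f"window-leveled({image})"

# User-configurable binding: gesture label -> image manipulation function.
bindings = {"circle": rotate, "zigzag": window_level}

def dispatch(gesture, image):
    """Apply the function bound to a recognized gesture; ignore unknown gestures."""
    handler = bindings.get(gesture)
    return handler(image) if handler else image

print(dispatch("circle", "ct_slice"))  # → rotated(ct_slice)
```

Because the table is ordinary data, reassigning a gesture to a different function is a one-line change, which is what makes the mapping tailorable per user.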
“…They also describe input and event handling in a cleaner, more general, and more extensible way than traditional toolkits [26,15]. Even though they support Post-WIMP techniques such as simple gesture recognition at some level, they are only aware of a limited set of input devices and require significant modifications to handle any new interaction paradigm.…”
Section: Advanced GUI Toolkits
confidence: 99%