GEsture Clustering toolKit (GECKo)
Lisa Anthony, University of Maryland, Baltimore County†
Radu-Daniel Vatavu, University Stefan cel Mare of Suceava
Jacob O. Wobbrock, University of Washington [contact]
†Currently at the University of Florida
Download
Current Version: 1.0.5-2016.04
Windows executable: EXE
GECKo source code: C#
Multistroke gesture logs: XML
Paper: PDF
Requires the Microsoft .NET Framework 4.0.
This software is distributed under the New BSD License agreement.
About
The GEsture Clustering toolKit (GECKo) makes it easy to study the manner in which users articulate stroke gestures. GECKo clusters and visualizes stroke gestures according to stroke number, order, and direction, enabling interactive gesture playback and auditing of the clustering results. GECKo also reports within- and between-subject agreement rates after clustering. GECKo will be useful to gesture researchers and developers who wish to better understand how users make gestures, especially when complex multistroke gestures are involved. The Mixed Multistroke Gesture (MMG) dataset, collected during the research on the $N multistroke recognizer, is included for exploration with GECKo.
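To make the clustering and agreement ideas concrete, here is a minimal C# sketch of the kind of analysis GECKo automates. It is not GECKo's implementation: it assumes the XML logs follow a $N-style layout (a <Gesture> element with a Name attribute, containing <Stroke> elements of <Point X="..." Y="..." T="..."/> points), it derives a simple articulation key from stroke count, stroke order, and a quantized per-stroke direction, and it computes the agreement rate formula of Vatavu and Wobbrock (CHI '15). The eight-way direction binning and the key format are illustrative choices, not GECKo's definitions.

```csharp
// GeckoSketch.cs -- a minimal sketch of articulation clustering and agreement rates.
// This is NOT the GECKo implementation; it assumes $N-style XML gesture logs.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Xml.Linq;

class GeckoSketch
{
    // Quantize a stroke's overall direction (first point -> last point) into 8 compass bins.
    static int DirectionBin(XElement stroke)
    {
        var pts = stroke.Elements("Point").ToList();
        double dx = (double)pts.Last().Attribute("X") - (double)pts.First().Attribute("X");
        double dy = (double)pts.Last().Attribute("Y") - (double)pts.First().Attribute("Y");
        double angle = Math.Atan2(dy, dx);                 // radians in [-pi, +pi]
        return (int)Math.Round(4.0 * angle / Math.PI) & 7; // eight 45-degree bins
    }

    // Articulation key: stroke count plus the ordered sequence of stroke directions,
    // so two gestures cluster together only if stroke number, order, and direction match.
    static string ArticulationKey(XElement gesture)
    {
        var strokes = gesture.Elements("Stroke").ToList();
        return strokes.Count + ":" + string.Join(",", strokes.Select(DirectionBin));
    }

    // Agreement rate (Vatavu & Wobbrock, CHI '15):
    //   AR = sum_i |P_i|(|P_i| - 1) / (|P|(|P| - 1)),
    // where P is the set of articulations and the P_i are groups of identical articulations.
    static double AgreementRate(List<string> keys)
    {
        int n = keys.Count;
        if (n < 2) return 1.0; // degenerate case: a single articulation agrees with itself
        double agreeingPairs = keys.GroupBy(k => k)
                                   .Sum(g => (double)g.Count() * (g.Count() - 1));
        return agreeingPairs / (n * (n - 1));
    }

    static void Main(string[] args)
    {
        // Each command-line argument is one XML log file; gestures are pooled by their Name attribute.
        var byType = args.Select(path => XDocument.Load(path))
                         .SelectMany(doc => doc.Descendants("Gesture"))
                         .GroupBy(g => (string)g.Attribute("Name"));

        foreach (var type in byType)
        {
            var keys = type.Select(ArticulationKey).ToList();
            Console.WriteLine("{0}: {1} samples, {2} clusters, AR = {3:F3}",
                type.Key, keys.Count, keys.Distinct().Count(), AgreementRate(keys));
        }
    }
}
```

Compiled with the standard C# compiler (e.g., csc GeckoSketch.cs) and given one or more log files on the command line, the sketch prints, per gesture type, the number of samples, the number of articulation clusters, and a pooled agreement rate. Separating within- and between-subject rates, as GECKo does, would additionally partition the articulations by a subject identifier in the logs.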
Video
Our Gesture Software Projects
- $Q: Super-quick multistroke recognizer - optimized for low-power mobiles and wearables
- $P+: Point-cloud multistroke recognizer - optimized for people with low vision
- $P: Point-cloud multistroke recognizer - for recognizing multistroke gestures as point-clouds
- $N: Multistroke recognizer - for recognizing simple multistroke gestures
- $1: Unistroke recognizer - for recognizing unistroke gestures
- AGATe: AGreement Analysis Toolkit - for calculating agreement in gesture-elicitation studies
- GHoST: Gesture HeatmapS Toolkit - for visualizing variation in gesture articulation
- GREAT: Gesture RElative Accuracy Toolkit - for measuring variation in gesture articulation
- GECKo: GEsture Clustering toolKit - for clustering gestures and calculating agreement
Our Gesture Publications
- Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2018). $Q: A super-quick, articulation-invariant stroke-gesture recognizer for low-resource devices. Proceedings of the ACM Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '18). Barcelona, Spain (September 3-6, 2018). New York: ACM Press. Article No. 23.
- Vatavu, R.-D. (2017). Improving gesture recognition accuracy on touch screens for users with low vision. Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '17). Denver, Colorado (May 6-11, 2017). New York: ACM Press, pp. 4667-4679.
- Vatavu, R.-D. and Wobbrock, J.O. (2016). Between-subjects elicitation studies: Formalization and tool support. Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '16). San Jose, California (May 7-12, 2016). New York: ACM Press, pp. 3390-3402.
- Vatavu, R.-D. and Wobbrock, J.O. (2015). Formalizing agreement analysis for elicitation studies: New measures, significance test, and toolkit. Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '15). Seoul, Korea (April 18-23, 2015). New York: ACM Press, pp. 1325-1334.
- Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2014). Gesture heatmaps: Understanding gesture performance with colorful visualizations. Proceedings of the ACM International Conference on Multimodal Interfaces (ICMI '14). Istanbul, Turkey (November 12-16, 2014). New York: ACM Press, pp. 172-179.
- Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2013). Relative accuracy measures for stroke gestures. Proceedings of the ACM International Conference on Multimodal Interfaces (ICMI '13). Sydney, Australia (December 9-13, 2013). New York: ACM Press, pp. 279-286.
- Anthony, L., Vatavu, R.-D. and Wobbrock, J.O. (2013). Understanding the consistency of users' pen and finger stroke gesture articulation. Proceedings of Graphics Interface (GI '13). Regina, Saskatchewan (May 29-31, 2013). Toronto, Ontario: Canadian Information Processing Society, pp. 87-94.
- Vatavu, R.-D., Anthony, L. and Wobbrock, J.O. (2012). Gestures as point clouds: A $P recognizer for user interface prototypes. Proceedings of the ACM International Conference on Multimodal Interfaces (ICMI '12). Santa Monica, California (October 22-26, 2012). New York: ACM Press, pp. 273-280.
- Anthony, L. and Wobbrock, J.O. (2012). $N-Protractor: A fast and accurate multistroke recognizer. Proceedings of Graphics Interface (GI '12). Toronto, Ontario (May 28-30, 2012). Toronto, Ontario: Canadian Information Processing Society, pp. 117-120.
- Anthony, L. and Wobbrock, J.O. (2010). A lightweight multistroke recognizer for user interface prototypes. Proceedings of Graphics Interface (GI '10). Ottawa, Ontario (May 31-June 2, 2010). Toronto, Ontario: Canadian Information Processing Society, pp. 245-252.
- Wobbrock, J.O., Wilson, A.D. and Li, Y. (2007). Gestures without libraries, toolkits or training: A $1 recognizer for user interface prototypes. Proceedings of the ACM Symposium on User Interface Software and Technology (UIST '07). Newport, Rhode Island (October 7-10, 2007). New York: ACM Press, pp. 159-168.
Copyright © 2013-2019 Jacob O. Wobbrock. All rights reserved.
Last updated June 30, 2019.