Monday, September 12, 2011

Paper Reading #8- Gesture search: a tool for fast mobile data access

Title: Gesture search: a tool for fast mobile data access
Reference Information:
Yang Li. "Gesture Search: A Tool for Fast Mobile Data Access." Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology (UIST '10). ACM, New York, NY, 2010. ISBN: 978-1-4503-0271-5.
Author Bio:
Yang Li- Yang is a Senior Research Scientist at Google. Before joining Google's research team, Yang was a Research Associate in Computer Science & Engineering at the University of Washington and helped found DUB (Design:Use:Build), a cross-campus HCI community. He earned a Ph.D. degree in Computer Science from the Chinese Academy of Sciences, and then did postdoctoral research in EECS at the University of California at Berkeley.
Summary:
  • Hypothesis: "Intuitively, GUI-oriented touch input should have less variation in its trajectory than gestures."
  • Methods: Users draw gestures on the phone's screen to invoke certain actions. To erase the last gesture, the user swipes from right to left along the bottom of the screen, where a scaled-down version of the drawn gesture is displayed; to erase all current gestures, the user swipes from left to right. Gesture Search also incorporates "gesture history" into its search algorithm. One main issue with Gesture Search is separating casual touch input events from gestures. Yang distinguishes the two by first receiving the touch input, considering the possible touch events as well as gesture interpretations, and then deciding how to handle it. If the movement is determined to be a gesture, the displayed input transitions from translucent yellow (meaning a decision was still pending on whether the input was a gesture or an ordinary touch event) to a brighter yellow, overlaying the phone's screen and halting all "behind" actions and events. To inform this disambiguation, Yang recruited volunteers with Android phones and studied how they interact with their devices, using that data to tune how the algorithm tells touch inputs apart from gesture inputs.
  • Results: After collecting the data, Yang designed the disambiguation algorithm. For gestures that mix "drawing" and "tapping" (e.g., writing out a "j" gesture), the algorithm holds a "tap" input that arrives first and allows a buffer time to see if the user is going to "draw" any more; if the user does not provide any more input during that time, the tap is treated as an ordinary touch event. Yang also discovered that touch events aimed at the GUI had narrower "areas of touch" and were "more square" than gesture inputs. Participants who tested Gesture Search then rated it on a 1-5 Likert scale (one being the worst, five being the best). Users predominantly rated Gesture Search a 4 (generally happy and seeing use for the application). Participants also offered comments such as "did not interfere with normal touch inputs", "I did not see Gesture Search on my home screen when I didn't need it", and "I got to what I wanted to a lot faster."
  • Contents: Yang wrote this paper to present his invention, Gesture Search: a tool that lets users quickly access mobile phone data, such as applications and contacts, by drawing gestures. Gesture Search seamlessly integrates gesture-based interaction and search for fast mobile data access, and it demonstrates a novel way of coupling gestures with standard GUI interaction. Yang studied how typical users interact with touchscreen phones and polled volunteers for feedback on the software. He found there is some room for improvement, but overall users were satisfied with the software.
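The two disambiguation ideas described above (a buffer period after a tap, and bounding-box shape/size cues) can be sketched roughly in code. This is a hypothetical illustration of the general heuristics, not Yang's actual implementation; the threshold constants and function names here are my own assumptions.

```python
# Hypothetical thresholds -- the paper's actual tuned values are not given here.
TAP_BUFFER_SECONDS = 0.4   # after a tap, wait this long for more strokes
SQUARENESS_CUTOFF = 0.6    # boxes close to square suggest a GUI touch
MIN_GESTURE_SPAN = 50      # pixels; gestures tend to sweep a wider area


def should_wait_for_more_strokes(last_input_time, now):
    """After a tap, hold the event briefly in case the user keeps drawing
    (e.g., the dot finishing a 'j' gesture)."""
    return now - last_input_time < TAP_BUFFER_SECONDS


def bounding_box(points):
    """Return (width, height) of the axis-aligned box around the trajectory."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return max(xs) - min(xs), max(ys) - min(ys)


def classify_touch(points):
    """Classify a completed trajectory as 'touch' or 'gesture'.

    Heuristic sketch: GUI touches tend to have narrow, roughly square
    bounding boxes, while gesture strokes cover a larger, elongated area.
    """
    w, h = bounding_box(points)
    span = max(w, h)
    if span < MIN_GESTURE_SPAN:
        return "touch"                      # tiny area: almost surely a tap
    squareness = min(w, h) / max(w, h)      # 1.0 = perfectly square box
    if squareness > SQUARENESS_CUTOFF and span < 2 * MIN_GESTURE_SPAN:
        return "touch"                      # compact and square: GUI press
    return "gesture"
```

A small tap cluster of points would be classified as "touch", while a long stroke sweeping down the screen would come out as "gesture"; in the real system, anything classified as a gesture would trigger the yellow overlay described above.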
Discussion:
I thought this was a weird idea actually. I don't think I would personally use something like this unless I was a user who had over 100 applications or something and needed to be able to access any one of them relatively quickly. The idea is genius though. The algorithms and designs for implementing this were also really creative and well thought-out. This kind of imagination is definitely a springboard for other researchers. This kind of idea could also be carried over into other fields and be used on other devices such as PCs (if they ever got touch screen desktop PCs....just saying). For example, if you wanted to load up a certain .exe from a directory in your system and didn't have a shortcut on your desktop (or the opposite- you wanted to load up an .exe but had a TON of shortcuts on your desktop and didn't want to spend minutes looking for it), you could just gesture and the .exe would load up. I thought Yang definitely achieved his goals. He created what he set out to and received overwhelmingly positive feedback in return. Some users felt there was room for improvement, but those issues (as explained in the paper) could be easily addressed.
