Reference Information:
Jaime Ruiz, Yang Li, and Edward Lank. "User-defined motion gestures for mobile interaction." CHI '11: Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems. ACM, New York, NY, USA, 2011. ISBN: 978-1-4503-0228-9.
Author Bios:
Jaime Ruiz: I'm a fifth-year doctoral student in the Human-Computer Interaction Lab in the Cheriton School of Computer Science at the University of Waterloo. My advisor is Dr. Edward Lank.
Yang Li: Yang is a Senior Research Scientist at Google. Before joining Google's research team, Yang was a Research Associate in Computer Science & Engineering at the University of Washington and helped found DUB (Design:Use:Build), a cross-campus HCI community.
Edward Lank: I am an Assistant Professor in the David R. Cheriton School of Computer Science at the University of Waterloo.
Summary:
- Hypothesis: If smartphones contain sensors that can track the device's motion and orientation in 3D, then there exists a set of motion gestures that map naturally to common smartphone commands.
- Methods: The authors conducted a "guessability study" in which participants were shown the effect of a task and asked to invent the motion gesture they felt mapped to it most naturally. Participants were told to treat the smartphone as a "magic black brick" with no recognition technology, so that the limitations of existing recognizers would not influence the gestures they created; every gesture was designed from scratch. Sessions were recorded on audio and video, and logging software on the phone captured sensor data to preserve a record of what the participant was actually trying to do with the device. All participants had prior, relevant smartphone experience.
- Results: Participants most often drew on metaphors from outside existing smartphone conventions; for example, the most popular gesture for returning to the home screen was shaking the phone, as one would erase an Etch A Sketch. Overall there was broad agreement among participants, both in the gestures they chose and in the reasoning behind them, and the think-aloud audio/video recordings let the authors trace each participant's thought process while inventing a gesture. Tasks that participants considered opposites were given the same motion performed in the opposite direction; zooming in and out on a map, for instance, mapped to moving the phone closer to or farther from the body. The authors then classified the gestures into a taxonomy describing what kind of motion was used and how the phone was treated, and calculated an "agreement score" for each task to quantify how much consensus there was on its gesture (a sketch of this calculation follows this list).
- Content: The authors set out to create a more natural, easy-to-use set of motion gestures for smartphones, as opposed to relying solely on touch-screen gestures. To build such a set, they recruited smartphone users who each invented their own gestures for a list of common tasks. Each proposed gesture was then compared against every other participant's proposal for the same task to measure how much consensus it attracted, and the kinds of motion involved and how the phone was manipulated were analyzed to identify the general patterns in the user-defined set.
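The paper doesn't need its formula repeated to follow this summary, but the agreement scores mentioned above adapt the guessability methodology of Wobbrock et al.: for each task, agreement is the sum, over groups of identical gesture proposals, of the squared fraction of participants in each group. A score of 1 means every participant proposed the same gesture; the lower bound of 1/n for n participants means no two proposals matched. Below is a minimal Python sketch of that calculation; the task name, gesture labels, and participant counts are hypothetical and purely illustrative.

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement for one task: sum over groups of identical gesture
    proposals of (group size / total proposals) squared."""
    total = len(proposals)
    groups = Counter(proposals)                 # group identical proposals
    return sum((n / total) ** 2 for n in groups.values())

# Hypothetical data: 20 participants propose a gesture for "go to home screen".
home_screen = ["shake"] * 12 + ["flip face down"] * 5 + ["tap back of phone"] * 3
print(agreement_score(home_screen))             # 0.445 -> moderate consensus

# Study-wide agreement is simply the mean of the per-task scores.
def overall_agreement(per_task_proposals):
    scores = [agreement_score(p) for p in per_task_proposals.values()]
    return sum(scores) / len(scores)
```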
I love studies like this where there is no field-specific jargon, no technical processes, and no high-level vernacular to learn, and the end goal is straightforward: the authors wanted a simple, natural set of motion gestures for interacting with a smartphone. What better way to build one than to let a sample of smartphone users create it themselves? I believe the authors achieved their goal, and I would definitely consider using a smartphone with motion-based capabilities if the motions were based on studies like this one. I enjoyed reading this paper.