Yang Li
first last name @ acm dot org, Google Research
Google Inc.
1600 Amphitheatre Parkway
Mountain View, CA 94043 USA
(650) 485-1699

My research in Human-Computer Interaction focuses on novel tools and methods for creating mobile interaction behaviors, including emerging input modalities (such as gestures and cameras), cross-device interaction, and predictive user interfaces. My team develops software tools and recognition methods by drawing insights from user behaviors and current practice, and by leveraging techniques such as machine learning, computer vision, and crowdsourcing to make complex tasks simple and intuitive. Our work has made a real-world impact, benefiting millions of end users and developers.

Bio · Yang is a Senior Research Scientist at Google, and an affiliate faculty member in Computer Science & Engineering at the University of Washington. He earned a Ph.D. degree in Computer Science from the Chinese Academy of Sciences, and conducted postdoctoral research in EECS at the University of California at Berkeley.

Recent Papers · Curriculum Vitae · Videos
2014

Optimistic Programming of Touch Interaction

Yang Li, Hao Lu, Haimo Zhang. To appear in TOCHI: ACM Transactions on Computer-Human Interaction, 2014.

Integrated tool and inference support that allows developers to easily create touch behaviors in their apps.

Reflection: Enabling Event Prediction As an On-Device Service for Mobile Interaction

Yang Li. To appear at UIST 2014: ACM Symposium on User Interface Software and Technology.

An on-device infrastructure that provides event prediction as a service to mobile applications.

Detecting Tapping Motion on the Side of Mobile Devices By Probabilistically Combining Hand Postures

William McGrath, Yang Li. To appear at UIST 2014: ACM Symposium on User Interface Software and Technology.

Video        

A method for detecting finger taps on the different sides of a smartphone, using the built-in motion sensors of the device.

Gesturemote: interacting with remote displays through touch gestures

Hao Lu, Matei Negulescu, Yang Li. AVI 2014: International Working Conference on Advanced Visual Interfaces.

A technique for interacting with remote displays through touch gestures on a handheld touch surface, which supports a wide range of interaction behaviors, from low pixel-level interaction such as pointing, to medium-level interaction such as structured navigation, to high-level interaction such as shortcuts.

Gesture Script: Recognizing Gestures and their Structure using Rendering Scripts and Interactively Trained Parts

Hao Lu, James Fogarty, Yang Li. To appear at CHI 2014: ACM Conference on Human Factors in Computing Systems.

Video         Best Paper Honorable Mention Award

Recognizing gestures and their properties using examples and parts-based scripting.

InkAnchor: Enhancing Informal Ink-Based Note Taking on Touchscreen Mobile Phones

Yi Ren, Yang Li, Edward Lank. To appear at CHI 2014: ACM Conference on Human Factors in Computing Systems.

Video

Presents a tool for informal ink-based note-taking on touchscreen phones by sketching.

GestKeyboard: Enabling Gesture-Based Interaction on Ordinary Physical Keyboard

Haimo Zhang, Yang Li. To appear at CHI 2014: ACM Conference on Human Factors in Computing Systems.

Video

Enables gesturing on an ordinary physical keyboard.

Hierarchical Route Maps for Efficient Navigation

Fangzhou Wang, Yang Li, Daisuke Sakamoto, Takeo Igarashi. IUI 2014: International Conference on Intelligent User Interfaces.

Video         Best Paper Award

Interactive optimization of map route visualization.

Teaching Motion Gestures via Recognizer Feedback

Ankit Kamal, Yang Li, Edward Lank. To appear at IUI 2014: International Conference on Intelligent User Interfaces.

Explores mechanisms for teaching motion gestures to end users.
2013

CrowdLearner: Rapidly Creating Mobile Recognizers Using Crowdsourcing

Shahriyar Amini, Yang Li. UIST 2013: ACM Symposium on User Interface Software and Technology.

Presents a crowdsourcing platform for automatically generating recognizers that leverage built-in sensors on mobile devices, e.g., creating a usable stroke-gesture recognizer in a few hours for about $10.

Open Project: A Lightweight Framework for Remote Sharing of Mobile Applications

Matei Negulescu, Yang Li. UIST 2013: ACM Symposium on User Interface Software and Technology.

Video

Presents an end-to-end framework that allows a user to project a native mobile application onto a display using the phone camera. Any display becomes projectable instantly by accessing the Open Project web service.

Gesture Studio: Authoring Multi-Touch Interactions through Demonstration and Composition

Hao Lu, Yang Li. CHI 2013: ACM Conference on Human Factors in Computing Systems.

Presents a tool that combines programming by demonstration and by declaration, via a video-editing metaphor, for creating multi-touch interactions.

FFitts Law: Modeling Finger Touch with Fitts' Law

Xiaojun Bi, Yang Li, Shumin Zhai. CHI 2013: ACM Conference on Human Factors in Computing Systems.

Proposes and experimentally validates a new model that extends Fitts' law with a dual-Gaussian distribution for modeling finger touch behaviors.
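As a sketch of the idea (my reading of the model, not a quotation from the paper): classic Fitts' law estimates an effective target width from the observed endpoint spread, and FFitts law further decomposes that spread into a task-dependent component and an absolute finger-precision component:

```latex
% Fitts' law with effective width W_e, where \sigma is the
% observed endpoint spread:
T = a + b \log_2\!\left(\frac{A}{W_e} + 1\right), \qquad W_e = \sqrt{2\pi e}\,\sigma
% FFitts law treats the touch-point variance as the sum of a
% task-related term \sigma_d^2 and an absolute finger-precision
% term \sigma_a^2, i.e. \sigma^2 = \sigma_d^2 + \sigma_a^2, giving
T = a + b \log_2\!\left(\frac{A}{\sqrt{2\pi e\,(\sigma^2 - \sigma_a^2)}} + 1\right)
```

Here \(A\) is the movement amplitude and \(\sigma_a\) is estimated from a separate calibration task.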
2012

Gesture-Based Interaction: A New Dimension for Mobile User Interfaces

Yang Li. Invited Keynote at AVI 2012: International Working Conference on Advanced Visual Interfaces.

Talk

Investigated various aspects of gesture-based interaction on mobile devices, including gesture-based applications, recognition and tools for creating gesture-based behaviors.

Gesture Coder: A Tool for Programming Multi-Touch Gestures by Demonstration

Hao Lu, Yang Li. CHI 2012: ACM Conference on Human Factors in Computing Systems.

Video         Best Paper Honorable Mention Award

Presents a tool that automatically generates code for recognizing each state of a multi-touch gesture and invoking the corresponding application actions, based on a few gesture examples given by the developer.

Bootstrapping Personal Gesture Shortcuts with the Wisdom of the Crowd and Handwriting Recognition

Tom Ouyang, Yang Li. CHI 2012: ACM Conference on Human Factors in Computing Systems.

Video

Contributes approaches for bootstrapping a user's personal gesture library, alleviating the need to manually define most gestures.

Gesture Search: Random Access to Smartphone Content

Yang Li. Invited article for IEEE Computer: Pervasive Computing.

Presents a tool for randomly accessing smartphone content by drawing touchscreen gestures, which flattens the deep UI hierarchy of smartphone interfaces.

Tap, Swipe, or Move: Attentional Demands for Distracted Smartphone Input

Matei Negulescu, Jaime Ruiz, Yang Li, Edward Lank. AVI 2012: International Working Conference on Advanced Visual Interfaces.

Investigates the attentional demands of motion gestures in comparison with traditional interaction techniques for mobile devices.
2011

Gesture Avatar: A Technique for Operating Mobile User Interfaces Using Gestures

Hao Lu, Yang Li. CHI 2011: ACM Conference on Human Factors in Computing Systems.

Video

Presents Gesture Avatar, a novel interaction technique that allows users to operate arbitrary existing user interfaces using gestures. It leverages the visibility of graphical user interfaces and the casual nature of gesture interaction, and outperformed prior techniques, especially while users were on the go.

Deep Shot: A Framework for Migrating Tasks Across Devices Using Mobile Phone Cameras

Tsung-Hsiang Chang, Yang Li. CHI 2011: ACM Conference on Human Factors in Computing Systems.

Video        Blog

Presents a framework for migrating tasks across devices using mobile phone cameras. It supports two interaction techniques, deep shooting and deep posting, that enable direct manipulation of information and work states in a multi-device environment.

DoubleFlip: A Motion Gesture Delimiter for Mobile Interaction

Jaime Ruiz, Yang Li. CHI 2011: ACM Conference on Human Factors in Computing Systems.

Designed a motion gesture for separating intended motion input from the ambient motion of a mobile phone. A DTW-based recognizer detects the gesture with high precision and recall.
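The matching step is described as DTW-based; as an illustrative sketch (not the paper's implementation), dynamic time warping between two 1-D sensor sequences can be computed as follows:

```python
def dtw_distance(s, t):
    """Dynamic time warping distance between two 1-D sequences,
    using absolute difference as the local cost."""
    n, m = len(s), len(t)
    inf = float("inf")
    # d[i][j] = minimal cumulative cost aligning s[:i] with t[:j]
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(s[i - 1] - t[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]
```

Warping lets a template absorb local timing variation, e.g. `dtw_distance([0, 0, 1, 2], [0, 1, 2])` is 0 even though the sequences differ in length.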

User-Defined Motion Gestures for Mobile Interaction

Jaime Ruiz, Yang Li, Edward Lank. CHI 2011: ACM Conference on Human Factors in Computing Systems.

Presents the results of a guessability study that elicits end-user motion gestures for invoking commands on a smartphone, leading to a taxonomy for motion gestures and an end-user-inspired motion gesture set.

Experimental Analysis of Touch-Screen Gesture Designs in Mobile Environments

Andrew Bragdon, Eugene Nelson, Yang Li, Ken Hinckley. CHI 2011: ACM Conference on Human Factors in Computing Systems.

Investigates the impact of situational impairments on touchscreen interaction. Reveals that in the presence of environmental distractions, gestures can offer significant performance gains and reduced attentional load, while performing just as well as soft buttons when the user's attention is fully focused on the phone.
2010

Gesture Search: A Tool for Fast Mobile Data Access

Yang Li. UIST 2010: ACM Symposium on User Interface Software and Technology. p87-96.

Available on Google Play!

Describes a tool that allows users to access mobile phone data using touch screen gestures. Gesture Search flattens the deep UI hierarchy of mobile user interfaces and learns the mapping from gestures to data items.

FrameWire: A Tool for Automatically Extracting Interaction Logic from Paper Prototyping Tests

Yang Li, Xiang Cao, Katherine Everitt, Morgan Dixon, James Landay. CHI 2010: ACM Conference on Human Factors in Computing Systems. p.503-512.

Video

Presents a tool for automatically extracting interaction logic from the video recording of paper prototype tests. FrameWire generates interactive prototypes from extracted interaction logic.

Protractor: A Fast and Accurate Gesture Recognizer

Yang Li. CHI 2010: ACM Conference on Human Factors in Computing Systems. p.2169-2172.

Pseudo Code        Java Implementation in the Android Framework

Presents an algorithm for recognizing drawn gestures. Protractor employs a closed-form solution to find the best match of an unknown gesture given a set of templates.
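Protractor's core idea can be sketched as follows (a minimal illustration based on the published algorithm, not the shipped Android code): resample a stroke, normalize it to a unit vector, then compute the minimal angular distance over all rotations in closed form.

```python
import math

def resample(points, n=16):
    """Resample a stroke to n roughly equidistant points along its path."""
    pts = list(points)
    path_len = sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))
    interval = path_len / (n - 1)
    resampled = [pts[0]]
    d = 0.0
    i = 1
    while i < len(pts):
        seg = math.dist(pts[i - 1], pts[i])
        if seg > 0 and d + seg >= interval:
            t = (interval - d) / seg
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            resampled.append(q)
            pts.insert(i, q)  # q becomes the start of the next segment
            d = 0.0
        else:
            d += seg
        i += 1
    while len(resampled) < n:  # guard against floating-point shortfall
        resampled.append(pts[-1])
    return resampled[:n]

def vectorize(points):
    """Translate the centroid to the origin and flatten into a unit vector."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    v = []
    for x, y in points:
        v.extend((x - cx, y - cy))
    mag = math.sqrt(sum(c * c for c in v))
    return [c / mag for c in v]

def angular_distance(v1, v2):
    """Closed-form minimal angular distance over all rotations.

    Treating each (x, y) pair as a complex number, the optimal rotation
    aligns the two vectors and cos(distance) = |(a, b)|."""
    a = sum(v1[i] * v2[i] + v1[i + 1] * v2[i + 1] for i in range(0, len(v1), 2))
    b = sum(v1[i] * v2[i + 1] - v1[i + 1] * v2[i] for i in range(0, len(v1), 2))
    return math.acos(max(-1.0, min(1.0, math.hypot(a, b))))
```

Because the best rotation is found analytically rather than by iterative search, matching a candidate against a template costs a single pass over the points, which is what makes Protractor fast on mobile hardware.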
2009

Beyond Pinch and Flick: Enriching Mobile Gesture Interaction

Yang Li. IEEE Computer: Invisible Computing, December 2009.

Shipped to the Android SDK

Presents the design of a toolkit for gesture-based interaction for touchscreen mobile phones. Introduces the concept of gesture overlays.
2008

ActivityDesigner: Activity-Centric Prototyping of Ubicomp Applications for Long-Lived, Everyday Activities

Yang Li, James Landay. CHI 2008: ACM Conference on Human Factors in Computing Systems. p.1303-1312.

Best Paper Honorable Mention Award         Video        Download

Presents a tool that allows designers to incorporate large-scale, long-term human activities as a basis for design, and speeds up ubicomp design by providing integrated support for modeling, prototyping, deployment and in situ testing.

Cascadia: A System for Specifying, Detecting, and Managing RFID Events

Evan Welbourne, Nodira Khoussainova, Julie Letchner, Yang Li, Magdalena Balazinska, Gaetano Borriello, Dan Suciu. MobiSys 2008: The International Conference on Mobile Systems, Applications, and Services. p.281-294.

Project Website

Cascadia is a system that provides RFID-based pervasive computing applications with an infrastructure for specifying, extracting and managing meaningful high-level events from raw RFID data.
2007

Design Challenges and Principles for Wizard of Oz Testing of Ubicomp Applications

Yang Li, Jason Hong, James Landay. IEEE Pervasive Computing, April-June, 2007, 6(2): 70-75.

Presents design challenges and principles for Wizard of Oz testing of ubicomp applications.

Gestures Without Libraries, Toolkits or Training: A $1 Recognizer for User Interface Prototypes

Jacob Wobbrock, Andy Wilson, Yang Li. UIST 2007: ACM Symposium on User Interface Software and Technology. p.159-168.

Invited to the SIGGRAPH UIST Reprise Session        

Presents the $1 algorithm for gesture recognition and a comprehensive study that evaluates $1 against two other popular gesture recognition algorithms: Dynamic Time Warping and the Rubine recognizer. The study indicated that the $1 recognizer, though simple, outperformed its peers in both accuracy and learnability.
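Unlike Protractor's closed-form match, $1 searches for the best rotation numerically. The search step can be sketched as follows (an illustration only, not the authors' code; it assumes both strokes have already been resampled to the same length, scaled, and translated so their centroids sit at the origin):

```python
import math

def rotate(points, theta):
    """Rotate points about the origin by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [(x * c - y * s, x * s + y * c) for x, y in points]

def path_distance(a, b):
    """Average point-to-point Euclidean distance between equal-length paths."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def distance_at_best_angle(candidate, template,
                           lo=-math.radians(45), hi=math.radians(45),
                           tol=math.radians(2)):
    """Golden-section search for the rotation minimizing path distance."""
    phi = (math.sqrt(5) - 1) / 2  # golden ratio conjugate, ~0.618
    x1 = phi * lo + (1 - phi) * hi
    x2 = (1 - phi) * lo + phi * hi
    f1 = path_distance(rotate(candidate, x1), template)
    f2 = path_distance(rotate(candidate, x2), template)
    while hi - lo > tol:
        if f1 < f2:   # minimum lies in [lo, x2]; reuse x1 as the new x2
            hi, x2, f2 = x2, x1, f1
            x1 = phi * lo + (1 - phi) * hi
            f1 = path_distance(rotate(candidate, x1), template)
        else:         # minimum lies in [x1, hi]; reuse x2 as the new x1
            lo, x1, f1 = x1, x2, f2
            x2 = (1 - phi) * lo + phi * hi
            f2 = path_distance(rotate(candidate, x2), template)
    return min(f1, f2)
```

The candidate is classified as the template with the smallest distance at its best angle; the path-distance function is unimodal near the optimum, which is what makes golden-section search applicable.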

BrickRoad: A Light-Weight Tool for Spontaneous Design of Location-Enhanced Applications

Alan Liu, Yang Li. CHI 2007: ACM Conference on Human Factors in Computing Systems. p.295-298.

Presents a tool for testing location-based behaviors without specifying interaction logic. The tool explores the extreme of Wizard of Oz approaches for designing field-oriented applications, i.e., testing with zero effort beforehand.
2006

Design and Experimental Analysis of Continuous Location Tracking Techniques for Wizard of Oz Testing

Yang Li, Evan Welbourne, James Landay. CHI 2006: ACM Conference on Human Factors in Computing Systems. p.1019-1022.

Presents various Wizard of Oz techniques for continuously tracking user locations.
2005

Informal Prototyping of Continuous Graphical Interactions by Demonstration

Yang Li, James Landay. UIST 2005: ACM Symposium on User Interface Software and Technology. p.221-230.

Invited to the SIGGRAPH UIST Reprise Session         Video

Presents a tool for creating continuous interactions using examples. Discusses the algorithms for learning continuous interaction behaviors from discrete examples, without using any domain knowledge.

Experimental Analysis of Mode Switching Techniques in Pen-based User Interfaces

Yang Li, Ken Hinckley, Zhiwei Guan, James Landay. CHI 2005: ACM Conference on Human Factors in Computing Systems. p.461-470

Experimental Software Demo        

Conducted a study comparing different mode-switching techniques for pen-based user interfaces. The study revealed that bimanual mode switching (using the non-preferred hand) outperformed the other techniques.
2004

Topiary: A Tool for Prototyping Location-Enhanced Applications

Yang Li, Jason Hong, James Landay. UIST 2004: ACM Symposium on User Interface Software and Technology: CHI Letters, 6(2): p.217-226.

Download         Video         Examples

Topiary is a tool for rapidly prototyping location-based applications. It introduces a Wizard of Oz approach for testing location-based applications in the field, without requiring a location infrastructure.
Talks & Travel
Dec, 2014, Seoul, Korea
CHI PC Meeting
Oct, 2014, Honolulu, Hawaii
SUI'14, UIST'14 & Vacation
Jun, 2014, Toronto, Canada
UIST PC Meeting
Apr 28-May 1, 2014, Toronto, Canada
CHI'14
Apr 27, 2014, Toronto, Canada
MobileHCI PC Meeting
Jan 24-Feb 22, 2014, China (Nanjing, Yulin, Xi'an, Shanghai)
Vacation in China
Dec 6-8, 2013, Toronto, Canada
CHI PC Meeting
Nov 14, 2013, Atlanta, Georgia
GVU Brown Bag Talk
Professional Services
program committees & paper reviewing

CHI, UIST, TOCHI, SIGGRAPH, IEEE Pervasive Computing, IJHCS, Ubicomp, IUI, ICMI, Pervasive, GI

Mentoring & Teaching
students mentoring

Haimo Zhang, Google 2013 Summer Intern, National University of Singapore, gesturing on legacy devices (GestKeyboard) and touch programming

William McGrath, Google 2013 Summer Intern, Stanford University, side-tap detection

Fangzhou Wang, Google 2012-13 Intern, University of Tokyo, master's thesis work on map route visualization using interactive optimization (with Takeo Igarashi)

Matei Negulescu, Google 2012 Intern, University of British Columbia, cross-device interaction (Open Project)

Shahriyar Amini, Google 2012 Summer Intern, Carnegie Mellon University, crowdsourced machine learning (CrowdLearner)

Hao Lu, Google Intern 2010-2013 and PhD dissertation committee, University of Washington, Gesture Avatar, Gesture Coder, Gesture Studio and Gesture Script

Tom Ouyang, Google 2011 Summer Intern, MIT, crowd- and handwriting-recognition-based bootstrapping of gesture shortcuts

Xiaojun Bi, Google 2011 Winter Intern (co-hosted with Shumin Zhai), University of Toronto, modeling finger touch behavior on mobile devices

Tsung-Hsiang Chang, Google 2010 Summer Intern, MIT, Deep Shot

Jaime Ruiz, Google 2010 Winter Intern, University of Waterloo, DoubleFlip and motion gesture design

Evan Welbourne, University of Washington 2006-07, end-user programming of location-based services

Alan Liu, University of Washington 2006, BrickRoad

teaching and lecturing

Spring 2008: Guest Lecturer UW iSchool INFO 498: Special Topics: Input & Interaction

Lectured for Information School Professor Jacob Wobbrock on "Pen-Based Interaction".

Autumn 2007: Instructor for UW CSE 373: Data Structures and Algorithms

Taught a Computer Science & Engineering course for 69 undergraduate students