- Paper: Gesture Search: A Tool for Fast Mobile Data Access
- View paper here.
- Authors:
- Yang Li - a Google researcher who has published several papers at UIST and CHI on applications of gestures on touch-screen devices
- Presented at the 23rd Annual ACM Symposium on User Interface Software and Technology (UIST).
- Hypothesis: When searching for a contact, application, or other item on a touch-screen device such as an Android phone or iPhone, gesture recognition could be a suitable alternative to typing letters on the keyboard.
- Methods: Li implemented an Android app that recognizes gestures and uses that input to quickly locate contacts, apps, and other items on the phone. He then released it to the Android Market and collected feedback from the initial wave of users.
- Results: Feedback was generally positive; users reported that the vast majority of searches could be completed with three or fewer gestures. The only negative feedback was that people said they wouldn't need Gesture Search if the app they were looking for was on or near the home screen.
Summary
Yang Li's paper introduces Gesture Search, an application that lets the user quickly find important items on their phone by drawing the shapes of letters on the screen. He starts off by describing the different functions of Gesture Search, such as displaying all possible results, even accounting for ambiguity (e.g., an "A" can sometimes look like an "H"). He explains that his biggest challenge was getting the touch-recognition software to tell the difference between letter gestures and routine taps and swipes. He solved this by examining the "squareness" of each gesture: the more square a gesture was, the more likely it was a letter. He then tested the app by releasing it as a beta and collecting feedback through questionnaires. He states that the feedback was very positive, with the app receiving very high ratings, and he can see this feature being implemented more widely in the future.
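The paper only describes the "squareness" idea at a high level, so here is a minimal sketch of what such a check might look like. The function names and the 0.3 threshold are my own assumptions for illustration, not values from the paper or the actual Gesture Search code.

```python
def squareness(points):
    """Rough 'squareness' of a stroke: ratio of the shorter to the longer
    side of its bounding box (1.0 = perfectly square, near 0 = a thin swipe)."""
    xs = [x for x, y in points]
    ys = [y for x, y in points]
    width = max(xs) - min(xs)
    height = max(ys) - min(ys)
    longer = max(width, height)
    if longer == 0:  # a single tap has no extent
        return 0.0
    return min(width, height) / longer

def looks_like_letter(points, threshold=0.3):
    """Treat strokes with a reasonably square bounding box as candidate letter
    gestures; thin, elongated strokes are more likely ordinary swipes.
    The threshold is an illustrative guess."""
    return squareness(points) >= threshold

# Example: a roughly diagonal stroke vs. a thin horizontal flick
print(looks_like_letter([(0, 0), (50, 60), (100, 110)]))  # True  - roughly square
print(looks_like_letter([(0, 10), (200, 12)]))            # False - thin swipe
```

In this toy version, a letter-like stroke spreads out in both dimensions while a flick stays thin, which is the intuition the paper's heuristic seems to rely on.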
My View
In my last blog, I discussed my negative feelings toward stroke-based text entry, mainly because it's not a natural motion for us. Gesture Search, by contrast, seems like a spectacular idea to me because the "strokes" I'd be inputting are ordinary letters. While I have stated my contentment with the QWERTY keyboard, I will admit that I have fat fingers and typing those tiny keys can be quite tedious. I also have a very large number of apps on my iPod Touch and an even larger number of contacts in my phone, so this method would prove extremely useful to me. Some may argue that the standard search would be just as quick, if not quicker, but the way I understand it, these gestures can be made from anywhere, and making them would take about as much time as simply getting to the search screen.
The only concern I have about this application was already brought up in the paper, namely how to tell the difference between a routine tap or swipe and a gesture. I've actually thought of another idea for this problem and would love to hear my faithful readers' comments on it: what about keeping a small button in the top-left corner of the screen that, when held down, locks the screen to everything except Gesture Search? A rough sketch of what I mean follows.
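To make the hold-to-lock idea concrete, here is a small sketch of the dispatch logic I have in mind. Every class and method name here is hypothetical and made up for illustration; it has nothing to do with how Gesture Search is actually implemented.

```python
class GestureLockDispatcher:
    """While the corner button is held, route every touch event to the
    gesture recognizer instead of the normal UI, so strokes can never be
    mistaken for taps or swipes."""

    def __init__(self, gesture_search, normal_ui):
        self.gesture_search = gesture_search
        self.normal_ui = normal_ui
        self.lock_held = False

    def on_lock_button(self, pressed):
        # Called when the corner button is pressed (True) or released (False).
        self.lock_held = pressed

    def on_touch_event(self, event):
        if self.lock_held:
            # The rest of the screen never sees these events.
            self.gesture_search.handle_stroke(event)
        else:
            self.normal_ui.handle_touch(event)
```

The point of the sketch is simply that ambiguity disappears when the user explicitly signals "this is a gesture," at the cost of needing a second finger on the button.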
All in all, my basic point of view on this subject is: when can I download this for my iPod?