Google Labs gives developers a gateway to more gesture-based application navigation
In an understated post on its developer blog, Google announced that the Gesture Search functionality from Google Labs now has its own API. Developers can use this API to integrate the branded ‘Gesture Search’ function into Android apps, letting users write text with gesture input and search application-specific data.
Google Labs defines Gesture Search as an option to “Search your Android-powered device by drawing gestures on the touch screen.” Google’s gesture tool first appeared on Android as an application in its own right, built directly on top of standard Android search functionality.
The search giant now describes Gesture Search as a way for users to narrow down a contacts list, menu, or other set of information choices alphabetically by drawing the first letter of their desired destination.
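The narrowing behaviour described above can be sketched in plain Java. This assumes the gesture recognizer has already resolved the drawn stroke to a character; the class and method names here are illustrative only and are not part of the Gesture Search API:

```java
import java.util.ArrayList;
import java.util.List;

public class PrefixNarrower {
    // Narrow a list of entries to those starting with the recognized character,
    // mimicking how drawing "J" would filter a contacts list down to the J's.
    public static List<String> narrow(List<String> entries, char recognized) {
        String prefix = String.valueOf(Character.toLowerCase(recognized));
        List<String> matches = new ArrayList<>();
        for (String entry : entries) {
            if (entry.toLowerCase().startsWith(prefix)) {
                matches.add(entry);
            }
        }
        return matches;
    }

    public static void main(String[] args) {
        List<String> contacts = List.of("Alice", "Bob", "Jane", "John");
        // Drawing "J" leaves only Jane and John.
        System.out.println(narrow(contacts, 'J'));
    }
}
```

In the real app, each further letter drawn would narrow the surviving matches again, so the same filter simply runs on its own previous output.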
By using the new API, Google says, developers can enable two distinct gesture-based activities. First, programmers will have the option to implement gesture-based search on top of an application’s existing search functionality – this should give users gesture-based power similar to that seen in the original Gesture Search app, but scoped specifically to that application’s data.
Second, developers will have the option to build gesture-based navigation functions. Although no demos are available for this option as yet, it points to a new route for developers to improve Android user experiences – especially for those programmers seeking to build quick and simple apps that can be used on the go.
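Since Google has published no demo of gesture-based navigation yet, the sketch below is purely conceptual: a hypothetical dispatcher that maps an already-recognized gesture character to an in-app destination. Every name here is invented for illustration and is not part of the announced API:

```java
import java.util.Map;

public class GestureNavigator {
    // Hypothetical routing table: a recognized gesture character
    // selects the screen the app should navigate to.
    private static final Map<Character, String> ROUTES = Map.of(
            'h', "HomeScreen",
            's', "SettingsScreen",
            'c', "ContactsScreen");

    // Return the destination for a recognized gesture,
    // falling back to the home screen for unmapped characters.
    public static String destinationFor(char recognized) {
        return ROUTES.getOrDefault(Character.toLowerCase(recognized), "HomeScreen");
    }

    public static void main(String[] args) {
        // Drawing an "S" would route the user to settings.
        System.out.println(destinationFor('S'));
    }
}
```

An app built this way would feed each recognized gesture through one table lookup, which is what makes the approach attractive for quick, on-the-go interfaces.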
You can read the original post by Google research scientist Yang Li here.