Search 2.0: Navigating the World of Data Using Auditory and Visual Queries
Search continues to evolve as users look for the information most relevant to them. Google goes real-time, Aardvark crowdsources, and Bing does images. With powerful sensor technology packed into consumer mobile devices, we see a Search 2.0 taking shape, one that draws on elements of the physical world to serve a mobile audience.
We’re already seeing elements of this emerge: voice-to-text technology makes it possible to skip the touchscreen keyboard entirely; Google’s voice search recognizes what you say and pulls up results; apps like Shazam and Midomi match a recorded clip of music against their databases to identify the exact artist and song. Soon it may even be possible to determine location from the sound levels and textures fed in through a microphone, the kind of data that projects like NoiseTube are beginning to collect.
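As a rough illustration of the idea behind that kind of audio matching (not Shazam's or Midomi's actual algorithm), the sketch below fingerprints audio by its dominant frequency in each short window and scores a noisy clip against a tiny catalog. The window size, hop, and synthetic "songs" are all invented for the example.

```python
# Toy spectral-peak fingerprinting: hash the dominant frequency bin in each
# short window, then see which catalog track shares the most bins with a clip.
from collections import Counter
import numpy as np

def fingerprint(samples: np.ndarray, window: int = 1024, hop: int = 512) -> list[int]:
    """Return a coarse fingerprint: the peak frequency bin per window."""
    peaks = []
    for start in range(0, len(samples) - window, hop):
        frame = samples[start:start + window] * np.hanning(window)
        spectrum = np.abs(np.fft.rfft(frame))
        peaks.append(int(np.argmax(spectrum)))  # dominant bin for this frame
    return peaks

def best_match(clip: np.ndarray, catalog: dict[str, np.ndarray]) -> str:
    """Pick the catalog track whose fingerprint overlaps most with the clip's."""
    clip_counts = Counter(fingerprint(clip))
    scores = {
        title: sum((clip_counts & Counter(fingerprint(track))).values())
        for title, track in catalog.items()
    }
    return max(scores, key=scores.get)

# Usage with synthetic "songs" (pure tones stand in for real audio).
rate = 8000
t = np.linspace(0, 2, 2 * rate, endpoint=False)
catalog = {
    "song_a": np.sin(2 * np.pi * 440 * t),
    "song_b": np.sin(2 * np.pi * 660 * t),
}
noisy_clip = catalog["song_b"][:rate] + 0.1 * np.random.randn(rate)
print(best_match(noisy_clip, catalog))  # expected: song_b
```

A real service works over a far larger index and with much more robust hashes, but the shape of the problem is the same: turn a few seconds of microphone input into a compact signature and look it up.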
The phone’s camera is also being used as a sensor for search, as we’ve seen in new augmented reality applications. Zehnder’s app for the Voodoo Experience festival helped users find information about performances, attractions, and services within the venue. The Foursquare layer on Layar brings a similar experience to finding nearby restaurants and bars. Google Goggles lets users snap shots of landmarks, books, or art to pull up search results. Whisper Deck introduces a more experimental interface for accessing information through augmented reality: with physical goggles and a microphone, users can immerse themselves in data while away from the computer.
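At its core, a Layar-style layer answers a simple question: which points of interest are near the phone, how far away are they, and in which direction should they be drawn on screen? Below is a minimal sketch of that geo query, assuming only a GPS fix and a hand-made venue list; the names, coordinates, and 500-meter radius are made up for illustration.

```python
# Minimal sketch of the geo query behind an AR overlay: given the phone's GPS
# fix, compute distance and compass bearing to each venue within a radius.
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (meters) and initial bearing (degrees), via haversine."""
    R = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing

def nearby(device_lat, device_lon, venues, radius_m=500):
    """Venues within radius_m, nearest first, with the bearing the overlay needs."""
    hits = []
    for name, lat, lon in venues:
        dist, bearing = distance_and_bearing(device_lat, device_lon, lat, lon)
        if dist <= radius_m:
            hits.append((name, round(dist), round(bearing)))
    return sorted(hits, key=lambda hit: hit[1])

# Hypothetical venues near a made-up device position.
venues = [
    ("Example Cafe", 29.9574, -90.0618),
    ("Example Bar", 29.9584, -90.0651),
    ("Far-Away Diner", 29.9700, -90.1000),
]
print(nearby(29.9577, -90.0629, venues))  # the two close venues, nearest first
```

The client then only has to project each hit onto the camera view using the phone's compass heading, which is why these layers can feel so immediate despite being, underneath, an ordinary location-filtered search.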
These new methods add another dimension to finding what we want, when we want it. Advertisers would be wise to start thinking about these modalities and how best to serve information to consumers through them. Rather than insisting on an entirely independent store locator app, brands might do better to simply boost their presence in the paths consumers are already using for discovery and navigation.