We’ve been paying close attention to the emerging fields of voice and visual search, which come down to a simple notion: the mobile device has eyes and ears. As mobile search, apps and games evolve, they’ll better leverage these hardware realities rather than replicate the desktop experience.
Besides augmented reality, voice search and bar-code scanning, one of the more notable developments of the past year was Google’s introduction of Goggles. Pictures taken with a phone are matched against image databases, not only to identify objects but also to use them as a kind of launchpad for a search query.
Like a lot of things Google, this was largely unproven at launch and given the chance to be tested “in the wild.” Through heavy use, the product was meant to improve via a feedback loop of images taken, shared and queried. This contrasts with a company like Apple, which releases only fully baked products.
Google even admits that image recognition remains a challenging problem and that Goggles is still a Labs product. As such, it works best on landmarks, book covers, DVDs, games and things like wine labels; it’s still ineffective on more amorphous objects like animals or food.
But the latest for Goggles is today’s announcement that it’s now part of the Google Mobile App for iPhone. When Goggles launched about nine months ago, it was available only for Android. The extra use it gets from iPhone owners should support the aforementioned goal of improving through real-world testing.
Part of this will likely involve local search applications. The example Google gives in today’s blog post in fact has a large local component (screen shot above). There are also clear implications for mobile shopping, especially with consumer packaged goods, which carry labels and bar codes.
This will be something to watch closely. In the meantime you can watch the video below and download the app for free.