

One of the success factors for apps joining the crowded fray (140,000 iPhone apps and counting) is elegant melding with the hardware and the general mobile use case, in contrast with Web-based products that simply shove themselves onto a smaller screen. Bar code scanning apps and voice search are two examples we’ve explored here recently.

Siri launched today in the latter category: an iPhone app with a voice-controlled interface for finding and discovering items and events around you. Backed by $24 million in funding and years of DARPA-supported underlying research at SRI, it carries a robust natural language processor.

Results for spoken queries are pulled from 30 different services (via API), with more to come. These include OpenTable, TaxiMagic, WeatherBug, Yahoo Local, Eventful, Citysearch, and AllMenus. In this way, it is making a clear run at ultimate mobile aggregator status.

True to the mobile use case, it doesn’t require a lot of finger pecking; instead it discovers the right source through semantic analysis. That requires speech-to-text capability as well as a cognitive engine to discern intent and return appropriate results. Like many such systems, it should get better over time through feedback loops and machine learning.
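To make the routing step concrete, here is a minimal, entirely hypothetical sketch of how a transcribed query might be matched to one of the partner services above. The service names and keyword sets are illustrative assumptions, not Siri’s actual logic; a real system would use a statistical intent model refined by the feedback loop just described.

```python
def route_query(transcript: str) -> str:
    """Pick a partner service for a transcribed query (toy keyword matcher)."""
    # Illustrative keyword sets per service -- assumptions, not real API data.
    intents = {
        "opentable": {"table", "reservation", "dinner", "restaurant"},
        "taximagic": {"taxi", "cab", "ride"},
        "weatherbug": {"weather", "forecast", "rain"},
        "eventful": {"concert", "event", "show"},
    }
    words = set(transcript.lower().split())
    # Score each service by keyword overlap with the transcript.
    best = max(intents, key=lambda service: len(intents[service] & words))
    # Fall back to a general local-search service when nothing matches.
    return best if intents[best] & words else "yahoo_local"

print(route_query("book a table for two at an Italian restaurant"))  # opentable
```

Even this toy version shows why the speech-to-text stage matters so much: the router only sees the transcript, so a misheard word sends the query to the wrong service.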

It’s available for free in the App Store, though it’s recommended for use on the iPhone 3GS due to required processing speeds (broader compatibility to come later). Siri will make money through affiliate revenues with its content partners — usually tied to local actions taken, such as table reservations through OpenTable.

It looks to have all the ingredients to quickly surface in the App Store’s most-popular lists. You can also picture it easily making its way to “there’s an app for that” TV spots. We should see more voice search interfaces develop and more intelligent discovery engines — both important trends in mobile. This is the first to combine them in what appears to be an elegant fashion.

It will require more review, a conversation with the company, and of course the market’s reaction. More to come on all fronts. In the meantime, below is a video that highlights its features, including natural language processing. “Take me drunk, I’m home” is my favorite (but hopefully not needed anytime soon).
