This week, Apple announced the new features of iOS 9, coming to the market later this year. In an effort to compete with the major changes Google unveiled at Google I/O, Apple added some interesting new features to iOS 9.
Apple changed the way the OS searches in order to make Siri and Spotlight more intelligent. To do this, apps will now expose their content to the two search features. That data will no longer be locked inside the apps; it becomes accessible to the system, so Siri and Spotlight can search better across all your devices.
Siri will also be able to search the camera roll by date and location. So, when you want to see photos from last year's camping trip in Maine, all you have to do is ask Siri by voice. In addition to smarter searching, Siri will finally be able to take requests for specific songs and playlists. For example, you can ask her to "play the top songs from 2001." According to Apple, Siri will also be faster and return more accurate results, though that remains to be seen.
The changes to the system search feature, Spotlight, are profound as well. Spotlight will be able to understand queries the way we naturally speak and deliver accurate results. Searching with a keyword on your phone will now return results from within apps, too. Apple announced that it will allow third-party apps to link into Spotlight through an API, so search results will be deep-linked directly into app content.
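For developers, that API is Core Spotlight. Below is a minimal sketch of how an app might index one of its items so it shows up in Spotlight results; the identifiers, titles, and domain string are hypothetical placeholders, not anything Apple shipped.

```swift
import CoreSpotlight
import MobileCoreServices

// Describe the item's searchable metadata.
// (Title and description here are made-up example values.)
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
attributes.title = "Blueberry Pancakes"
attributes.contentDescription = "A weekend breakfast recipe."

// Wrap the metadata in a searchable item with a unique ID.
// "com.example.recipes" is an assumed, illustrative domain identifier.
let item = CSSearchableItem(
    uniqueIdentifier: "recipe-42",
    domainIdentifier: "com.example.recipes",
    attributeSet: attributes
)

// Hand the item to the system index; Spotlight can now deep-link to it.
CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error.localizedDescription)")
    }
}
```

When a user taps the result in Spotlight, the app is launched and handed back the `uniqueIdentifier`, which it can use to navigate straight to that content.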
The main goal here is to make search smarter using the additional data pulled from apps. Because apps are now the primary way we use our devices, both iOS and Android are taking giant steps to make them smarter and more integrated. Hopefully, this means that, in time, our phones will immediately know what we're looking for so we don't have to spend time searching for it.