News circulated recently that Apple may be planning to integrate Shazam’s song identification technology into its native iOS platform. While this may appear to be a simple integration announcement, it’s much bigger than that. Integrating Shazam into the iPhone’s native feature set could represent a sea change in how Apple views these kinds of enabling technologies.
With over 400 million downloads, Shazam has grown into one of the most popular utility apps in the Apple App Store. While I am a loyal and longtime Shazam user, I really only launch the app about once a month.
Even though opportunities to use Shazam arise more often, a usage barrier prevents me from doing so. First, I need to find and launch the app, which is a small challenge in itself, as I have well over 60 apps on my phone. Then, I have to press the song identification button quickly enough for the app to recognize the song before it ends. Such is the challenge of Shazam: it has great utility, but it requires a few steps before a user can get to the payoff.
Integrating Shazam technology into Siri would be a game changer. Not because song identification technology is that important, but because it would show that Apple recognizes the value of such a utility to its native platform. Simply asking Siri, "What's the name of this song?" is much easier than navigating to the Shazam app separately.
Such an integration could foreshadow Apple leveraging more enabling technologies within native iOS functions. Imagine if QR or barcode scanning capabilities were built into Apple's camera app: QR codes would get an instant boost in relevance and usability. That kind of seamless integration would have a ripple effect across a number of industries, including advertising, payments, and retail.
Could a Shazam partnership be the start of a major shift in Apple’s core apps?