Apple made a host of announcements last evening during the WWDC (Worldwide Developers Conference) 2021 keynote, including iOS 15, iPadOS 15, watchOS 8, and the new macOS, Monterey. In case you missed the kickoff, you can catch up with all the announcements here. Among all the new software upgrades Apple announced is a feature called Live Text, which essentially lets you pull text and contact details out of a photo, and it is coming to Apple devices soon.
It’s a great feature upgrade and a handy one too. Going by the demo, Apple users have a lot to be excited about. But Android users are having the last laugh here, thanks to Google Lens, which has been doing exactly what Apple’s Live Text promises to do, and more, since 2017.
A Google Lens Copycat
Announced at Google I/O 2017, Google Lens has since expanded its capabilities extensively and can do things like “see details from Google’s extensive knowledge graph when looking at objects or landmarks, copy relevant information like contact details, and even scan documents,” as Android Police points out. So, Apple is late to the party.

All the things Apple’s Live Text can do. (Apple)
Apple’s Live Text feature will do much of the same stuff Google Lens can, including recognising landmarks and dog breeds by pulling data from “Siri Knowledge” and other Apple services.
This mirrors what Google Lens does with Google’s knowledge graph. Live Text will also be able to pull text out of whatever your camera sees, as well as out of photos you have already taken. It also lets you search for recognised landmarks, objects and locations from Spotlight, capabilities Google already offers through services like Google Photos.
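Live Text itself is a system feature rather than something Apple has exposed as a developer API, but the kind of on-device text recognition it builds on has been available to developers through Apple’s Vision framework since iOS 13. Here is a minimal Swift sketch of pulling text out of a photo with that framework (the extractText function name is our own illustration, not part of any Apple API):

```swift
import UIKit
import Vision

// Minimal sketch: on-device text recognition with Apple's Vision framework.
// This uses the pre-existing VNRecognizeTextRequest API (iOS 13+), not the
// Live Text feature announced at WWDC 2021.
func extractText(from image: UIImage, completion: @escaping (String) -> Void) {
    guard let cgImage = image.cgImage else {
        completion("")
        return
    }

    // The request runs OCR and hands back one observation per detected line.
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the top candidate string from each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate  // favour accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```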
Apple’s Live Text is going to work across platforms: on Macs as well as on iPad and iOS devices. Google Lens, for its part, is integrated into Chrome’s image search, so it is cross-platform as well.