Among the updates introduced in iOS 15, Live Text has been one of the highlights for iPhone users. Unlike many other iOS additions, though, it came as no surprise to Android users, who have long had a similar feature. When comparing iOS 15's Live Text with Android 12's Google Lens, both tools serve the same purpose, but each has features that set it apart.
Although iOS 15 is not yet available to the general public, the first beta has revealed many of the update's features, and the YouTuber In Depth Reviews posted a video comparing Live Text on iOS 15 with Google Lens on Android 12.
Live Text vs. Google Lens
The two functions are similar in that both can extract text from images. Beyond that, the Android and iOS options offer additional features, so the YouTuber divided the comparison into several tests that, according to him, cover the most common uses of this kind of tool.
Text recognition: According to the test, iOS 15 has the better text recognition system because recognized text can be searched directly from Spotlight.
Text translation: For translating text in photos taken of a computer screen, both Google Lens and Live Text worked well. However, Google Lens did a better job when translating handwritten text from images.
Visual search: Visual Search is an iOS 15 feature that provides landmark information directly from the Photos app. Google Lens can already do this, and it can additionally search for objects. When identifying landmarks in an image, Live Text erred by identifying the Tolerance Bridge in Dubai as the Millennium Bridge in Newcastle.
Even though Android 12's Google Lens proved superior in several ways, Live Text, still only in its second beta, has time to correct its weaknesses before the official launch scheduled for this fall. And then it may be time for a new comparison.