
Google Lens Vs. iOS 15's Live Text: A Comparison Between Image Recognition Tools

The iOS 15 update brings Google Lens-like capabilities to Apple devices, but there are some differences, including the need for an internet connection, device support, and privacy.

Recently, Apple announced a new feature for iPhone users called Live Text. Remarkably, it is very similar to Google Lens. While Google Lens has been in use for several years, Apple’s solution is still in developer beta testing and will not launch officially until Apple releases the iOS 15 update.

Apple’s new text and object recognition features will arrive on Mac laptops and desktops running macOS Monterey, as well as on any iPad compatible with iPadOS 15.

The Arrival of Google Lens & Its Features

Google released Google Lens in 2017, the same year it launched the Pixel 2 smartphone. Text recognition software had existed long before that, but when the search engine giant implemented it, the technology became part of Google Search and Google Assistant.

Letting users act on text found in the live camera view and in photographs is far more powerful than simply extracting text from a photo, as earlier optical character recognition software did.

Google added another layer of object recognition with Google Photos, an online image storage service that can identify animals and people. Apple has added similar object recognition to its own Photos app to help organize photos. Still, none of this poses a real challenge to Google Lens until Live Text is released.

Apple’s Live Text with iOS 15 Upgrade

With the iOS 15 upgrade, Apple’s Live Text immediately recognizes text in an image, making every number, symbol, word, and letter searchable, selectable, and available for further actions. For example, users can translate all of the words in a photo or just a portion of them, and tapping a phone number or email address turns it into a link that launches the Phone or Mail app.
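For developers curious about how this kind of on-device recognition looks in practice, here is a minimal Swift sketch using Apple’s Vision framework. Live Text itself is a system feature rather than a public API, so this is an illustrative assumption of the general approach, not Apple’s Live Text implementation.

import Vision
import UIKit

// A rough sketch of on-device text recognition with Vision's
// VNRecognizeTextRequest; the processing happens entirely on the device.
func recognizeText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, error in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Each observation holds candidate strings for one region of text.
        for observation in observations {
            if let best = observation.topCandidates(1).first {
                print(best.string)
            }
        }
    }
    request.recognitionLevel = .accurate   // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}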

Additionally, Spotlight search can find photos and images based on the text they contain, which makes it easy for users with large libraries to locate a particular image or recover information. All of these additions are useful and exciting, but iPhone and iPad users have already had access to very similar features.

They have had similar capabilities since 2017, when Google brought Google Lens to its iOS search app. Google Photos is also available on Apple devices, so searching for objects and text works there as well. In terms of features, the comparison between Google Lens and Apple’s Live Text is very close.

Google Lens Vs. Live Text: Privacy & Compatibility

Apple’s Visual Lookup and Live Text features will require an iPhone or iPad with an A12 processor or newer. On the Mac, they will arrive only on computers using the M1 chip; if you have an Intel-based Mac, you will not get Live Text.

Meanwhile, Google Photos and Google Lens work on a wide variety of computers and mobile devices. Google Lens was initially available only on Pixel smartphones, then expanded to other Android and Chrome OS devices. Today, Google Lens is available on almost every Android phone.

Users can also access a robust set of image recognition features on Linux, Mac, and Windows computers through a browser: when Google Photos recognizes text in an image, Google Lens appears as an option. Because Google launched Lens years before Live Text, it has far more experience with image recognition.

In practice, that means Google Lens offers more features and delivers more accurate results. Apple, on the other hand, has yet to release Live Text to the public, so its compatibility and feature set may grow over time.

As for why Apple limited Live Text to its in-house processors, privacy appears to be the biggest reason. Google Lens requires an active internet connection; Apple’s Visual Lookup and Live Text do not.

Processing relies on the artificial intelligence hardware built into the chips of an M1 Mac, iPad, or iPhone, which may give better performance than Google Lens on a slow internet connection. Apple arguably could have brought Live Text to its Intel-based devices as well; after all, the Intel-based Mac Pro still outperforms an M1 Mac in raw power.

But Apple has built its own processors and is transitioning away from Intel, and supporting older systems would likely mean writing more complicated algorithms that cannot rely on its specialized hardware. That is bad news for users who bought Intel-based Mac computers, but it is not surprising news at all.

Through its mobile app and Google Photos, Google offers a simple solution for translation, text recognition, and more on the Apple devices that Live Text does not support. For users with the latest iPhone models and M1 Mac computers, Live Text integrates more tightly with the Photos app and the rest of the Apple ecosystem, and it provides a more privacy-focused solution.
