What is Apple Visual Intelligence?
Visual Intelligence is a key part of the Apple Intelligence experience for iPhone 16 users, but it was surprisingly absent when the latest iPhones launched in September 2024.
However, it has finally arrived with the release of iOS 18.2 in early December, alongside other key features like Genmoji, Image Playground and ChatGPT integration, meaning it's now readily available on millions of iPhones around the world.
The question is, what is Apple Visual Intelligence and why is it such a key part of the Apple Intelligence proposition? Here’s everything you need to know about Visual Intelligence on iPhone.
What is Apple Visual Intelligence?
Visual Intelligence is an Apple Intelligence feature that is exclusive to the iPhone 16 series, even though most other Apple Intelligence features are also available on the iPhone 15 Pro and iPhone 15 Pro Max.
That exclusivity seemingly comes down to how the feature is activated: it relies on the new Camera Control button, letting users jump into Visual Intelligence in seconds by pressing and holding the button. From there, you simply hold your camera up to an object or place and snap a photo to learn more about it.
The feature uses AI to scan an image and pull relevant information or complete a related action. For example, you can snap a photo of a dog to find out what breed it is, take a photo of a bike to see similar ones for sale or hold your camera up to a poster to automatically add the event and date to your calendar.
Visual Intelligence also works alongside third-party tools, using Google to search for similar products for sale or ChatGPT for further expertise, though you'll need to enable the ChatGPT integration on your iPhone for the latter.
If you're understandably concerned about the privacy implications of sending photos of your location, the places you plan to visit and the items you intend to buy to Apple and its partners, you'll be relieved to hear that Visual Intelligence is designed with privacy in mind, giving users control over when third-party tools are used and exactly what information is shared.
Visual Intelligence is essentially Apple's answer to Google Lens and Circle to Search. While Google Lens is available on the iPhone, Circle to Search is exclusive to certain Android smartphones, so Visual Intelligence somewhat levels the playing field for iOS users, minus the circling part.
When was Visual Intelligence released?
Visual Intelligence was released on 11 December 2024 as part of the iOS 18.2 update.
The feature is available in the US, UK, Australia, Canada, Ireland, New Zealand and South Africa for now, with other regions, including the EU, set to get access sometime in 2025.
Which models of iPhones support Visual Intelligence?
Visual Intelligence relies on the new Camera Control button, meaning the feature is currently exclusive to the iPhone 16 series, including the entry-level iPhone 16 and iPhone 16 Plus models.