Introduced with iOS 18.2, Visual Intelligence is a cool AI-based feature. Just aim your phone at an object that has aroused your curiosity. Press a button, and you can run a Google web search or ask ChatGPT to describe, summarize, or answer specific questions about the mystery item.
Previously, you had to press the Camera Control button to trigger Visual Intelligence, which limited the feature to the iPhone 16 lineup. With iOS 18.4, you can use your iPhone's Action button instead, expanding the feature to the iPhone 15 Pro, the iPhone 15 Pro Max, and the iPhone 16e.
I have an iPhone 16 Pro, though, so why would I need to launch Visual Intelligence with the Action button? Well, depending on how I'm holding my phone and the object I'm framing, I sometimes find the Action button easier to reach and press. Having both the Camera Control and the Action button available gives me more flexibility when I want to call on Visual Intelligence.
To set this up on a supported iPhone, head to Settings and select Action Button. Swipe through the options until you find Visual Intelligence, then exit Settings. Now point your phone at an object you'd like described and press down on the Action button. A screen pops up with two choices: tap Ask to ask ChatGPT questions about the object, or tap Search to run a Google search on it.
Source: zdnet.com