Visual Intelligence has made the Camera Control on my iPhone 16 worth using

One of the big selling points of the iPhone 16 hardware is the Camera Control button. It’s a small physical button on the bottom right of the frame that also has some capacitive capabilities. With the initial launch of iOS 18, a single press launches your camera app of choice, and you can do half presses and sliding gestures to adjust camera settings. It’s a neat idea, but it has some flaws that prevent it from being a great shutter button.

But now we have iOS 18.2, and that brought a lot of new Apple Intelligence features to our phones, especially if you have an iPhone 16. With iOS 18.2, Apple finally added Visual Intelligence, a feature similar to Google Lens, but on iPhone.

After playing with the latest update, I’m happy to report that Visual Intelligence is a real game-changer for the Camera Control, and it’s made the good but awkward button finally worth using.

Camera Control initially let me down

A person using the Camera Control on the Apple iPhone 16 Plus.
Andy Boxall / Digital Trends

When I got my iPhone 16 Pro on launch day, I was excited because I was eager to try out Camera Control for all my photography needs. But as I used the Camera Control as a shutter button and a way to adjust settings, the problems began to rear their heads.

First, the position of the Camera Control isn’t great if you want to use it as a shutter button. It’s more toward the lower center of the frame instead of being closer to the bottom. If you’re taking a photo in landscape orientation, you may still need to reach a bit to press the Camera Control. For me, this meant part of my thumb would end up in front of the screen, obstructing it. This would be a bigger hassle on the iPhone 16 Pro Max due to its size.

iPhone 16 Pro Max in Desert Titanium.
Christine Romero-Chan / Digital Trends

Another issue I had was that pressing the Camera Control meant getting some slight camera shake, which could result in some blur in a still image. Adjusting the pressure sensitivity helped a bit, but there will always be a slight shake compared to touching the on-screen shutter button.

When the iPhone 16 was originally launched, I had trouble with the pressure needed for the half press to get to camera settings. It seems that Apple has fixed that with recent updates, but I still find it faster to just use the touchscreen. A few months after launch, I pretty much only use Camera Control for launching the Camera app. Meanwhile, I continue taking photos with the on-screen shutter to ensure my photos don’t come out blurry or out of focus.

Visual Intelligence is what Camera Control needed

Using Visual Intelligence on an iPhone 16 Pro showing Google search results.
Christine Romero-Chan / Digital Trends

Prior to iOS 18.2, the Camera Control was just a camera-only Action button for me. But now that I have updated my phone and finally have access to Visual Intelligence, I’m actually using Camera Control more.

To activate Visual Intelligence, press and hold the Camera Control. It brings up a viewfinder for you to point the camera at something in the real world. Then, you can either select the shutter/Camera Control button to do a quick capture (not saved to the photo library) before inquiring about it, or select either Ask or Search. The Ask option defaults to prompting ChatGPT with a simple “What is this?” but you can also ask for more details about what you’re looking at. Search brings up Google results relating to the object you’re inquiring about.

Visual Intelligence on iPhone.
Jesse Hollington / Digital Trends

What you can get with Visual Intelligence depends on what you’re pointing your camera at. So far, I’ve used it to identify plants, animals, and random objects. But you can also use it to look up details about points of interest, businesses, services, and contact information; translate text; and more.

Though I haven’t had much time to use it since installing iOS 18.2, I can see myself using this feature quite a bit when I’m out and about. It also definitely feels like the placement of Camera Control works better for Visual Intelligence than a camera shutter button. I’m right-handed, so I typically hold my phone that way, with my thumb on Camera Control. I can easily use Visual Intelligence one-handed, unlike using Camera Control as a camera shutter button.

I no longer regret getting my iPhone 16 Pro

iPhone 16 Pro Max next to the 16 Plus, 16 Pro and regular iPhone 16
Nirave Gondhia / Digital Trends

I’ve upgraded my iPhone every year since the beginning, but this was the first year I had second thoughts, at least initially. When the iPhone 16 series launched, Apple Intelligence didn’t ship with it, so while the hardware was good, the software felt incomplete.

But now that Apple has rolled out the Apple Intelligence features it was advertising so heavily, I’m satisfied with my iPhone 16 Pro purchase. Camera Control’s primary purpose is Visual Intelligence, in my opinion, along with quickly getting to the camera. And when combined with the fact that the smaller iPhone 16 Pro now has 5x optical zoom, yeah, I’m a happy camper.

From what I’ve seen, it doesn’t sound like many people have used Camera Control since it debuted. I certainly only used it for one thing. But now, with iOS 18.2 and Visual Intelligence, I think Camera Control could be my new favorite iPhone feature.
