Apple’s Enhanced Visual Search: A New Feature in Photos App
Sure enough, when I checked my iPhone 15 Pro this morning, the toggle was switched on. You can find it for yourself by going to Settings > Photos (or System Settings > Photos on a Mac). Enhanced Visual Search lets you look up landmarks you’ve taken pictures of or search for those images using the names of those landmarks.
How to Use Enhanced Visual Search
To see what it enables in the Photos app, swipe up on a picture you’ve taken of a building and select “Look up Landmark,” and a card will appear that ideally identifies it. Here are a couple of examples from my phone:
That’s definitely Austin’s Cathedral of Saint Mary, but the image on the right is not a Trappist monastery; it’s the Dubuque, Iowa city hall. Screenshots: Apple Photos
What Does Enhanced Visual Search Do?
On its face, it’s a convenient expansion of Photos’ Visual Look Up feature, introduced in iOS 15, which lets you identify plants or, say, find out what the symbols on a laundry tag mean. But Visual Look Up doesn’t need special permission to share data with Apple, and this does.
Data Sharing and Privacy Concerns
A description under the toggle says you’re giving Apple permission to “privately match places in your photos with a global index maintained by Apple.” As for how, there are details in an Apple machine-learning research blog about Enhanced Visual Search that Johnson links to:
The process starts with an on-device ML model that analyzes a given photo to determine if there is a “region of interest” (ROI) that may contain a landmark. If the model detects an ROI in the “landmark” domain, a vector embedding is calculated for that region of the image.
According to the blog, that vector embedding is then encrypted and sent to Apple to compare with its database. The company offers a very technical explanation of vector embeddings in a research paper, but IBM put it more simply, writing that embeddings transform “a data point, such as a word, sentence or image, into an n-dimensional array of numbers representing that data point’s characteristics.”
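To make the embedding idea concrete, here’s a toy sketch in Python. This is not Apple’s code: the model, the vector dimensions, and the matching method are all illustrative stand-ins. It shows only the general concept IBM describes, that an embedding is an n-dimensional array of numbers, and that matching a photo against an index then reduces to finding the nearest vector.

```python
# Illustrative sketch only. Apple's actual pipeline uses an on-device ML
# model and encrypts the embedding before any comparison; here a seeded
# random projection stands in for the model, and plain cosine similarity
# stands in for the private matching step.
import numpy as np

def embed(pixels: np.ndarray) -> np.ndarray:
    """Stand-in for an ML model that turns an image region into an
    n-dimensional vector describing its visual characteristics."""
    # Seed from the pixel data so identical inputs give identical vectors.
    rng = np.random.default_rng(int(pixels.sum()) % 2**32)
    return rng.standard_normal(128)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """How alike two embeddings are: 1.0 means identical direction."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(query: np.ndarray, index: dict[str, np.ndarray]) -> str:
    """Return the landmark name whose embedding is closest to the query."""
    return max(index, key=lambda name: cosine_similarity(query, index[name]))
```

In this sketch, a “global index” would just be a dictionary mapping landmark names to precomputed embeddings, and a lookup is a nearest-neighbor search over it.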
Privacy Concerns and Opt-in Option
Like Johnson, I don’t fully understand Apple’s research blogs, and Apple didn’t immediately respond to our request for comment about Johnson’s concerns. It seems as though the company went to great lengths to keep the data private, in part by condensing image data into a format that’s legible to an ML model.
Even so, making the toggle opt-in, like those for sharing analytics data or recordings of Siri interactions, rather than something users have to discover, seems like it would have been a better option.
Conclusion
Apple’s Enhanced Visual Search is a new Photos feature that lets you look up landmarks you’ve photographed or search for those images by the landmarks’ names. While it’s a convenient expansion of Photos’ Visual Look Up, unlike that feature it requires permission to share data with Apple, which raises privacy concerns.
FAQs
Q: What is Enhanced Visual Search?
A: Enhanced Visual Search is a new feature in the Photos app that lets you look up landmarks you’ve taken pictures of or search for those images using the names of those landmarks.
Q: How do I use Enhanced Visual Search?
A: To use Enhanced Visual Search, swipe up on a picture you’ve taken of a building and select “Look up Landmark,” and a card will appear that ideally identifies it.
Q: Does Enhanced Visual Search require special permission to share data with Apple?
A: Yes, Enhanced Visual Search requires special permission to share data with Apple, which raises privacy concerns.
Q: How does Apple keep the data private?
A: Apple condenses image data into a format that’s legible to an ML model, then encrypts the resulting vector embedding before sending it to its servers to compare with its database.
Q: Should I opt-in for Enhanced Visual Search?
A: It depends on your personal preference regarding data sharing and privacy. If you’re concerned about data sharing, you may want to consider opting out of Enhanced Visual Search.

