Favorite Apple Intelligence feature: Does Tim Cook’s highlight reveal future product plans?


2024 was supposed to be the year Apple caught up in AI capabilities. At its developer conference, the company announced its chosen strategy: local models handle smaller tasks, while a more powerful model is available in the “Private Cloud Compute” for complex requests. Apple intended to offer the corresponding functions universally across its platforms, under the umbrella name “Apple Intelligence”.

At the investor conference, Tim Cook revealed that visual intelligence is the most used Apple Intelligence feature. This may come as a surprise, as it has the highest hardware requirements: by default, it is tied to the presence of the Camera Control button, which is limited to the iPhone 16 and 17 (along with the iPhone Air). On the iPhone 15 Pro (Max) and 16e, the function can be assigned to the Action button or the lock screen; older iPhones, iPads, and Macs are excluded. But it has obvious practical benefits: it recognizes text and subjects without an image first having to land in the iPhone’s photo library. In many cases, this saves extra steps, for example when the details from a poster need to be turned into a calendar event.

Integration into other devices
The company has probably known for a long time that visual intelligence is the most popular of Apple’s intelligence functions. The fact that Tim Cook specifically highlighted it on the conference call presenting the quarterly figures gives hope that it will soon find its way onto other devices: nothing stands in the way of passing camera images from MacBooks or iPads directly to the integrated foundation models. A dedicated button is no prerequisite, as Apple itself demonstrates: the iPhone 15 Pro models can use the function without Camera Control.
“Apple Glasses” with visual intelligence?
Cook may thus be hinting at core functions of a new type of device that Apple is rumored to announce in 2026: AI glasses, similar to Meta’s Ray-Ban glasses or Snap’s Spectacles. Visual intelligence could be the central function of such a device: an integrated camera records what the wearer is currently seeing, the paired iPhone handles the AI analysis, and the results are relayed to the wearer via built-in speakers or AirPods.