Apple is reportedly accelerating three new AI wearables packed with cameras to give Siri real-time visual awareness through your iPhone. Bloomberg says the lineup includes smart glasses, a pendant, and even AirPods with built-in sensors, all feeding data to a revamped Siri expected in iOS 27 later this year.
Why these camera gadgets could actually change things
The smart glasses lead the pack: dual cameras in Apple-designed frames, with no display to keep them lightweight, targeting production by late 2026 for a 2027 launch. They’re built to capture what you see and pipe it straight to your iPhone for Siri to analyze on the fly — think contextual help without having to pull out your phone.
The pendant sounds like the iPhone’s “eyes and ears”: an always-on camera and mic you wear around your neck, discreetly recording surroundings to give Siri live context. No word on exact timeline, but it’s part of the same fast-tracked push.
Camera AirPods could beat both to market, possibly this year, using low-res sensors for visual input and building on the live translation features already in AirPods. These tie into Apple’s broader Siri overhaul, which reportedly adds a chatbot interface powered by Google Gemini for smarter, more responsive AI.
Can Apple finally deliver on AI wearables?
I’ve been hands-on with AirPods Pro 3 and their audio upgrades before, but cameras in earbuds? That’s a wild pivot from Meta’s clunky Ray-Bans or Humane’s Pin flop. If Apple nails privacy controls and battery life (big ifs after Siri’s endless delays), these could finally make “always-on AI sight” mainstream and practical without constant phone checks. Skeptics like yours truly will still wait for demos.