
Apple’s Next Act: Smart Glasses, a Pendant, and AI-Powered AirPods

A Bloomberg report reveals Apple is accelerating development of three camera-equipped wearables, each built around a smarter version of Siri. Here’s what we know, what’s still uncertain, and how it stacks up against what’s already out there.

Apple has spent the better part of the last decade dominating the wearables market with AirPods and the Apple Watch. But when it comes to the next wave of AI-driven hardware — the kind that can see the world around you and act on it — the company has been notably absent. That’s apparently about to change.

On February 17, 2026, Bloomberg’s Mark Gurman reported that Apple is actively accelerating development of three new wearable devices: AI-powered smart glasses, a wearable pendant or pin, and a new version of AirPods with embedded cameras. All three are designed to feed visual information to Siri, essentially giving Apple’s voice assistant the ability to see and interpret the physical world.

The report is significant not just because of what the devices are, but because of the urgency behind them. Apple watched Meta’s Ray-Ban smart glasses sell roughly seven million units in 2025 — triple the 2024 figure — and now faces the prospect of OpenAI and former Apple design chief Jony Ive releasing their own AI wearable. The window for Apple to set the terms in this space is narrowing.

The Three Devices at a Glance

🕶️ Smart Glasses (target launch: 2027). Dual-camera system, no display, Siri integration. Production could begin December 2026. Codename: N50.

🔮 AI Pendant / Pin (target launch: 2027, TBD). AirTag-sized, wearable as a clip or necklace. Always-on camera for Siri context. iPhone-dependent, not standalone.

🎧 Camera AirPods (possible launch: late 2026). Infrared cameras, not for photos. Designed to give Siri environmental awareness. Most advanced in development.

The Smart Glasses (Codename: N50)

Apple’s smart glasses are the most complex of the three products, and engineering work is well underway. According to MacRumors, Apple has already distributed prototypes to its hardware engineering team and is targeting the start of production as early as December 2026, with a consumer launch slated for 2027.

Unlike Meta’s Ray-Ban collaboration, Apple is developing its own lenses and frames in-house. The glasses will feature two camera systems: one high-resolution lens capable of capturing photos and videos, and a second dedicated computer-vision sensor that measures distance and identifies environmental details for real-time navigation and contextual awareness. Early prototypes were connected via cable to a battery pack and iPhone, but newer versions reportedly have the components embedded directly in the frame.

“The glasses won’t have a built-in display, but they will allow users to make phone calls, interact with Siri, play music, and take actions based on surroundings — such as asking about the ingredients in a meal.” — Bloomberg

The choice to skip a display is intentional and echoes Meta’s current approach. AR glasses with a proper display remain technically difficult to build in a form factor people actually want to wear. Apple is reportedly working on display-equipped glasses separately, but that project is described as “many years away.”

The N50 glasses will support a range of sizes and styles — Gurman describes them as “more upscale and feature-rich” than either the pendant or the AirPods — and Apple’s in-house hardware expertise is expected to give it a materials and camera quality advantage over existing competitors.

The AI Pendant (The Most Uncertain of the Three)

The pendant is the most intriguing device in the lineup, and also the one Apple seems least committed to. Wareable notes that the project is described internally as experimental, and Apple has killed similar concepts before when they failed to clear its internal standards.

What’s currently being developed is roughly the size of an AirTag but with computing power closer to that of the AirPods chip. The device has a hole for threading through a necklace and a clip for attaching to clothing or a bag. It features an always-on camera and a microphone, and Apple is still debating whether to add a speaker for two-way Siri conversations.

Key Distinction

Unlike the Humane AI Pin — which tried and failed to replace the smartphone — Apple’s pendant is being designed specifically as an iPhone accessory. The iPhone handles the processing; the pendant serves as a camera-equipped extension of it. Apple employees reportedly describe it as the “eyes and ears” of the phone.

This is a smart positioning move. The Humane AI Pin launched in 2024 at $699, received poor reviews, and the company eventually shut down. Apple appears to have studied that failure closely. By not asking the pendant to do things it can’t do well — like replacing a phone — it has a better chance of being genuinely useful.

The pendant also puts Apple in direct competition with the rumored OpenAI wearable reportedly being developed with Jony Ive, which is not expected to ship before late 2027.

Camera-Equipped AirPods

Of the three products, the camera AirPods are the furthest along in development. Tom’s Guide reports that these could arrive as early as late 2026, making them the near-term product to watch.

The cameras in these AirPods are infrared rather than standard optical cameras, meaning they’re not for taking pictures. The purpose is environmental sensing — giving Siri the ability to understand what you’re looking at, track hand gestures, and build awareness of your surroundings through your ears. Think of them as Siri’s spatial awareness engine, built into a device 600 million people already own in some form.

The engineering challenge here is significant. Fitting cameras, processing capability, and batteries into something worn in the ear, without compromising comfort or shortening battery life, is not trivial. Apple has managed similar miniaturization feats before with the hearing health features in AirPods Pro, so there is precedent.

How These Compare to What’s Already Available

Product | Maker | Camera | Display | AI Assistant | Status
Ray-Ban Meta Smart Glasses (Gen 2) | Meta / EssilorLuxottica | 12 MP + video | None | Meta AI | Available now
Apple Smart Glasses (N50) | Apple | Dual-camera (hi-res + CV) | None (initially) | Siri (upgraded) | 2027
Apple AI Pendant | Apple | Always-on camera | None | Siri | 2027 (TBD)
Apple Camera AirPods | Apple | Infrared (sensing only) | None | Siri | Late 2026
Snap Spectacles (5th gen) | Snap | Camera + AR display | AR waveguide | My AI (Snap) | 2026
OpenAI / Ive Wearable | OpenAI + Jony Ive | TBD | TBD | ChatGPT | Late 2027

The Role of Siri — and Google Gemini

All three devices are being built around a significantly upgraded version of Siri, which Apple plans to overhaul at its developer conference (WWDC) in June 2026. What makes this particularly interesting is the reported engine behind the upgrade: TechRepublic notes that Apple has reportedly tapped Google’s Gemini model as the backbone of the relaunched Siri, which would function as both a voice assistant and a text-based chatbot.

If that holds, it would mean Apple’s new wearables ecosystem runs on Google’s AI infrastructure — an unusual arrangement that reflects just how quickly the AI hardware race is moving. Building a competitive large language model from scratch takes years; licensing one allows Apple to ship products on a faster timeline.

The visual context capability is central to why any of this matters. Current Siri is largely reactive — you ask it something, it answers. The new version, fed real-time visual data from cameras on your glasses, pendant, or AirPods, could answer questions you didn’t even know to ask. It could notice that you’re looking at a restaurant menu and offer dietary information, or recognize a landmark and pull up relevant history without you saying a word.

Meta’s Sales Numbers Put Apple’s Timeline in Perspective

Meta Ray-Ban Smart Glasses, estimated unit sales (millions):

Year | Estimated unit sales
2023 | ~1 million
2024 | ~2.3 million
2025 | ~7 million

Source: EssilorLuxottica earnings call, 2025. Figures approximate.

Seven million is a meaningful number, but it’s still a fraction of Apple’s scale. Apple shipped over 200 million iPhones in 2025 alone. If the company can get even a small percentage of its existing user base to add camera AirPods or a pendant to their setup, it would dwarf what Meta has achieved in smart glasses so far.

What Could Go Wrong

Apple’s product roadmap is notoriously fluid. Projects get cancelled, delayed, or rebuilt from scratch when they don’t meet internal standards. The pendant in particular is described as still being in an experimental phase — there’s no guarantee it ships at all.

There are also real-world concerns that no product launch can fully paper over. Smart glasses with cameras raise legitimate privacy questions. MacRumors’ comments section reflects a concern shared more broadly: people don’t always want to be recorded by strangers’ eyewear, and the social acceptance of camera glasses remains an open question. Google Glass ran into exactly this problem in 2013, and it’s not clear the cultural calculus has shifted enough for it to be a non-issue this time around.

Battery life on miniaturized devices is another perennial constraint. Embedding cameras and real-time AI processing into AirPods or a thin glasses frame while keeping them comfortable and charged all day is an engineering problem with no easy solution.

The Bigger Picture

Apple’s move into AI wearables isn’t just about products. It’s about keeping the iPhone at the center of people’s lives as more computing shifts to ambient, always-on devices. Every one of these three products is designed to be an iPhone accessory — not a replacement for it. The iPhone handles the heavy processing; the wearables serve as sensory extensions of it.

That’s a coherent strategy. Rather than asking users to swap their phone for something new, Apple is building a constellation of devices that make the iPhone more capable and more present throughout the day. It’s the same playbook as the Apple Watch and AirPods — both of which succeeded not by replacing the iPhone but by deepening dependency on it.

Whether that strategy works in the AI wearables space remains to be seen. But Apple rarely enters a market it doesn’t eventually come to define, and by 2027, the smart glasses market will look very different from today.

This article is based on publicly reported information. Apple has not officially confirmed any of these products.
