
Apple Accelerates Smart Glasses Development, Targeting 2026 Launch
According to a new report from Bloomberg, Apple is ramping up its long-awaited smart glasses project, now aiming for a late 2026 release. Internally known as part of the N401 project (previously codenamed N50), the device marks a major step in Apple’s efforts to blend AI with wearable technology.
Large-scale prototype production is expected to begin later this year in partnership with overseas suppliers. This move indicates Apple is transitioning from early-stage development to hardware validation and supply chain readiness.
What to Expect from Apple’s Smart Glasses
The glasses are expected to feature:
- Built-in cameras, microphones, and speakers
- Support for Siri voice commands
- Hands-free tasks like calls, music playback, translations, and navigation
These features put them in direct competition with Meta’s Ray-Ban smart glasses and upcoming Android XR devices.
Apple is also working on a custom chip to power the glasses, with mass production of that chip slated for 2025. The component is expected to handle AI workloads locally, further reducing reliance on cloud services and ensuring faster, more private performance.
A Shift in Wearables Strategy
Interestingly, Apple has reportedly shelved plans to bring camera-equipped versions of the Apple Watch and Apple Watch Ultra to market by 2027. Instead, the company is continuing work on AirPods with integrated cameras, suggesting a shift toward new form factors for wearable cameras.
This indicates Apple may see its future wearables—like glasses and AirPods—as the primary vehicles for AI-powered real-world interaction, rather than modifying existing products like the Watch.
AI: Apple’s Biggest Advantage—or Weak Spot?
The smart glasses effort comes as Apple faces mounting pressure in the AI hardware space. Competitors like OpenAI, which recently partnered with Jony Ive's startup io, are also exploring new device formats built around artificial intelligence.
Internally, some Apple engineers are reportedly worried that the company's current AI capabilities may not be strong enough to support a next-generation wearable experience. Today, Apple's Visual Intelligence leans heavily on third-party services such as Google Lens and OpenAI rather than in-house AI engines. The company is clearly working to change that, however, by expanding Apple Intelligence and opening its large language models to developers.
The Bigger Picture: Vision Pro, Foldables, and What’s Next
The team behind the Vision Pro is playing a major role in this glasses project. In parallel, they’re developing:
- A lighter, lower-cost Vision Pro
- A tethered version for Mac users focused on low-latency performance
Apple previously explored tethered AR glasses linked to a Mac, but abandoned that concept earlier this year.
In addition to the glasses, Apple is also reportedly planning to launch its first foldable iPhone by the end of 2026, further highlighting its pivot toward more flexible and immersive device categories.
Final Thoughts
As the lines between wearables, AI, and spatial computing continue to blur, Apple’s smart glasses could be its next major hardware leap—if it can overcome the challenges of AI integration and deliver a seamless user experience. With the clock ticking toward 2026, the tech world will be watching closely to see how Apple’s next frontier unfolds.