Apple Bets on AI Glasses Without a Screen
Bloomberg reported on April 12 that Apple is actively testing four distinct frame designs for its smart glasses project, internally codenamed N50. The designs include a large rectangular frame reminiscent of classic Wayfarers, a slimmer rectangular style similar to what CEO Tim Cook wears, a larger oval or circular frame, and a smaller oval variant.
All four prototypes use premium acetate rather than standard plastic, a deliberate material choice that signals Apple’s positioning in the luxury eyewear segment. Color options under testing include black, ocean blue, and light brown. The frames house two cameras with distinctive vertically oriented oval lenses paired with LED indicator lights, a design Apple refers to internally as the “icon.”
What the glasses will not have is a display. Unlike the $3,499 Vision Pro, Apple’s smart glasses rely entirely on cameras, microphones, and speakers embedded in the frame. Users interact through Siri voice commands to access contextual awareness, live translation, photo and video capture, phone calls, music, and navigation. The product is designed to look and feel like ordinary premium eyeglasses that happen to be intelligent.
The Giannandrea Exit and What It Means
The glasses announcement coincides with the official departure of John Giannandrea, Apple’s former senior vice president of Machine Learning and AI Strategy. Giannandrea joined Apple from Google in 2018 as one of the most respected names in artificial intelligence. His departure on April 15, 2026, timed to a stock vesting date, closes an eight-year chapter.
The exit was not sudden. Apple reduced Giannandrea’s role in March 2025 after the disappointing launch of Apple Intelligence and repeated delays to the Siri overhaul. He was stripped of oversight of Siri, robotics, and other AI teams, then formally announced his retirement in December 2025. The intervening months were what the industry calls “resting and vesting”: he remained on payroll until his final stock awards matured.
Apple hired Amar Subramanya as VP of AI, reporting directly to Craig Federighi. Subramanya spent 16 years at Google and most recently led engineering for Gemini. He now oversees Apple Foundation Models, ML research, and AI Safety and Evaluation. The leadership transition from a research-oriented AI executive to an engineering-focused one from Google’s Gemini team signals a shift in Apple’s AI priorities from exploration to execution.
Competing in a Market Meta Already Owns
Apple enters a smart glasses market that Meta has dominated. Meta and EssilorLuxottica sold over 7 million pairs of Ray-Ban Meta glasses in 2025, tripling the combined total from 2023 and 2024. The broader smart glasses market is projected to surge from $1.2 billion to $5.6 billion in 2026, with Meta targeting 10 million annual units.
Apple’s differentiation strategy rests on two pillars. First, premium design: acetate frames in multiple styles position the product as fashion-forward rather than tech-forward. Second, privacy: Apple plans to process AI interactions on-device rather than routing data to cloud servers. In a market where Meta has faced scrutiny over how much data Ray-Ban glasses share with contractors and AI systems, Apple’s on-device approach could be a decisive advantage for privacy-conscious consumers.
Early pricing speculation places the entry point around $499, putting Apple in direct competition with Meta’s Ray-Ban lineup while carrying Apple’s ecosystem premium. Production is slated to begin in December 2026, with retail availability in spring or summer 2027.
The Simpler Product Apple Needed
The N50 represents a strategic pivot from Apple’s original AR ambitions. The Vision Pro, Apple’s $3,499 mixed reality headset, demonstrated the company’s technical capabilities but struggled to find a mass market at that price point. Smart glasses without a display are a fundamentally different product: lower cost, lower technical complexity, and integrated into a form factor that hundreds of millions of people already wear daily.
The two-camera system serves distinct purposes. One camera captures high-resolution photos and video. The other handles computer vision tasks similar to the Vision Pro’s spatial awareness capabilities, enabling the glasses to understand what the user is looking at and provide contextual AI responses through Siri.
This “AI without a screen” approach prioritizes ambient intelligence over immersive computing. Rather than replacing the iPhone, the glasses extend it. Rather than demanding the user enter a virtual environment, they enhance the real one. For Apple, which has struggled to articulate why consumers need a $3,499 headset, a $499 pair of intelligent glasses that looks normal may be the product that actually brings AI wearables to scale.
What the Leadership Change Signals
The parallel stories of N50 development and Giannandrea’s departure are connected. Apple’s AI strategy under Giannandrea prioritized research and ML infrastructure over shipped products, resulting in an Apple Intelligence launch widely seen as underwhelming compared to competitors. The Siri overhaul that was supposed to showcase Apple’s LLM capabilities was repeatedly delayed.
Under Subramanya and Federighi, the emphasis is shifting to execution. The N50 smart glasses represent the first major Apple hardware product designed around AI from the ground up, where AI is not a feature added to existing hardware but the core reason the product exists. If Apple can deliver a competent Siri experience through these glasses by 2027, it validates the leadership transition. If Siri continues to disappoint, even premium acetate frames will not save the product from comparison with Meta’s already-mature AI glasses ecosystem.
Frequently Asked Questions
What features will Apple’s AI smart glasses have without a display?
Apple’s N50 glasses use cameras, microphones, and speakers to deliver AI-powered features through voice interaction with Siri. Users can capture photos and video, make phone calls, listen to music, get navigation directions, and use live translation, all without a visual display. The glasses rely on on-device AI processing for privacy, with two cameras handling photo capture and computer vision tasks.
Why did Apple’s AI chief John Giannandrea leave the company?
Giannandrea’s role was reduced in March 2025 after Apple Intelligence launched to disappointing reviews and the Siri overhaul was repeatedly delayed. He was stripped of oversight of Siri, robotics, and other AI teams. Apple formally announced his retirement in December 2025, and he officially departed on April 15, 2026, timed to a stock vesting date. He was replaced by Amar Subramanya, a former Google Gemini engineering lead.
How do Apple’s smart glasses compare to Meta’s Ray-Ban smart glasses?
Meta sold over 7 million Ray-Ban smart glasses in 2025 and has a multi-year head start. Apple differentiates through premium acetate materials in four frame styles, on-device AI processing for privacy, and deeper iPhone integration. Early pricing estimates place Apple at around $499 versus Meta’s comparable pricing, but Apple’s 2027 launch timeline means Meta will have an additional year of market development and iteration before facing direct competition.
Sources & Further Reading
- Apple Testing Four Smart Glasses Styles Made of High-End Materials — MacRumors
- Apple Exploring Four Styles for Smart Glasses Using Premium Materials — 9to5Mac
- Former AI Boss Giannandrea Officially Leaving Apple — 9to5Mac
- John Giannandrea to Retire from Apple — Apple Newsroom
- Apple Eyes 2027 for AI Smart Glasses Built Around Context — AppleInsider
- Apple’s AI Glasses Coming to Take on Meta Ray-Bans — Knowledge Hub Media