Apple to Bet on AI Wearable Devices Built Around Visual Intelligence: Report

Written by: Mane Sachin

Apple seems to be taking a quiet but deliberate step forward in artificial intelligence — and it may soon move beyond the smartphone.

While much of the tech industry is locked in a race to build bigger AI models and expand massive data centres, Apple appears to be thinking differently. Instead of competing on scale, the company is reportedly focusing on how AI can be woven more naturally into the devices people use — and possibly wear — every day.

According to reports, Tim Cook has spoken internally about developing a new category of AI-powered wearables built around what Apple calls “Visual Intelligence.” In simple terms, the idea is to create technology that understands what a user is looking at and responds in a helpful, contextual way. Rather than just answering voice commands, the system would interpret images and on-screen content to provide relevant information or suggest actions.

Apple first introduced elements of Visual Intelligence with the iPhone 16 Pro in 2024. The feature allows users to snap a photo or take a screenshot and ask questions about what they see. By connecting with services like OpenAI’s ChatGPT and Google’s reverse image search, the tool can identify objects, summarise text or quickly pull up related details — all without forcing users to jump between multiple apps.

Now, Apple is reportedly building its own visual AI models that could power upcoming wearable devices. In his Power On newsletter, Mark Gurman noted that Visual Intelligence has already become one of the company’s most popular features. Cook has described it as a way for users to learn more from the content on their screens and take action faster across apps.

What stands out is Apple’s broader strategy. Unlike competitors such as Google, OpenAI and Anthropic, Apple is not aggressively trying to build the largest AI systems. Instead, it appears focused on delivering AI through its ecosystem of devices. With more than a billion active devices worldwide, Apple’s real strength lies in its ability to bring new features directly to everyday users at scale.

The company’s absence from the recent AI Impact Summit 2026 in New Delhi — where leaders like Sundar Pichai, Sam Altman and Dario Amodei were present — also reflects this different tone. While others highlighted ambitious AI breakthroughs, Apple appears content to focus on refining its own product experience.

Alongside its AI plans, Apple may also rethink how it unveils products this year. Instead of hosting one large keynote, reports suggest the company could spread announcements across three days, followed by a hands-on media event on March 4, 2026.

Several products are expected to debut in March, including the iPhone 17e, an updated iPad Air with an M4 chip, a refreshed entry-level iPad, and upgraded versions of the MacBook Air and MacBook Pro.

Apple may not be chasing headlines about the biggest AI model. Instead, it seems focused on making artificial intelligence quietly useful — built into devices people already trust and use daily. If its wearable ambitions take shape, they could mark the company’s next steady step in bringing AI closer to everyday life.