Meta’s Ray-Ban smart glasses now watch what you eat

Meta adds AI nutrition tracking to its Ray-Ban smart glasses

Meta's updated Ray-Ban smart glasses can now identify and log meals using built-in AI. Image Credit: Meta

If you’ve ever tried logging your meals in a fitness app, you already know how it goes. You start strong, then forget a meal here and there, until eventually you give up completely.

Meta is trying to fix that by adding a new feature to its Ray-Ban and Oakley smart glasses that can log and track your food using AI. It might be one of the easiest ways to track your meals yet. Here is how it works, and why it could change how people keep up with their diet.

How the AI-powered food logging works

The new feature lets users log what they’re eating through Meta AI.

Instead of opening an app and typing everything in, you can ask the glasses to record your meal and build a running log over time. From there, the system can offer suggestions based on your eating habits and goals. For example, you could ask what to eat to improve your energy levels, and it will respond using the data it has collected.

To be clear, Meta’s smart glasses do not track heart rate, sleep, steps, or calories burned, so the feature is not a full health tracker like a smartwatch. Right now, it is more of a food logging tool powered by AI.

The feature is not fully automated, as users will still need to prompt the AI, confirm what they are eating, and set nutrition goals. So while the glasses make the process easier, they don’t completely replace manual tracking yet.

Meta has said it is working toward making this more seamless in the future.

The update will roll out to existing smart glasses

The nutrition tracking feature is expected to be available to users aged 18 and older in the U.S. It will work across current Ray-Ban Meta and Oakley Meta smart glasses, with some newer models receiving the update later.

At the same time, Meta is also preparing new versions of its Ray-Ban smart glasses designed specifically for prescription wearers.

Meta is continuing to expand what its glasses can do

Meta has been steadily adding features to its smart glasses, including AI capabilities that respond to voice commands and interact with what users see. That aligns with the company’s push into AI wearables. According to CEO Mark Zuckerberg, billions of people already wear glasses. The idea is to make them smarter than just a camera you wear: glasses that understand your environment, react in real time, and even offer contextual suggestions. Food just happens to be one of the easiest entry points.

One catch is that Meta currently has no access to most users’ health data, since the glasses don’t integrate deeply with Apple Health or Samsung Health.

But it may not need to, at least not yet. If the company eventually builds its own ecosystem (possibly through things like its Neural Band, a wrist device that reads muscle signals to control interfaces), it could connect what you eat, how your body responds, and what you should do next, all without leaving its own platform.

Whether people stick with the idea long-term is another question. But for anyone who has tried and failed to keep up with meal tracking, this approach might feel a lot easier.

Source: Bloomberg