Ray-Ban Meta Smart Glasses Upgraded with AI Vision

The Ray-Ban Meta Smart Glasses, launched last fall, have recently received a significant upgrade with the addition of multimodal AI. This new feature allows the glasses to process various types of information, such as photos, audio, and text, making them a more versatile and useful wearable device. Despite some limitations and quirks, the Meta glasses offer a glimpse into the future of AI-powered gadgets and their potential to seamlessly integrate into our daily lives. Note that existing owners of the Ray-Ban Meta Smart Glasses only need to update their glasses in the Meta View app to access the new features.

Below is a video posted on X by Brian Stout (@stoutiam), recorded during a run in the Bodega Bay Dunes:

The Power of Multimodal AI

Multimodal AI enables the Meta glasses to understand and respond to a wide range of user queries. By simply saying, "Hey Meta, look and..." followed by a specific command, users can ask the glasses to identify plants, read signs in different languages, write Instagram captions, or provide information about landmarks and monuments. The glasses capture a picture, process the information in the cloud, and deliver the answer through the built-in earphones. While the possibilities are not endless, the AI's capabilities are impressive and constantly evolving. Here are some examples of what you can do with the multimodal AI function:

  • Ask about what you see yourself: "Hey Meta, look and describe what I'm seeing."
  • Understand text: "Hey Meta, look and translate this text into English."
  • Get gardening tips: "Hey Meta, look and tell me how much water these flowers need."
  • Express yourself: "Hey Meta, look and write a funny Instagram caption about this dog."

Last month, Meta released the next big thing in AI models, Llama 3. It’s the latest AI model to be offered by Meta free of charge and with a relatively open (though not open-source) license that lets developers deploy it in most commercial apps and services. Meta's new model scores significantly better than its predecessor in benchmarks without an increase in model size; the secret is a much larger volume of training data. Check out my earlier post about the Llama 3 model.

Strengths and Weaknesses

Like most AI systems, Meta's multimodal AI has its strengths and weaknesses. It can be incredibly accurate in identifying certain objects, such as specific car models or plant species. However, it can also be confidently wrong at times, mistaking one object for another or providing irrelevant information. The AI's performance is often dependent on the quality of the image captured and the user's ability to frame the question in a way that the AI can understand.

One of the key advantages of the Ray-Ban Meta Smart Glasses is their familiar form factor. As a pair of glasses with built-in headphones, they feel natural and comfortable to wear. Users are already accustomed to talking through earbuds, making it less awkward to interact with the AI assistant. The glasses' design allows for a seamless integration of AI technology into a well-known and widely-used accessory.

Ray-Ban is also introducing limited-edition Ray-Ban Meta smart glasses in an exclusive Scuderia Ferrari colorway. This Miami 2024 special edition fuses iconic design, racing heritage, and cutting-edge technology.

Using the Meta glasses' AI requires a bit of a learning curve. Users need to adapt to the AI's language and understand its limitations. For example, the lack of a zoom feature can hinder the AI's ability to identify distant objects accurately. However, users can often find workarounds, such as taking a picture of a picture, to help the AI along. As users become more familiar with the technology, they can better leverage its capabilities to enhance their daily experiences.

Comparing Ray-Ban Meta Smart Glasses with the Rabbit R1 & Humane AI Pin

Ray-Ban Meta Smart Glasses: Priced at $299 ($329 with polarized lenses), they offer a familiar glasses form factor with added AI capabilities. By simply saying "Hey Meta," users can access the assistant and ask questions about their surroundings. The glasses excel at visual recognition, consistently identifying dog and cat breeds and providing contextual answers. However, they rely on a smartphone connection and lack live audio translation.

Humane AI Pin: Priced at $699, it is a cell-connected device that clips to your shirt. Users can ask questions by holding down the touchpad, and the built-in "laser ink" display projects the answer on their palm. The pin offers live translation in over a dozen languages, enabling users to have conversations with people speaking different languages. However, the display can be difficult to see outdoors, and the device's response times can be slow.

Rabbit R1: Priced at $199, it is a pocket-sized device that connects directly to cellular or Wi-Fi networks. Users press and hold a button to ask the assistant a question and navigate through on-screen menus using a scroll wheel. While the R1 promises live translation, it struggled with basic phrases during testing and had slow response times. The device also had issues with basic tasks like playing music and setting timers.

While all three AI gadgets show promise, the Ray-Ban Meta Smart Glasses emerge as the most practical and user-friendly option. Despite lacking live audio translation and requiring a smartphone connection, the glasses offer excellent visual recognition, contextual answers, and a familiar form factor at a reasonable price point. The Humane AI Pin and Rabbit R1, while ambitious, suffer from slow response times and inconsistent performance. As these companies release software updates to address issues, they may become more viable alternatives in the future.

Here is a comparison video by WSJ’s Joanna Stern, who put these devices through a series of tests, including translation and visual search:

Ray-Ban Meta Smart Glasses Reviews

Here are some of the reviews of Ray-Ban Meta Smart Glasses:

  • The Verge: The Ray-Ban Meta Smart Glasses have multimodal AI now - "It can be handy, confidently wrong, and just plain finicky — but smart glasses are a much more comfortable form factor for this tech. To me, it’s the mix of a familiar form factor and decent execution that makes the AI workable on these glasses. Because it’s paired to your phone, there’s very little wait time for answers. It’s headphones, so you feel less silly talking to them because you’re already used to talking through earbuds. In general, I’ve found the AI to be the most helpful at identifying things when we’re out and about. It’s a natural extension of what I’d do anyway with my phone."

  • Tom's Guide: Ray-Ban Meta smart glasses just got a ton of upgrades, including new AI features and video calling - "Having experimented with the multimodal AI integration through limited beta access in recent months, I've found that it mostly succeeds in identification. For example, Meta AI could name some New York City landmarks just by taking a picture through the glasses. But it's not right every time, and the glasses are prone to the same kind of occasional connectivity headaches that reviewers reported for the Humane AI Pin. Good looks are a major perk of the Ray-Ban Meta Smart Glasses. They mostly look like an average pair of designer glasses."

  • ZDNet: Meta's Ray-Ban smart glasses just got another useful feature for free (and a new style) - "The improvements to the Ray-Ban Meta glasses and sunglasses include better integration with Apple Music, support for multimodal AI, and compatibility with WhatsApp and Messenger, allowing users to stream what they're seeing from the sunglasses themselves. Meta is focusing the wearable on features wearers of smart glasses will already be used to, while adding new capabilities such as live video integrated into common messaging apps. The ability to share your view on WhatsApp and Messenger is completely hands-free, letting you show exactly what you're seeing in real time."

  • Popsugar: Even Non-Techy Folks Will Love the Ray-Ban Meta Smart Glasses - "Given that plain-old designer sunglasses can cost upwards of $300, I'd definitely say that what you're getting with these glasses is worth it — think of them as headphones, a camera, a smart assistant, and shades all in one. Particularly if you're a fan of Ray-Bans, then there's no reason not to opt for all of these cool features. What's more, even non-techy folks will love these. They've easily become part of my daily life; they're really just there to help enhance your life, whether by capturing what's around you easier or providing AI answers to your questions."

Beyond the AI

While the multimodal AI is a significant feature of the Meta glasses, it is not the only one. The glasses also function as a capable live streaming device, a POV camera, and an excellent pair of open-ear headphones. This versatility ensures that the glasses remain useful and enjoyable even if users choose not to engage with the AI assistant regularly.

The Ray-Ban Meta Smart Glasses, with their newly added multimodal AI, offer a compelling glimpse into the future of wearable AI technology. Despite some limitations and quirks, the glasses demonstrate the potential for AI to enhance our daily lives in a comfortable and familiar form factor. As users adapt to the technology and the AI continues to evolve, we can expect to see more seamless integration of AI-powered features into our everyday accessories. While pulling out a smartphone may still be faster in some cases, the Meta glasses showcase the potential for wearable AI to become an increasingly handy and convenient tool for gathering information on the go.
