- Meta is rolling out new features to its Ray-Ban smart glasses, including live AI, live translations, and Shazam integration.
- The live AI feature lets users converse naturally with Meta's AI assistant while it continuously views their surroundings, offering help such as recipe suggestions based on the groceries they're looking at.
- The live translation feature translates speech in real time between English and Spanish, French, or Italian. Users can choose to hear the translations through the glasses or view transcripts on their phone.
- Shazam support allows users to identify songs they hear simply by prompting Meta's AI.
- The new features are limited to members of Meta's Early Access Program; Shazam support, by contrast, is available to all users in the US and Canada.
- These AI-powered features position smart glasses as a key platform for AI assistants, a push mirrored by Google's recent announcements around its Android XR OS and Gemini AI assistant; Big Tech is increasingly treating the assistant as the category's killer app.
- Meta CTO Andrew Bosworth believes smart glasses may be the best form factor for an AI-native device, since AI can define the hardware from the beginning.
- To access the new features, users need their Ray-Ban Meta smart glasses on the v11 software update and the Meta View app on version v196.
- Shipping these AI capabilities to consumer smart glasses is a notable step in integrating AI assistants into everyday wearable devices.
Source: "Meta adds live AI, live translations, and Shazam to its smart glasses," The Verge (www.theverge.com)