Meta announces new features for its Ray-Ban smart glasses. The gadgets can now talk about the space around the user.
Ray-Ban smart glasses with Meta technology can now talk to the wearer about the space around them. This was one of several new features the company announced.
Available only to members of Meta’s early access program, the live AI feature lets Meta AI see what is around the user and discuss it in real time.
Another new feature is live translation: if the wearer is speaking with someone in a foreign language, the glasses automatically translate the speech, playing the translation through the speakers or showing a transcription on the connected phone.
The glasses have also gained Shazam integration, though this update is available only to users in the US and Canada. Meta plans to announce more features in 2025.