Mark Zuckerberg’s Meta got off to a horribly slow start with its Ray-Ban smart glasses. They launched in 2021 with some lackluster features. You could take a picture with your glasses. Or, get this, take a video with your glasses. Exciting stuff!

It’s only now, almost three years on from launch, that the glasses might be hitting their stride. Andrew Bosworth, Meta’s Chief Technology Officer (known as Boz), posted on Instagram that the company is beta testing multimodal AI in the glasses. Essentially, you can speak to the Ray-Bans as if they were a smart assistant. As for what they can do, well… the video shows them describing what they see through the front-facing camera.

“Hey glasses, what am I looking at?” And then they tell you you’re looking at a piece of wall art from TJ Maxx. Again, exciting stuff!

All that aside, it’s easy to see where Meta is trying to go with this. AI features tend to start small and stupid, but the underlying models improve fast. The end goal with these glasses looks like having Wikipedia and Google on your face. “Hey glasses, what am I looking at?” Then they give you a condensed art-history lesson on the Picasso you’re staring at in a museum.

There’s a lot of potential here, but right now they are still just glasses with a camera.