"You’ll also soon be able to test multimodal Meta AI on our Ray-Ban Meta smart glasses."
Now this is interesting. I've been thinking for some time now that traditional computer/smartphone interfaces are on the way out for all but a few niche applications.
Instead, everyone will have their own AI assistant, which you'll interact with naturally, the same way you interact with other people. Need something visual? Just ask for the latest MSFT stock chart, for example.
God, I hope not. Maybe it's just me, but this sounds insanely annoying? It kind of reminds me of an objection I've seen to the metaverse: it's actually a less efficient way to do stuff, so it doesn't make sense to imagine it replacing the text-based internet.
Like, I'm not even a fan of smartphones these days, but surely in a world where you could only access information by yelling at your smart glasses, the invention people would be crying out for would be a way to use it silently with your hands, with a screen that made it easy to show things to other people...?
If twitter did get put on the blockchain after Elon bought it, it really would be "X on blockchain"... ba dum tshh