Following Meta, which first integrated AI into its Ray-Ban smart glasses five months ago, Google last week approached Samsung to develop AI-enabled smart glasses. For Google, the stakes are high. After OpenAI’s rapid emergence as a leading conversational AI company, it needs to find a way to embed its AI in smart devices that can help it advance its conversational agent.
Meta’s boldness in developing glasses, relative to Google’s efforts, has been particularly striking of late. While the first version of the Meta Ray-Ban was a failure, the second version, launched in 2023, exceeded the company’s expectations. The reception of these glasses provided further evidence of the public’s appetite for even a relatively simple computing device worn on the face. Creators producing content that depicts their daily lives could be a powerful tool in the industry’s efforts to make people feel comfortable with smart glasses. The glasses could also bring creators closer to their audience, who could literally see what they see, though not without risk.
From science fiction
Two Harvard students have found a way to turn Meta’s Ray-Bans into something straight out of a sci-fi movie. Using a custom AI platform, they can identify virtually anyone just by looking at them and reveal personal details such as their address, phone number and family members.
Some content creators are already explaining how these glasses can be used and teaching people how to protect themselves before it’s too late. For now, it is unclear whether a smart glasses boom is truly underway, as the craze has already seen its ups and downs. But we can assume that more smart glasses will appear, since other companies will most likely set their sights on the market.