The return of smart glasses: Can we see past AR's flaws this time?

A woman wearing smart glasses looking up
(Image credit: Getty Images)

We already have computers in our pockets and on our wrists, and in the not-too-distant future we’ll be strapping them to our faces too. All the major Silicon Valley players are hard at work developing augmented reality (AR) glasses that blend the physical and digital worlds together, and they are expected to hit the market over the next few years.

We have, of course, been here before with devices such as Google Glass. The last few months have seen a slew of new options teased to the market, including Ray-Ban’s collaboration with Meta and Lenovo’s T1 smart glasses, with more to follow. What makes the device manufacturers believe that customers will find their AR kit any less weird and disturbing than the failed devices that came before?

A warning from history

“The idea of being constantly watched is not something that everyone would appreciate,” says Thomas Ffiske, a metaverse and immersive tech analyst who writes the AR industry newsletter Immersive Wire. He’s not speaking hypothetically. Back in 2013 when Google launched its Glass wearable, it generated countless headlines about the potential privacy worries, and there were even reports of restaurants and bars banning “glassholes”, as early adopters were labelled.

Will the same happen again with AR? “I'm not sure much has changed in the past few years that would get people to accept it more,” says Professor Jason Hong, who works in the Human-Computer Interaction Institute at Carnegie Mellon University. “It's putting people at risk for no clear benefit to themselves or no clear benefit to society. Those were the concerns then, and it's not clear that anything has changed.”

What’s going to be pivotal for Meta, Apple and the others, then, is demonstrating the utility of the new technology – much as today we accept the benefits of Alexa and CCTV over the perceived costs. “When we look at the features of Google Glass there were not a lot of benefits, it was basically just a smartphone in front of your face,” says Professor Philipp Rauschnabel, who researches AR at Bundeswehr University Munich. “There were not a lot of useful apps.”

Rauschnabel believes that this time, it could be different – at least for Apple. “Apple has invested millions of dollars to have AR on their smartphones,” he says. “AR on a smartphone doesn't really make sense to a user, because you have to hold it. However, Apple has learned about AR and it has learned about useful use cases. So I expect when Apple enters the market with glasses, they can build on a lot of knowledge about how AR works and how it can be used. They will enter the market with a strong ecosystem of useful apps that really provide benefits to users.”

Building trust through smart glasses design

Making AR we can trust isn’t only about having a range of sufficiently useful apps. The real path to widespread acceptance begins with the fundamental design of the hardware and software. “I think they need to be very clear on what data is being used and how,” says Ffiske, who foresees a particular sensitivity around data usage on live map layers that glasses will overlay on the real world.

“Companies are making live maps of the world at the moment, where there's more context and depth. I think Meta might need to be clear when people's data is going to be used for that. There may well be targeted ads on this layer, but then they have to make clear it’s only used for that specific purpose.”

Another trust-building strategy could be for device makers to carefully consider what they actually let users do with their devices. “I would say that the main thing is, unless there's a clear motivation for it, not to actually record things,” says Hong. “That would probably greatly assuage people's concerns.”

We also have a real-world example of another new technology gaining broad acceptance through careful, judicious design: voice assistants. “A lot of the things actually have always-on microphones, but they're only looking for the keyword first,” says Hong. “And then it actually responds in a very loud way. So, everybody knows you're talking to Siri and you know that some of the audio is being sent off. I think that's a pretty good example of everybody being on the same page with respect to what's being recorded and what's not.”

Perhaps the hardest privacy question for designers is where data is processed. In recent years, we’ve seen both iPhone and Android emphasise local processing, where images are analysed on device using the phone’s own hardware, in response to heightened concerns over privacy in the cloud. Local processing will be critical to AR glasses, which will need a camera or LiDAR scanner constantly running to overlay data onto our world correctly.

That will require a lot of processing power and may be one of the reasons why we’re still waiting to see devices from the major manufacturers. “I'm not sure if it's possible using today's technology, because the microprocessors and the amount of battery you can have on these glasses would probably not be sufficient,” says Hong. “Even if you could, it'd be so warm and so uncomfortable.”


Hong thinks some sort of remote processing is inevitable, even though there are risks. “That's going to cause a lot of problems, because it's unclear what data is being captured, and where it is going,” he says. “What happens if you're in a bathroom or a gym? If it's on in these sensitive locations, that can be really embarrassing.”

Rauschnabel agrees that any sort of local processing will be hard if the technology is to work effectively. “The trend will not be local processing,” he says. “You can make the glasses smaller, so that the glasses are basically just the tracking technology [for example the camera] and the display. Everything else is sent to an external server over a 5G network… because the idea is that you can have the devices smaller and lighter and you can save energy.”

Guaranteeing privacy in next-gen smart glasses

There is something unique about AR glasses – and that is whose privacy is most affected. “The difference here is you're particularly threatening other people's privacy,” says Rauschnabel. “Apple focuses on the user's privacy, they've never talked about other people's privacy because it never mattered.”

This is arguably a lot harder when we have a camera or LiDAR sensor constantly rolling, pointing at everything we so much as glance at. “It's not so challenging to protect one's own privacy, it just means you have to make sure that other people can't access your sensors and cameras while you're using it. But when it comes to AR, I don't know how they will treat other people's privacy.”

He speculates that, when they do arrive, Apple’s AR glasses might have specific restrictions on facial recognition or only allow face tracking for specific individuals. There are some real-world examples of how these privacy concerns could be mitigated. For example, in 2004, South Korea passed a law requiring a sound of at least 65 decibels to be made by a device every time a photo is taken. And more recently, Meta has built similar functionality into the Ray-Ban Stories, with an LED lighting up whenever the camera is in use.

Professor Hong speculates that taking a photo with AR glasses could require some sort of physical gesture, such as the wave of a hand, so that the people around you know what is happening. But ultimately, even if these sorts of protections are added, either in software or by law, the pessimistic view is that the battle for privacy is unwinnable. The devices are coming, and the unfortunate thing about technology is that it just keeps getting better. One day soon, it’s possible that almost all of us will be so-called glassholes.

“If a lot of people are using them, and the devices are so small that you can't distinguish smart glasses from regular glasses – and this will happen – then there is no way to control it,” says Rauschnabel.