Ever since Meta debuted its smart glasses back in 2021, concerns have been raised over their ability to film people without their knowledge.
Now, two Harvard students have taken the device’s privacy-invading capabilities even further – by building a modified version called ‘I-XRAY’.
The creepy system uses AI and facial recognition software to instantly dox people’s identities.
In an astonishing clip, the students go up to random strangers and quickly identify their name and other personal details – including their home address, work history and even the names of their parents.
It’s reminiscent of the Black Mirror episode ‘White Christmas’, in which a hopeless singleton uses an implant to instantly find online information about strangers.
How does it work?
Video is streamed from the Meta smart glasses straight to Instagram.
A computer programme monitors the stream for people’s faces and can match a face to publicly available images across the internet.
An AI is prompted to infer details such as the person’s name, occupation, and other personal details.
The results are sent to a separate app the students created on their phone.
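Based on that outline, a minimal sketch of how such a pipeline could be wired together is shown below. It is purely illustrative: the students have not released their code, and every function here is a hypothetical stand-in for the livestream capture, the face-search engine, the language model and the phone app.

```python
# Hypothetical, simplified sketch of the pipeline described above.
# None of this is the students' actual code: every function is a stub.

def grab_frame_from_livestream() -> bytes:
    """Placeholder: fetch the latest frame from the glasses' Instagram livestream."""
    return b"raw image bytes"

def detect_faces(frame: bytes) -> list[bytes]:
    """Placeholder: return cropped face images found in the frame."""
    return [frame]

def reverse_image_search(face: bytes) -> list[str]:
    """Placeholder: a face-search engine returns pages where the face appears."""
    return ["https://example.com/some-public-profile"]

def infer_details_with_llm(pages: list[str]) -> dict:
    """Placeholder: a language model is prompted to pull a name, occupation
    and other details out of the text of the matched pages."""
    return {"name": "Jane Doe", "occupation": "example occupation"}

def push_to_phone_app(profile: dict) -> None:
    """Placeholder: send the assembled profile to the companion phone app."""
    print(profile)

def run_once() -> None:
    frame = grab_frame_from_livestream()
    for face in detect_faces(frame):
        pages = reverse_image_search(face)
        profile = infer_details_with_llm(pages)
        push_to_phone_app(profile)

if __name__ == "__main__":
    run_once()
```

The significant part is the chaining: each stage feeds the next automatically, with no manual step between spotting a face and receiving a profile on the phone.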
The tech has been created by AnhPhu Nguyen and Caine Ardayfio, two engineering students at Harvard University in Cambridge, Massachusetts.
‘The purpose of building this tool is not for misuse, and we are not releasing it,’ they say in a document outlining the technology.
‘Our goal is to demonstrate the current capabilities of smart glasses, face search engines, large language models and public databases.
‘[We’re] raising awareness that extracting someone’s home address and other personal details from just their face on the street is possible today.’
On X (Twitter), Nguyen posted a video of the tech with the caption: ‘Are we ready for a world where our data is exposed at a glance?’
As the students show in the clip, they use a combination of existing tech on the market to create AI glasses ‘that reveal anyone’s personal details just from looking at them’.
First, the students take a pair of Meta Ray Bans 2, released last year, ‘as they look almost indistinguishable from regular glasses’.
At the touch of a button on the side of the specs, these glasses can film up to three minutes of live video, which can be streamed to Instagram.
The livestreamed footage is monitored using PimEyes, described as a ‘reverse image search tool’, which can match a face to publicly available images of that face across the internet, the duo explain.
Once a face is found, an AI is prompted to infer details such as the person’s name, occupation and other personal information that may accompany the matched images.
I-XRAY uses FastPeopleSearch, an online tool that only needs someone’s name to find more personal information such as home addresses, phone numbers, age, and relatives from publicly available records and social media profiles.
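FastPeopleSearch offers no documented public API, so the snippet below is only a hypothetical illustration of where a name-based records lookup slots into the chain; the lookup itself is left as a stub rather than a real query.

```python
# Hypothetical continuation of the earlier sketch: once a name has been
# inferred, a people-search site is queried for records tied to that name.
# The lookup is a stub, not FastPeopleSearch's real interface.

def lookup_public_records(full_name: str) -> dict:
    """Placeholder: in the students' description, this step returns home
    addresses, phone numbers, age and relatives matched to the name."""
    return {"address": None, "phone": None, "age": None, "relatives": []}

def build_profile(inferred: dict) -> dict:
    """Merge the LLM-inferred details with the public-records results."""
    records = lookup_public_records(inferred["name"])
    return {**inferred, **records}

print(build_profile({"name": "Jane Doe", "occupation": "example occupation"}))
```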
‘It’s all fed back to an app we wrote on our phone,’ says Nguyen in the video, posted to X.
I-XRAY is unique, the students say, because it operates entirely automatically, allowing the wearer to quickly find information about people they come across.
The video shows the students approaching total strangers on campus, in the street, and at the Harvard subway station in Cambridge.
In one instance, Ardayfio approaches a woman he’d never met before and asks: ‘Are you Betsy? I think I met you through the Cambridge Community Foundation’.
Smiling, the woman – presumably believing she’d met him previously but forgotten about it – confirms she is indeed Betsy and they start a conversation.
Fortunately, it is possible to remove your details from PimEyes and FastPeopleSearch so that I-XRAY or any similar system can’t identify you.
The students provide links in their document with step-by-step instructions on how to do this, so that readers and those they care about can protect themselves.
The students say: ‘Initially started as a side project, I-XRAY quickly highlighted significant privacy concerns.’
The £299 glasses were unveiled at the Meta Connect conference last year.
Responding to a request for comment, a Meta spokesperson stressed to MailOnline that the glasses had to be heavily modified to perform in this way.
The Meta spokesperson also said the AI and facial recognition element of the system ‘would work with photos taken on any camera, phone or recording device’.
However, phones and other recording devices have an element of transparency that Meta’s smart glasses arguably do not.
With Meta’s glasses, the camera is concealed in the frames – and the average person on the street may not realise their function is to record.
When the glasses are filming, a sound plays and a small LED next to one of the lenses lights up – although the wearer would have to explain to bystanders what these signals mean.
Jake Moore, security advisor at ESET, said glasses that can film members of the public are a ‘worryingly dangerous development’.
‘We are seeing technology advancing into areas that are simply not required,’ Moore told MailOnline.
‘Furthermore, when they are adapted to recognise individuals it becomes a scary tool that could easily be abused.’
The Meta spokesperson said: ‘To be clear, Ray-Ban Meta glasses do not have facial recognition technology.
‘From what we can see, these students are simply using publicly-available facial recognition software on a computer that would work with photos taken on any camera, phone or recording device.
‘Unlike most other devices, Ray-Ban Meta glasses make a sound and the capture LED activates to indicate to others that the user is recording.
‘This sound and LED cannot be disabled by the user, and we introduced tamper detection technology to prevent users from covering up the capture LED.’