Students use Meta’s Ray-Ban smart glasses to get personal info with a glance

AnhPhu Nguyen (left) and Caine Ardayfio (right) are two Harvard students who have added software to Meta Ray-Ban smart glasses that can do facial recognition and look up data about anyone in camera range. (Jonathan Wiggs/Globe Staff)

(by Bianca Beltrán, NBC News10 Boston) – Two Harvard University students are exposing how much personal information is publicly available online by combining smart glasses with artificial intelligence to collect that data just by looking at someone.

AnhPhu Nguyen is a junior at Harvard studying human augmentation, and Caine Ardayfio is a junior studying physics. Using a pair of Ray-Ban Meta Smart Glasses [which have cameras embedded in the frame] available to them through Harvard’s Augmented Reality Club, they built software that lets them identify people on the spot through those lenses, using existing search engines and facial recognition technology. [They named their project I-XRAY. A person wearing the glasses can tap the frame to take someone’s photo, which is transmitted to the wearer’s account on Instagram, owned by Meta.]

“You get a video feed from the glasses, and we have a bot that takes those video data and tries to find a face in it,” explained Ardayfio. “If it finds a face, then it will upload it to this tool called ‘PimEyes,’ and it will essentially, it’s called ‘reverse image search,’ where you take an image and you find other similar images online. Once you have URLs of those other images, we use an AI basically to try and figure out a person’s name. Once we find their name, we use databases like voter registration databases to find an address, phone number, that type of thing.”

Using a large language model [LLM], a type of artificial intelligence algorithm, the app they created compiles the data within a minute.
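The students have not released their code, so the following is only a hypothetical sketch of the pipeline Ardayfio describes: find a face in the video feed, reverse-image-search it, ask an LLM to infer a name from the matching pages, then query public records. Every function name below (detect_face, reverse_image_search, extract_name_with_llm, lookup_public_records) is a placeholder invented for illustration; none corresponds to the students’ actual software or to any real face-search or people-search API.

```python
# Hypothetical outline of the four-stage pipeline described above.
# NOT the students' code; every helper is a deliberately empty placeholder.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Dossier:
    """Publicly available details the pipeline tries to assemble."""
    name: Optional[str] = None
    address: Optional[str] = None
    phone: Optional[str] = None


def detect_face(frame: bytes) -> Optional[bytes]:
    """Placeholder: return a cropped face image if one appears in the video frame."""
    raise NotImplementedError("illustrative stub only")


def reverse_image_search(face: bytes) -> List[str]:
    """Placeholder: return URLs of visually similar photos (the 'PimEyes' step)."""
    raise NotImplementedError("illustrative stub only")


def extract_name_with_llm(urls: List[str]) -> Optional[str]:
    """Placeholder: ask an LLM to infer the person's name from the pages at those URLs."""
    raise NotImplementedError("illustrative stub only")


def lookup_public_records(name: str) -> Dossier:
    """Placeholder: query public databases (e.g. voter rolls) for address and phone."""
    raise NotImplementedError("illustrative stub only")


def process_frame(frame: bytes) -> Optional[Dossier]:
    """Run one video frame through the stages Ardayfio describes, in order."""
    face = detect_face(frame)
    if face is None:
        return None
    urls = reverse_image_search(face)
    name = extract_name_with_llm(urls)
    if name is None:
        return None
    dossier = lookup_public_records(name)
    dossier.name = name
    return dossier
```

The point of the sketch is simply that each stage already exists as an off-the-shelf service; the students’ contribution was wiring them together behind a pair of camera glasses.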

Meta Ray-Ban Smart Glasses have cameras embedded in the frame.

The students posted a video on social media demonstrating the technology by showing other Harvard students and strangers the personal information they learned through the process. The reveals were met with amazement and shock.

“We were surprised just how much data you could extract now that [significant progress has been made with] large language models, [which] are the piece that unlocks the rest of the pipeline,” said Nguyen. “Most people didn’t even know that these tools could exist where you could just find someone’s home address from their name, so people were rightfully scared. That’s why when we started building this, we opted to also make that guide to solve the problem immediately.”

They compiled a list of instructions on how to opt out of reverse face search engines and people search engines, and how to protect yourself from data leaks.

In a report in early October on Nguyen’s and Ardayfio’s findings, the Boston Globe noted that federal law does not bar the use of facial recognition systems, with legal standards varying across the country.

Some states like Massachusetts restrict government use of the technology, with Boston and Cambridge banning government use completely, the Globe wrote.

But only a few jurisdictions, like Illinois, have laws that forbid individuals or businesses from using facial recognition systems without the subject’s permission. …

Happily, the two students have no intention of sharing their software with the world. They developed it as a demonstration of how easy it could be to obtain sensitive information. They don’t plan to release the product or their code because they recognize how it could be misused.

“Basically, why we did this was to raise awareness of how much data we have just publicly available,” said Nguyen. “All of the data that we collect is publicly accessible by anyone. But you can instantly delete yourself and make this tool ineffective.”

These students aren’t the first to experiment with this kind of technology.

Last year, the New York Times reported that tech giants Google and Facebook had built tools years earlier that could put a name to any face, but chose to hold the technology back, deciding it was too dangerous to make widely available. Other companies, however, continue to develop similar tools.

Microsoft and Amazon faced a massive backlash when they sought to sell facial recognition systems to police departments, forcing both companies to back away from the idea.

But there’s nothing to stop a cybercriminal from developing their own version of I-XRAY.

“The bad actors are already aware they can do this,” said Ardayfio, a physics major.

And the system could be tweaked to capture more than one face at a time. Because all the heavy computing is done by powerful servers in the cloud, Ardayfio and Nguyen said, it would be easy to design a version that could photograph a crowd of people and then look up the data for each of them.

But Nguyen and Ardayfio say it’s not completely hopeless. You can’t prevent people from using AI to recognize your face, but you can make it harder for them to find your sensitive data online.

The inventors of I-XRAY recommend that people reach out to the major online data brokers and facial image search companies, and ask to be deleted from their databases. PimEyes and FastPeopleSearch say they’ll remove people on request, and there are a number of companies, including Cloaked.com and DeleteMe, that sell services that remove your information from multiple data brokers.

But there’s no guarantee that all database operators will be so accommodating. And even though the workings of I-XRAY are secret, the Harvard pair say it’s just a matter of time before someone develops an open-source version and [releases it].

Forbes notes: Privacy in public is probably dead, but the students have some suggestions to restore at least some of it:

  1. Remove yourself from facial recognition databases like PimEyes and FaceCheck ID
  2. Remove yourself from people search engines like FastPeopleSearch, CheckThem and others
  3. Add two-factor authentication to any financial or other highly private accounts

Compiled from articles published by NBC News10 Boston on October 7, 2024, and the Boston Globe on October 4, 2024. Reprinted here for educational purposes only. May not be reproduced on other websites without permission.

Questions

1. Explain each of the following terms:
-LLM
-smart glasses
-artificial intelligence
-human augmentation
-facial recognition technology
-bot
-algorithm
-open source

2. How did Harvard students AnhPhu Nguyen and Caine Ardayfio develop I-XRAY?

3. How does I-XRAY work?

4. What was the students’ reaction to the results of their project?

5. a) For what reason did AnhPhu and Caine develop I-XRAY?
b) Why aren’t they releasing / selling this technology?

6. What solution do they offer for people who are concerned with this potential invasion of privacy?

7. Even though the students will not share their I-XRAY software, the Harvard pair say it’s just a matter of time before someone develops an open-source version and releases it.
a) Will you take the steps suggested by Nguyen and Ardayfio to protect your identity from this type of potentially widespread technology? Explain your answer.
b) Ask a parent the same question.

Background

How to remove your information. By builders: AnhPhu Nguyen & Caine Ardayfio

Motivations For Building I-XRAY
Initially started as a side project, I-XRAY quickly highlighted significant privacy concerns. The purpose of building this tool is not for misuse, and we are not releasing it. Our goal is to demonstrate the current capabilities of smart glasses, face search engines, LLMs, and public databases, raising awareness that extracting someone’s home address and other personal details from just their face on the street is possible today. (from the students’ Google Doc)
