Leaf&Core

Vending Machine on College Campus Equipped with Facial Recognition

Reading Time: 5 minutes.
Screenshot via CD Projekt Red’s Cyberpunk 2077: a person leaning against a vending machine with a face on its screen.

Cyberpunk 2077 is full of little quirks. Some of the secrets you’ll uncover in the dystopian Night City are utterly soul crushing. Others are more fun. One such diversion is a vending machine named Brendan. Brendan is a “Spontaneous Craving Satisfaction Machine,” or SCSM… a vending machine. He’s clever, though. He asks for help with an obstructed view, talks to people walking by, and even becomes one woman’s best friend. Brendan is odd. He knows things about the player he absolutely shouldn’t. He says he just gleaned data from the net: facial recognition, public information, discussions among people in the area. Using that, he pieced together who you were, along with other personal details about your character. He had apparently done the same for a number of other people. While some found it endearing, others found it deeply disturbing. After all, this was a vending machine belonging to a corporation. It doesn’t matter how friendly he is; he’s collecting massive amounts of data on people and leveraging it. It would be so creepy if anyone were doing that in our world, right?

Well, we don’t have to wait until 2077 to experience the magic of a dystopia! A student at the University of Waterloo noticed that an M&M-branded vending machine on campus had begun showing a strange error on its screen. It claimed there was a crash in the “FacialRecognitionApp.” A vending machine, installed on a college campus, using facial recognition on the people passing by, without consent?

Hello, 2077! Point me to the nearest Ripperdoc, please!

An Error Reveals More Than Company Wanted

A student going by the Reddit username SquidKid47* uploaded a photo they claim they took on their college campus, the University of Waterloo in Canada. Another student at the school, River Stanley, launched an investigation for the Waterloo student newspaper, mathNews. Stanley found conflicting responses from the companies involved: Invenda makes the machines, Adaria acts as a sort of “last mile” distributor that also keeps them stocked, and Mars, the candy company behind M&Ms, sells the products within. It’s a Mars vending machine, and a crashed application showed it was using facial recognition.

“The machines do not take or store any photos or images, and an individual person cannot be identified using the technology in the machines. The technology acts as a motion sensor that detects faces, so the machine knows when to activate the purchasing interface—never taking or storing images of customers.”

– Adaria statement

Numerous news organizations asked Mars for a comment, but did not get one. Invenda’s marketing materials claim that its devices can report back demographic data, including estimated age and gender, and that they do so within privacy laws. The materials specifically mention the GDPR, which would require permission from a user before gathering any identifying information. While estimated age and gender may not identify a person directly, they could be used in conjunction with other information to suss out an identity. The machines take cash as well as multiple mobile payment options, tap to pay, credit cards, and even the University of Waterloo’s student ID cards. Any of those payment methods could reveal a person’s identity and, combined with the facial recognition, easily tie a face to that identity.

Invenda says it doesn’t do that. It obviously has the capability, but Invenda says it lives up to GDPR requirements and protects privacy. According to the company, the facial recognition runs only locally, and the data is never exported. Mars has stayed silent. Adaria claims it can’t even collect this information, though it is careful to say only that the machine doesn’t store “images of customers,” conspicuously not mentioning the demographic data that could come from that software. They’re being careful with their answers, and Mars isn’t even willing to speak up.
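If the system really works the way Invenda describes — detection happens locally, frames are discarded, and only anonymous aggregates survive — the data flow might look something like this minimal Python sketch. This is a hypothetical illustration, not Invenda’s actual software; every class and field name here is mine.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Detection:
    """Output of a hypothetical on-device face model: estimates only, no image."""
    age_bucket: str    # e.g. "18-24"
    gender_guess: str  # e.g. "female"

class DemographicsCounter:
    """Aggregate-only telemetry: each frame is processed and thrown away;
    only anonymous counts and wake events survive."""
    def __init__(self):
        self.counts = Counter()
        self.wake_events = 0

    def on_frame(self, detections):
        if detections:
            self.wake_events += 1  # a face is present -> wake the purchase UI
        for d in detections:
            self.counts[(d.age_bucket, d.gender_guess)] += 1
        # the frame itself is never written anywhere

# Simulated frames from a stubbed detector:
machine = DemographicsCounter()
machine.on_frame([])                                # nobody in view
machine.on_frame([Detection("18-24", "male")])      # one shopper
machine.on_frame([Detection("18-24", "male"),
                  Detection("25-34", "female")])

print(machine.wake_events)                 # 2
print(machine.counts[("18-24", "male")])   # 2
```

Even in this best-case design, note what still leaks out: counts of who stood in front of the machine and when, which is exactly the demographic reporting Invenda advertises.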

Eroding Consumer Trust

A shopping mall owner, Cadillac Fairview, reportedly made similar claims: that it performed all facial recognition locally and did not collect identifying information. An investigation by Alberta and British Columbia’s privacy commissioners found otherwise. The company had collected millions of images containing facial recognition data. For that, it was forced to delete the data.

No fines. No jail time. Just delete the data. Data is valuable, but simply deleting it could never undo the profit made from it already.

Surely no one else would collect facial recognition data after such a harsh rebuke!

Invenda claims that their “machines are fully GDPR compliant and are in use in many facilities across North America.” Without regular investigations, how could anyone trust that? Why go through the trouble of using facial recognition if you’re not going to make it profitable with the data it could collect? It just doesn’t seem logical. We have to take them at their word, but with this technology having been used in privacy-violating ways before, one has to ask: why even risk it? Why add it at all if you weren’t going to use it to its full potential?

The school has asked that the software be disabled and has requested the removal of the machines from campus. Students will just have to get their snacks the old-fashioned way: from a vending machine that doesn’t have the capability to collect biometric or demographic data.

Who Owns Your Face?

We’re going to get a little weird here, but: do you own your face? If you take a photo of yourself, you own the photo. But do you own your face? Or is your face fair use? People have been unwittingly captured in street photography for years. Shopkeepers can record their premises and the nearby area for security purposes. Governments run CCTV, monitoring traffic intersections and busy areas. But all of those serve a purpose: freedom of expression and security. Photos and other media can even be ingested by machines for learning purposes, under fair use.

But is your face fair use for profit?

In some regions, you may have privacy protections for facial identification. But a company could take some basic facial recognition and demographic data and combine them with other pieces of data to build a more holistic view of you. A single operator may not legally violate your privacy, but every piece of data collected about you paints a better picture of you.
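To see how innocuous pieces can combine, here is a toy Python sketch of that linkage. The data, field names, and student IDs are all invented for illustration: an “anonymous” demographic log and a payment log, joined on transaction timestamps, attach an age and gender estimate to a named account.

```python
# Hypothetical logs: neither one alone names a face,
# but a timestamp join links them.
demographic_log = [
    {"t": 1001, "age": "18-24", "gender": "female"},
    {"t": 1045, "age": "25-34", "gender": "male"},
]
payment_log = [
    {"t": 1001, "student_id": "w-2039481"},
    {"t": 1045, "student_id": "w-1177320"},
]

# Join the two logs wherever a purchase happened at the same moment
# a face was detected -> demographics now attached to an identity.
linked = {
    p["student_id"]: (d["age"], d["gender"])
    for d in demographic_log
    for p in payment_log
    if d["t"] == p["t"]
}
print(linked["w-2039481"])  # ('18-24', 'female')
```

The point isn’t that any one company runs this exact query; it’s that once both logs exist, the join is trivial for whoever holds them — or buys them.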

Right now, in most places, I can tell you who owns your face. It’s whoever can quantify it digitally the best. Whoever can identify your face in a crowd with near perfect certainty, recreate it anywhere they want, and track you and your lifestyle, spending habits, partners, and family wherever you go. The person who can sell data based on your face is the one who owns it. That’s not you.

You don’t own your face. You didn’t even get the chance to sell it. It’s being used in machines to turn on a menu system—apparently nothing more, for now—and could be collected in others with little repercussion. Other companies can collect and sell that valuable data, assigning your identity to your personality, purchasing habits, demographics, everything about you. You can’t sell something you don’t own, right? Then you don’t own your face.

I love Cyberpunk 2077, but this is a bit much.


Sources:

* I both dread and cannot wait for the day when the Reddit source is one of those graphic usernames and every news organization has to properly attribute their source by listing the ridiculous name. SquidKid47 is probably just a Splatoon fan, and who can blame them?
