The Milk Chocolate That Creeps On Your Face

…not on your hand?

Over at my day job, we sell our software through an online store. We offer bundles which provide a discount when purchasing multiple complementary products together. If a customer instead purchases a single piece of software, we offer them a last chance to add a second item at the discounted price, completing the bundle after the fact.

Years ago, the code for this was contained in a file not visible to the outside world. Not visible, that is, until one day something broke on the server and displayed the file’s name to visitors: upsell.inc.php. As a result, a customer emailed us to express his displeasure at being upsold. He wasn’t wrong. No one wants to feel like they’re being upsold, and while our offer was good, our language was careless. That file is now named something much more anodyne, along the lines of postpurchase.inc.php.

I thought of that mistake while reading about a creepy vending machine at the University of Waterloo. Earlier this month, a student caught the machine in this error state:

An error message that reads, in part, “Invenda.Vending.FacialRecognition.App.exe - Application Error”
[Photo credit: SquidKid47]

An error message on any device out in the world can be amusing, but this one is also leaking crucial information. The dialog’s contents are quite generic, but its title is spilling the beans:
Invenda.Vending.FacialRecognition.App.exe.

Why in this hell-on-Earth is facial recognition software trying to run on a vending machine? An investigation into that question by on-campus publication mathNEWS received contradictory responses from two involved parties.1 First, the company operating the machines, Adaria Vending Services, made this claim:

The technology acts as a motion sensor that detects faces, so the machine knows when to activate the purchasing interface — never taking or storing images of customers.

Later, the machine’s manufacturer Invenda sent this:

…we formally warrant that the demographic detection software integrated into the smart vending machine operates entirely locally. It does not engage in storage, communication, or transmission of any imagery or personally identifiable information. The software conducts local processing of digital image maps derived from the USB optical sensor in real-time, without storing such data on permanent memory mediums or transmitting it over the Internet to the Cloud.

At a minimum, these machines needlessly possess a simple camera that can detect faces. Given what Invenda said, however, it seems probable Adaria’s statement is inaccurate. It’s likely that the devices are capturing images in high resolution, then analyzing them locally and recording data about them, before discarding the raw photos.

That may not be quite as awful as creating a database of customer images, but it’s still plenty awful. Vending machines simply do not require the capacity to recognize that a face is nearby. Using face detection to “activate the purchasing interface” is unnecessary. Even if the manufacturer feels such activation is worthwhile, unless the school is infested with massive rodents, simple motion detection would be more than enough. Likewise, gathering data on customer gender and age is needlessly invasive.

Back at the aforementioned day job, we have a very straightforward privacy policy, which opens with this:

We believe strongly in the right to privacy. We do all we can to protect the privacy of our users. Our business earns money by selling software, and never by monetizing your private data. Our privacy policy can be easily summarized in one line:

We don’t sell or rent any of your data to third parties, ever, period.

In addition, we collect and keep as little data from you as we reasonably can. Collecting the bare minimum of data for our needs means there’s little incentive for a malicious actor to attempt access to our databases, and minimal consequences were such a breach to occur.

Too many other companies are too willing to do terrible things to gain an edge. As a result of invasive ad tech, the future is an awful and disturbing place to live. Until companies are forced by law to stop snooping on every aspect of our lives just to squeeze a few more cents out of us, consumer sabotage is more than justified.

Previously in disturbing ad tech: Creeping on You in the Cold Drinks Aisle


Footnotes:

  1. The issue of mathNEWS is archived here. ↩︎