The Video Analytics For Digital Signage Sky Is Not Falling

May 17, 2019 by Dave Haynes

There is much fuss out there about the San Francisco Board of Supervisors voting 8-1 to approve outlawing the use of facial recognition software, or the retention of information obtained through facial recognition systems.

The ordinance still has to clear a second hurdle next week before it would be law.

I have read suggestions that this is the thin edge of a wedge, that it will lead other cities to do the same, and, more to the point, that it will really cramp the style of digital OOH media companies that use the technology.

But here’s the thing:

1 – Media companies that do audience analytics use face pattern detection, not facial recognition. They’re related but different. Recognition means a camera is scanning for faces and looking for matches in a database of photos. Detection means the camera is scanning for the geometry of faces, and then making machine-learned estimates of age, gender and emotion, as well as logging how many faces are looking and for how long. These systems don’t reference any face photo database and they don’t store the video stream. (There’s a rough sketch of what detection-only analytics looks like right after these two points.)

2 – The proposed ban applies to City of San Francisco departments, like first responders. It does not apply, at all, to private companies.
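To make that distinction concrete, here is a minimal sketch of detection-only analytics using OpenCV’s stock Haar cascade face detector. The camera index, the five-second reporting window and the metrics are my own illustrative assumptions, not how any particular vendor does it. The point is structural: the loop counts faces and dwell, emits anonymous aggregates, and never writes a frame or an identity anywhere.

```python
# A minimal sketch of face *detection* analytics, as distinct from recognition:
# count faces and dwell per frame, keep only aggregate numbers, and never
# persist a frame or match against any photo database.
# Camera index 0 and the 5-second reporting interval are illustrative assumptions.
import time
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)          # live stream only; nothing is recorded
interval_start = time.time()
frames, face_frames, total_faces = 0, 0, 0

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    frames += 1
    total_faces += len(faces)
    face_frames += 1 if len(faces) else 0

    # Every 5 seconds, emit anonymous aggregates and reset. The raw frames
    # are discarded as soon as this loop iteration ends.
    if time.time() - interval_start >= 5:
        dwell_pct = 100.0 * face_frames / max(frames, 1)
        print(f"avg faces/frame: {total_faces / max(frames, 1):.2f}, "
              f"frames with a viewer: {dwell_pct:.0f}%")
        frames, face_frames, total_faces = 0, 0, 0
        interval_start = time.time()

cap.release()
```

Real audience-measurement products swap the Haar cascade for trained models that also estimate age, gender and attention, but the architecture is the same: aggregates out, nothing identifiable stored.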

This is what the ordinance says:

Ordinance amending the Administrative Code to require that City departments acquiring surveillance technology, or entering into agreements to receive information from non-City owned surveillance technology, submit a Board of Supervisors approved Surveillance Technology Policy Ordinance, based on a policy or policies developed by the Committee on Information Technology (COIT), and a Surveillance Impact Report to the Board in connection with any request to appropriate funds for the purchase of such technology or to accept and expend grant funds for such purpose, or otherwise to procure surveillance technology equipment or services; require each City department that owns and operates existing surveillance technology equipment or services to submit to the Board a proposed Surveillance Technology Policy Ordinance governing the use of the surveillance technology; and requiring the Controller, as City Services Auditor, to audit annually the use of surveillance technology equipment or services and the conformity of such use with an approved Surveillance Technology Policy Ordinance and provide an audit report to the Board of Supervisors.

It’s about surveillance, not media.

Now, you could argue this is step 1 and eventually, other systems that surveil the general public will be blocked in the Bay Area and more broadly. But what this is and what those would be are poles apart.

Ordinance author Supervisor Aaron Peskin told VentureBeat that facial recognition is a “uniquely dangerous technology” and cited facial recognition software being used to track the Uighur population in Western China and an ACLU test of Amazon’s Rekognition that misidentified 28 members of Congress as criminals.

Peskin called the ordinance an attempt to balance security with the need to guard against a surveillance state.

“This is really about saying [that] we can have security without being a security state. We can have good policing without being a police state,” he said.

The legislation, which amends city administrative code, will require city departments to create policy for surveillance technology use. City departments are also required to submit annual surveillance reports that explain how they use devices like license plate readers, drones, or sensor-equipped streetlights.

Acquisition of new surveillance technology will require approval by the Board of Supervisors, and if new tech is approved, city departments will be required to adopt “data reporting measures” to “empower the Board of Supervisors and the public to verify that mandated civil rights and civil liberties safeguards have been strictly adhered to.”

For a media company to do full-on facial recognition would, first of all, require a photo database. Facebook and Instagram have been suggested as potential archives of billions of photos, but that’s probably a flawed archive, even if you could buy it from some Russians.

Then there are things like passport and DMV photos, but there really would be a shit-storm if the state of Montana, for example, made those photos available to a third party like a media company.

I’ve been around computer vision/video analytics tech for years, and while tailoring ads to specific audiences is always touted as the OMG AMAZING!!! thing, it’s not really done all that much. First, it would be a LOT of work to execute a media campaign that way. Second, what I hear over and over is that brands want insights on how many people saw a spot, their general profile, how long they watched on average … stuff like that.

They’re not asking whether the spot created just for Dave, for hair replacement, did the business.

The whole surveillance state thing is an entirely different argument. I don’t have my own definitive point of view and would need to think more on that, but my general take is that people don’t like being tracked, yet they DO want bad guys tracked.

Kinda can’t have one without the other.

One of the reasons this comes up in the context of digital signage and digital OOH is the stupid, ill-considered marketing approaches of some vendors in our ecosystem. For years, I have seen start-ups coming into the business calling what they have facial recognition, when what they actually do is face pattern detection.

Then they wonder why privacy advocates are on their collective asses.

Be very selective in the terms you use, and very transparent about how the tech is used. There will always be people upset about cameras, but if they know and trust that the data is anonymous and not stored, logic suggests the percentage of truly upset citizens (and their political representatives) will be low.

On the other hand, if we have companies running around saying what they do is face recognition, the industry is shooting itself in the foot.

Quividi, one of the oldest, most established companies working in analytics, says it right: No image is recorded and all data is strictly anonymized to protect privacy.

AdMobilize, meanwhile, says: “We only believe in ‘detection’ and will never use recognition. The trust of the consumer is above everything.”

If politicians are going to go after media companies for privacy issues, Facebook and Google are the ones genuinely worth worrying about.

  1. Ken Goldberg says:

    All true, Dave (especially the final sentence). Hence the need for education… first of the industry itself, then the potential customer, then the general public. The politicians don’t really want to learn… they of course find it more expedient to grandstand by hobbling law enforcement than to examine where the real danger lies.
