Toronto DooH Screens Trigger Privacy Fuss Over Absolutely Nothing

November 17, 2025 by Dave Haynes

Every few years another news story surfaces about the supposedly egregious privacy invasion brought on by camera-based audience measurement technology, and seemingly with every instance, it turns out to be a shriveled, undercooked nothing-burger.

I went into 12-year-old tween eye-roll mode the other day when I came across a story out of Toronto that dug into the use of cameras to measure foot traffic and gross demographics for a media company that has digital OoH ad displays in Union Station, the city’s multi-modal downtown mass transport hub.

“New billboards near Toronto’s Union Station using facial detection tech are raising privacy concerns, with experts warning of weak safeguards and lack of clear consent.”

“The controversy erupted after a Reddit post on Nov. 2 highlighted the billboards’ use of facial detection technology to track and analyze data of passersby, including their age and gender.”

To the credit of the broadcast reporter, he did a fair job (except maybe towards the end) of explaining what's really going on here – which is computer vision-driven pattern detection, involving a small camera and (probably) an edge computing device at the display that analyzes the video feed, looking for what the algorithm understands to be the geometry of human faces. It's anonymous, and the captured video frames are discarded.

That means if I meandered by, it would clock me as an old fart, male, who looked at the ad face for a fraction of a second. It does not mean that display will suddenly generate content that says: “Hey Dave, Sasquatch Hair Restoration could help you with your mostly barren scalp. We’ve just sent you a text with a QR code for an emergency discount on a new mane!!!”
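For the technically curious, the anonymous pipeline described above – detect a face-like pattern, log a coarse demographic guess and a dwell time, throw the frame away – boils down to something like this. A minimal pure-Python sketch, with hypothetical field names (real systems vary, and the actual detection step would be a computer vision model, not shown here):

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Detection:
    """One anonymous detection event from a single video frame.
    Hypothetical fields -- the point is there's no image and no identity here."""
    age_bracket: str   # coarse estimate, e.g. "55+"
    gender: str        # coarse estimate, e.g. "male"
    dwell_ms: int      # how long the face was oriented toward the screen

def tally(detections):
    """Aggregate detections into anonymous counts. Only numeric tallies
    persist; no frames, no face-prints, nothing traceable to a person."""
    counts = Counter()
    total_dwell = 0
    for d in detections:
        counts[(d.age_bracket, d.gender)] += 1
        total_dwell += d.dwell_ms
    return counts, total_dwell

# Two hypothetical passersby; the frames themselves are already gone.
events = [Detection("55+", "male", 400), Detection("25-34", "female", 1200)]
counts, dwell = tally(events)
print(dict(counts))  # {('55+', 'male'): 1, ('25-34', 'female'): 1}
```

What gets stored and reported upstream is just those tallies – which is consistent with Cadillac Fairview's position (described below) that it held millions of numeric strings, not a database of faces.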

This technology has been around for at least 20 years, and some of its early marketers and resellers made the boneheaded mistake of promoting their shiny new tech as facial recognition. That was just begging for trouble.

It could technically be argued that video analytics IS face recognition, in that the machine learning software does indeed recognize what it has been trained to understand to be faces. But it doesn’t recognize and identify individuals.

True facial recognition tech picks up individual faces from surveillance video feeds and compares them against a database of known faces to find matches. That involves math-based face-prints of people, and tends to be used for safety and security purposes – though we’re seeing similar tech used to more speedily enter countries at airports, replace passwords for phones, and in countries like China, even manage transactions at retail.
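To make the contrast concrete, here's a toy sketch of the one step true recognition performs that detection-only analytics does not: comparing a math-based face-print (an embedding vector) against a database of known people. Everything here is illustrative – the names, vectors, and threshold are invented, and real systems use far higher-dimensional embeddings:

```python
import math

def cosine_sim(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(probe, database, threshold=0.9):
    """Return the best database match above threshold, else None.
    This matching-against-known-faces step is exactly what anonymous
    video analytics lacks -- it has no database to match against."""
    best_name, best_sim = None, threshold
    for name, embedding in database.items():
        sim = cosine_sim(probe, embedding)
        if sim >= best_sim:
            best_name, best_sim = name, sim
    return best_name

# Hypothetical enrolled face-prints (3-D for readability).
db = {"alice": [0.9, 0.1, 0.2], "bob": [0.1, 0.95, 0.3]}
print(identify([0.88, 0.12, 0.21], db))  # prints alice
```

Remove the database and the threshold comparison, and all the system can say is "a face was here" – which is the whole of what audience measurement does.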

I mentioned how the reporter got this pretty much right, but then his piece surfaces a 2020 story about “an investigation by federal, Alberta and B.C. privacy commissioners found Cadillac Fairview used facial recognition in mall kiosks to analyze the images of five million shoppers without meaningful consent. The investigation prompted calls for stricter guidelines and explicit permission before capturing such data.”

Here’s what came out of that investigation: the Office of the Privacy Commissioner of Canada (OPC) concluded there was no evidence that Cadillac Fairview was using any technology for the purpose of identifying individuals.

Cadillac, which is a commercial property company (not an OoH media company), stressed it was just doing face pattern detection. The OPC suggested the company had accumulated five million representations of faces (kind of suggesting it was storing a database of captured faces), but Cadillac said what it had stored was millions of numeric strings tied to occurrences of people coming within visual range of the ad totems.

In this latest case, in Toronto, the media company is Cineplex Digital Media, which was recently acquired by Creative Realities. So you had CDM people who are dealing with new bosses and culture, and what I understand to be numerous job redundancies. CRI, in turn, is getting an oddball welcome to Canada! How do you like us so far, Rick?

CDM was evidently doing everything by the book, even posting a notice at the display saying it was indeed using computer vision, and why it’s not a privacy issue. This is in line with Privacy By Design guidelines that were published years and years ago about making consumers aware of the use of the tech.

Should the sign be bigger? Maybe. But how big would it need to be to make everyone happy? Chances are, a more prominent sign would just stir up more unwarranted fuss.

Here’s the bigger thing: there are many, many, many ways your privacy is under threat day to day, and if anonymous video analytics were rated on a 1-to-10 seriousness scale (10 being the worst), it would be a 1.

I asked AI: “Are there any real-world examples of face pattern detection breaching consumer privacy?” It came back with numerous examples of facial recognition tech controversies, but nothing on pattern detection.

I also asked Perplexity about the prevalence of surveillance cameras, which kicked back a response that Americans, on average, are captured by security cameras more than 75 times a day, while in big city UK, it’s 4X that count. About 30% of security cameras are tied in to facial recognition systems. So our mugs are being captured and logged all the time by government agencies.

Then there’s the real bad actor when it comes to consumer privacy – smartphone apps. They do location tracking, tap into your camera and mic, harvest contacts and call logs, share usage data, demand permissions for phone functions and info that may have nothing to do with an app’s stated purpose, and on and on.

There’s something about the use of cameras that weirdly freaks people out, even as we all merrily document endless moments of our lives with selfies and videos of what we’re up to. So I find it just so odd that people, and the media, get worked up about something that’s just a more sophisticated way of counting people than IR beam-tripping sensors, or parking people at gateways with clipboards and clickers.

If you use video analytics, make your life easier. Very clearly explain what it is and is not.

Then again, Cineplex did that and STILL got caught in a fuss!

I write variations on this post about every five years, going back at least to 2010. Expect another one in 2030 … if I still haven’t lost too many marbles by then.

  1. Craig k says:

    Cameras for passersby will always engender controversy

  2. Adrian E says:

    Well said, Dave. You’re right that the perception of being spied on provokes a suspicious reaction when people spot a camera. I’m often bemused by the hissy indignation of customers who feel their privacy is invaded whilst holding loyalty cards in their wallets. A store will know more about those customers than they realise, just by analysing regular purchases of cat food and kids clothing for example. I also notice that the indignation often starts with a post on social media, and that has its own way of whipping up emotions. Measurement algorithms vs social media algorithms…

  3. Wes Dixon says:

    6-7-6-7-6-7 (It’s a “kid thing” here, mostly because it annoys the adults.) Bwahahahaha!

  4. Denis G says:

    Dave,
    Once again, you’re absolutely spot on with this piece. We appreciate you continuing to explain the critical difference between facial recognition and anonymous video analytics, as this is exactly the kind of education the market needs with regards to the different technologies and their compliance with privacy. Many thanks!
