“Faux DOOH” Is Cool, But Can We Be A LOT More Open About What’s Really Going On?

July 27, 2023 by Dave Haynes

London double-decker buses have eyelashes that get mascara touch-ups when they roll past a giant brush mounted on a building.

An iconic giant picture frame along a Dubai freeway is filled with an Adidas ad celebrating Lionel Messi’s World Cup win.

Holograms loom over the buses of Manchester City’s football club as they roll through Tokyo.

A Coach pop-up store in New York’s SoHo has a Rube Goldberg machine on its exterior cranking out handbags on a conveyor system.

A giant Barbie steps out of its packaging on a plaza near Dubai’s Burj Khalifa skyscraper.

They’ve all generated huge amounts of social media buzz recently, and they’re all just computer-generated graphics jobs. That would all be fine, except the people and the brands behind them rarely come clean with the fact that these are not visual projects that people can go see in person. Because they’re not there. More on that later.

It’s an emerging medium that takes physical spaces – buildings and landmarks – and uses (usually) well-executed CGI work to give them a fun, augmented reality overlay that is then pushed out on social media as videos and images. It’s been called Virtual DOOH, Fake DOOH or the one I have warmed to, Faux DOOH.

It’s a variation on all of those videos on social media channels that purport to be “naked eye 3D” visual illusions on big LED billboards, usually in China. Except the creative sometimes escapes the physical boundaries of the LED display to boost the visual impact, and often, there’s no LED display actually on that building. It’s all CGI.

Like this …

Legitimate outdoor media companies like UK-based Ocean Outdoor see forced-perspective visual illusions as being less about what’s on the screens and the audiences walking through a public space like Piccadilly Circus or Times Square, and much more about the “amplification” of the campaign on social media channels. While thousands might see and recall a campaign in person on a big DOOH screen, many multiples of that audience might see those whiz-bang visuals on TikTok, Twitter, Instagram and LinkedIn, and then repost them to followers.

I get it. But here’s my problem …

It’s misleading – which can frustrate consumers and disappoint brands who want to emulate these kinds of campaigns, only to learn what was done was fake and either not technically possible, or extraordinarily difficult or expensive to actually do.

I’m not real happy about being misled, and I doubt I’m alone.

Consider this post, which even, somewhat bizarrely, uses the term BREAKING … like this is news. It just flat-out suggests this is a store in Manhattan that people can check out as visualized.

The writer even suggests in follow-ups that if people are in NYC, she “definitely recommends checking it out.”

But it’s not there. There is a pop-up store, but none of the crazy stuff running off the building corner as seen in the video.

This really hit home for me the other day when my son – who is heavily into generative AI – showed me a video of shape-shifting visuals in some store. I said it was just CGI, and he was very disappointed. He would “totally go in there” if it was a real store. This is a guy deeply knowledgeable about generative art, and he missed it.

I said these faked-up stores were an emerging thing and too few of the people behind them were being clear about what they were showing and doing.

This somewhat embarrassing, gushing story in London’s Evening Standard goes on and on about the virtual mascara campaign in the Transport for London system, COMPLETELY oblivious to it all being a CGI effort that doesn’t exist.

These guys assumed it was real, as well.

Most of the entertainment we consume day to day has a degree of computer-generated visuals, whether that’s the extremes of Barbie or Oppenheimer. But consumers know that’s the case. It’s entertainment. We’re happily suspending disbelief to be entertained.

Duped Consumers

The problem with a lot of this Faux DOOH and fake storefronts and “activations” is that if you scan through the comments that append these social media posts, most of those people providing reactions think what they’re seeing is real. So people are being duped.

Maybe that’s not one of modern society’s great problems, but it does just add to the seemingly mainstream notion today that lying is OK. I don’t know how many times I’ve gone after, in writing, the many Chinese companies that try to market their LED displays as being built for “naked eye 3D” – as if what they manufactured specifically enables that (it’s a function of clever creative, not display hardware).

The counter argument is that these things get brands and other end-users thinking and asking about doing work that’s outside the predictable norms. If it starts a conversation that leads to something fun or amazing that’s actually feasible, that’s good. As in, “That’s not possible, or is but would be INSANELY expensive, but let’s talk about what we could do.”

This Instagram post by CGI artist Shane Fu, of a Zara store in SoHo, generated many millions of views, with comments suggesting most people assumed it was real. One commenter even, somehow, reinforced how it was particularly amazing to see it in person.

Zara contracted Fu to produce the piece, and it obviously paid off in social media buzz. His posts involving similar kinds of work are quite coy – not really declaring they’re just concepts.

Clarity Is The Answer

Maybe the simple answer is for people to provide clarity about what they’ve done and what people are seeing.

Instead of “Tram cars made to look like wine bottles roll through the streets of Bordeaux!” … how about “We had some CGI fun to make Bordeaux tram cars into rolling wine bottles.”

That’s still visually fun. It will still get shared. But no one is then heading out from their flat or office hoping to see these things.

There’s already far too much bullshit out there. We have no end of hologram products that aren’t. MicroLED that isn’t. AI-powered everything. We don’t need more BS.

Leave me out of the political discourse, but even if we’re in an age in which lying seems somehow OK, it’s not.

  1. Noel says:

    I was actually disappointed when I found out the tube didn’t actually have a huge fake rubber eye lash stuck on it.. as I was in London that day too

  2. Jeremy says:

    Good post, Dave. I think this speaks to the issue that has always and will only increase on the internet of fake content. As a kid, the website Snopes.com would expose hoaxes. It was one of my favorite sites of the early internet (circa 1998 age 10 or something). Now – there is just too much stuff to verify + I bet very few people are even aware of Snopes or other ways to check this kind of content. With ChatGPT and all the buzz about AI, it’s only going to increase. I wonder how many of the responses and “impressions” to the examples you posted are automated bots simply saying an automated version of “cool!” or “thats dumb!”. There exists the “dead internet theory” https://en.wikipedia.org/wiki/Dead_Internet_theory
    It will be quite annoying to see in about two-three decades the internet go from a fascinating and new connected digital world to one where fake bots make a fake CGI video and post it on a bot account and all the responses are bots giving automated responses.
    Your post here has some Andy Rooney “this grinds my gears” to it – but my god do we need it. Thanks Dave.

  3. Ken Goldberg says:

    I was thinking FOOHBar would also work.

    Just when we have learned to not believe anything we hear, we now have to learn not to believe what we see. I don’t think either bodes well for humanity.
