Getting Clemson Orange Right On TV Screens, And Why That Can Matter For Digital Signage

September 16, 2020 by Dave Haynes

The 16:9 PODCAST IS SPONSORED BY SCREENFEED – DIGITAL SIGNAGE CONTENT

If you have been around digital signage for a while, you have almost certainly heard a discussion at some point about accurate color reproduction on screens, and the problems big brands can have with that.

The example used most often is Coca-Cola Red, which is a VERY specific red.

It can be a problem at the display level, but it also has to do with the source. A small research team of academics and students at Clemson University in South Carolina is well down the path of sorting it out.

In their case, the problem was Clemson orange – a very specific shade of orange seen on tens of thousands of shirts, hats and giant foam fingers during Clemson football game broadcasts. The orange shown on TV sets and replay boards is not, in some cases, the right orange.

A research project called ColorNet is using AI and neural networks to make real-time color adjustments on the fly to the broadcast signal – using an algorithm light enough that it can run on an off-the-shelf PC.

I spoke with Dr. Erica Walker and graduating student Emma Mayes about the project, and how the technology might be applied as a low-cost box in the back of digital signage screens – so that networks run by brands can really show their true colors.

The chat is a bit technical, but even I got most of it.

One other note – I THINK at some point I reference Clemson as being an SEC team. Wrong. It’s in the ACC. I’m in Canada. Ask me about curling.

This is how you’d reach Walker – eblack4@clemson.edu

Subscribe to this podcast: iTunes * Google Play * RSS

TRANSCRIPT

Emma, Erica, thank you for joining me. Can you give me a rundown on what ColorNet is all about? I know it’s a university project that you guys presented at Display Week, going back about a month or so. 

DR. ERICA WALKER

Erica: Yeah, absolutely. Thanks for having us today. ColorNet is an artificial intelligence solution for brand colors to be displayed correctly on screens.

So it’s not a color solution that would display all colors correctly – that solution already exists. This is specific to brand colors, in this case, Clemson University’s orange and purple.

That’s because you guys are working or studying out of Clemson, correct? 

Erica: That’s correct. And actually the solution could work for any color. We just happened to use the colors that we see the most on our own campus and in our athletics. 

Which is orange? 

Erica: That’s correct. 

So this is a project coming out of the graphics communication department, or is it multiple departments? 

Erica: It actually includes a lot of different departments. Each of us on the project is from a different department. In fact, I’m from graphic communications. The students are from engineering and computer science, a variety of engineering degrees. And then my co-creator, or co-inventor, works in a multidisciplinary department with a focus on data science.

 Okay, so what’s the problem you’re trying to solve here?

Erica: Yeah, thanks for asking that. It is something that is very commonly talked about at Clemson athletic games, and probably at other universities as well. The orange is incredibly recognizable – our brand orange – for Clemson fans. And when you watch a broadcast of a football game or basketball game or baseball game, the orange is always skewed, normally towards red. Now, obviously the color can be impacted by the settings on your screen itself, but what if we could address this at the feed level, at the camera level, at the production level?

And that would ensure that if Clemson orange is a specific Pantone color, it is going to be color-accurate, at least coming out of the feed?

Erica: Exactly. That’s really the kicker right there: we don’t have control over other people’s screens. Like the screen inside your home, we aren’t trying to make any adjustments to that. That would be the homeowner, or the screen owner at the bar, who would have to make those adjustments. But we can make adjustments to the screens inside of our facilities. So the big screen inside of the football stadium, we could adjust that because we have control over it. But the main thing is just having a clean feed, having a feed where Pantone 165 is a recognizable color and it displays correctly.

And why is that a problem? You know, if I’m a Clemson fan, I know my orange, but if I’m a Syracuse fan, maybe it’s a different orange. Who’s going to know, other than the Clemson fans?

Erica: Right. So, that’s a fair question. On any given Saturday, there are over 70,000 people in the stadium watching the game, and so that’s a big audience, but in general, we just use Clemson orange as kind of a testbed for this example. So it could be done for soccer teams, you know, in Europe, the big leagues. It could be done for Major League Baseball, it could be done for the NBA Finals. It could be done for really anything where a brand color is recognizable to a fan of any team of any sport.

And again, you can’t really control the final output, like on my TV. If the calibration is off, it’s gonna show it to be orangey-red instead, or whatever. Will this help that at all?

Erica: In my head, if the feed is better, then more than likely it will show better on your TV. Now, that’s not true if you’ve amped up your colors or, I know there are settings that are specific to gamers that they like, so if you’ve changed the color settings on your TV, then that could be a problem. But one of the conversations we’ve been having with screen manufacturers is: what if we could address this at the screen level as well? Obviously, the goal of the artificial intelligence is not to weasel our way into people’s homes and make adjustments on their TVs.

So that’s not the goal, but we do think we could address it in, say, large-format displays. So if you go to the Coca-Cola headquarters, they want their Coca-Cola red to display correctly on the screens that are scattered throughout their entire building or their manufacturing facility, or anywhere they have control over their screens.

So kind of thinking of it from the brand level, as much as from the consumer level. 

So that’s really the business application here? I mean, you mentioned that there’s a patent around it, and the idea is that really brand-sensitive, color-sensitive companies like Coca-Cola, and any number of other ones, would have more of an assurance that their broadcast advertising is going to look the color that is really important to them?

Erica: Right now, that’s what we’re looking at as brand applications. So, as I said, there are solutions out there to solve, like overall, you know, a correct profile so that your TV shows colors accurately. So we aren’t trying to necessarily do it across all colors, we’re trying to really focus on the brand colors. 

Right. So how does it work, and how did you get started on this? This doesn’t strike me as one of those things where you wake up in the middle of the night and go, “I must solve this.”

EMMA MAYES

Emma: Right, so the basic approach that our team took is that when you’re color-correcting, instead of correcting the entire frame, we’re working more with image segmentation. The current process with athletics is that, oftentimes, they have to pick something in the frame, color-correct to that, and just hope everything else falls into place. So with basketball games, they look at the court and they say, “Okay, the Clemson paw print in the middle of the court has to be brand color. Everything else will just be what it has to be.”

But we’re trying to get that right. The idea is: what if we can make it so they don’t have to compromise? With segmentation, we’re color-correcting the right areas of the frame as opposed to everything else. The idea was also to decrease the manual burden on the technician when it comes to color correcting, so we looked at doing image segmentation through machine learning, by creating a convolutional neural network.

I know what those are. 

Emma: (Laughter) Without getting into the nitty-gritty, we usually just use the acronym CNN, so you don’t even really have to know how to spell it out. But the gist of it is that we took this game footage, pulled it into Adobe Premiere Pro, and used the Lumetri color panel to pick the range of colors we wanted to correct, so we could adjust it to a perceptually natural approximation of the brand color. Then we pass the color-incorrect and corrected footage into the model, and it creates a mask that basically shows, pixel by pixel, what the difference in color is.

And so the whole idea is that our model is able to generate these masks and automatically generate exactly what those corrections are going to look like. So once we created this data, we trained on it, and that way it learned how to color-correct to these brand specifications within these image segments.

So that way, our grass isn’t a weird color, our court isn’t a weird color. We’re just adjusting the jerseys and the Clemson football fan gear and the audience, so it’s fixing the colors that need to be fixed and leaving alone what needs to be left alone.
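For readers who want to picture what Emma is describing, here’s a minimal sketch – not the ColorNet team’s published code, and with layer sizes that are purely my assumptions – of a small convolutional network that predicts a per-pixel color correction and leaves the rest of the frame alone:

```python
# A minimal sketch of the approach described above, NOT the ColorNet team's
# actual code: a small convolutional network that predicts a per-pixel color
# correction (a residual "mask") for a frame. Layer sizes are illustrative.
import torch
import torch.nn as nn

class TinyColorNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Deliberately few parameters, echoing the "slender" design the team
        # describes, so the model can run in real time on modest hardware.
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 3, kernel_size=3, padding=1),
        )

    def forward(self, frame):
        # Add the predicted correction to the input frame. Pixels outside the
        # brand-color range (grass, court) can learn a near-zero correction,
        # which leaves them effectively untouched.
        return frame + self.net(frame)

model = TinyColorNet()
print(sum(p.numel() for p in model.parameters()), "parameters")

# Training pairs would be (uncorrected frame, manually corrected frame),
# e.g. graded in Premiere Pro's Lumetri panel as Emma describes; the loss
# would compare model(uncorrected) against the corrected frame.
frame = torch.rand(1, 3, 720, 1280)  # one 720p RGB frame, values in [0, 1]
corrected = model(frame)             # same shape, with brand colors shifted
```

The residual design is one plausible way to get that “leave alone what needs to be left alone” behavior: where no correction is needed, the network just learns to output zero.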

And is it because you’re segmenting it and isolating certain elements that you can do it in real-time, or near real-time, as opposed to doing it in post-processing?

Erica: What makes it able to run in real-time is partially the hardware. You need hardware that it can run on, and it really just looks like a desktop computer, like a regular box that you’re used to. But we do want it to run in real-time, and so in order to do that, we try to make everything as slender as possible.

Some neural networks have millions of parameters that they’re checking, and we kept making things smaller and smaller so that it could run more efficiently. Now, there is a point where it gets too small: it runs too quickly and it’s not as effective. So that’s part of the research piece of this – the students are learning at what point we make adjustments to make this efficient versus effective.

I have this idea in my head – and as anybody who listens to me knows, I’m not an AI scientist or anything close – that there is some pretty serious computing hardware, a big server room full of computers, doing the work of the neural network. But it sounds like you’re saying this is just a box?

Erica: Yeah, it can actually run on something as small as a Raspberry Pi, believe it or not. It doesn’t run at the frame rate that you’d want for an event, but we can run about 8-10 frames per second on a Raspberry Pi. You don’t need an entire room full of servers to process this in real-time; it’s very doable.

I don’t pick it up and carry it around, but you certainly could if you needed to.
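If you’re curious what an 8-10 frames per second figure looks like in practice, here’s a rough, hedged benchmark sketch – the stand-in model and frame size are my assumptions, not the team’s measurement method, and results vary wildly by device:

```python
# Time repeated forward passes of a small stand-in network and report
# throughput in frames per second. Purely illustrative.
import time
import torch
import torch.nn as nn

model = nn.Sequential(  # stand-in for the slimmed-down correction network
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1),
).eval()

frame = torch.rand(1, 3, 720, 1280)  # one 720p RGB frame
n_frames = 50
with torch.no_grad():
    start = time.perf_counter()
    for _ in range(n_frames):
        model(frame)
    elapsed = time.perf_counter() - start

print(f"{n_frames / elapsed:.1f} frames per second")
```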

So this is not a million-dollar addition to a TV studio or something? It sounds pretty elemental in some respects. 

Erica: Absolutely. You know, when it really comes down to it – and Emma can probably speak on this better than I can – all an algorithm is, is a text file that you have to train. The real meat of it, on our end, is training it and making it effective and making adjustments, because it is a new area where you can’t just go and Google, “Hey, I’d like an algorithm that can do this.” We’re actually building it and modifying it as we go.

So for a Clemson football game, if you have, I don’t know, 20-25 cameras, whatever it may be, do you need a processing unit for each of those feeds, or is a master feed funneled through one box?

Erica: We only need one. Actually, the way it works is, you’re right, they do have about 20 cameras, ranging from little tiny GoPro cameras up to high-end 4K broadcast cameras, and those are all processing color so differently.

But it all comes in live to a production studio. So if you watch a lot of athletics, like the NFL or even the NCAA, sometimes they’ll show you the trucks, and inside of the truck, all of those feeds are coming in, and they are making those adjustments on the fly as the feeds come in. They choose which camera feed they want to show, and then it gets projected out, and all of that’s happening in real-time.

And so we actually talked about different places ColorNet could live within the system, and where we landed is that if we have it right inside of that production suite, you only need one device. Or you can have it on the other end of that production suite and still only need one device, but then you’re only color-correcting the feed that’s actually going to get put out there, versus correcting all the different feeds on all the different cameras.

Is this a problem that’s common to any live event broadcaster, or is it defined by the quality of the equipment you’re using? Like, would a local community cable operator have a much bigger problem than, let’s say, Fox Sports?

Erica: The problem is pervasive anytime you have brand colors. I’m gonna show my age on this, but I don’t know if you remember when Reese’s Pieces was the product advertised in E.T. when it came out. So even in a Hollywood film, you have a brand, and that brand cares about its colors.

And so it is pervasive everywhere, but the piece of equipment can actually run anywhere. It doesn’t need a fancy studio, it doesn’t need ESPN-type quality. It could run at any small studio just as well as it runs here, because once you’ve trained it, it’s really running on its own. It’s capable of doing the work without a lot of manual input.

So in theory, is this a box I could order on Amazon, pull out of the packaging, plug the feed in, plug the output in, give it power and off you go? Or is there a whole bunch of tweaking and software behind the scenes to make it all happen?

Erica: To answer that, really, the box that this runs in was ordered off of Amazon. It is just a plain old normal computer box, you know, like a desktop. The magic happens inside of the training and inside of the algorithm and inside of the adjustments to the code, so the “special sauce,” so to speak, isn’t really anything that happens prior to receiving the box.

Right. But do you train it? Let’s say, heaven forbid, that another SEC school uses this – would that box have to be trained for the Crimson Tide colors or whatever?

Erica: Yeah, I think you understand a lot more about this than you’re letting on, but that is 100% the case. We would have to train it each time, as needed per color – that’s our current structure – but I’m actually gonna let Emma jump in on what we’re thinking about moving forward.

Emma: When we trained for Clemson orange and Clemson purple, the way our data was set up, you’re looking for these ranges of colors around the brand color, so that you know what kinds of areas you’re going to be shifting to be correct. Our goal is to try and generalize it.

So the idea is, we can deliver some kind of hardware to the shader or painter working with these corresponding teams, so that they can change what color it targets. We’re going to come up with a new approach where, instead of looking for a range of colors to shift, we look for areas. We’re hoping to train it so it can pick out the jerseys, where the fan colors are, and make it adjustable for whatever those colors are. That way, you could pick up this technology and plug it in for a different team, instead of it being limited to a specific brand’s color palette.

Right. Okay, so I’m a digital signage guy. This is a digital signage podcast. I wonder, of course, what the applications potentially are for the digital signage business. 

You mentioned, early on, Coca-Cola, and how across its corporate campus – its many corporate campuses, really – if it has a signage network with the Coca-Cola brand on there, and the output PCs or media players are outputting nominally incorrect colors, this could be put in the middle of it?

Erica: Absolutely. That’s one approach that we’ve considered. Let’s use our Coca-Cola campus example.

They want to ensure that no matter what footage is going on what type of screens – they may have multiple brands of screens, I don’t even know – the Coca-Cola red is always correct.

And so in that case, you actually would put ColorNet at the screen level. We would want to pull it down to a much smaller device, more like that Raspberry Pi size, so that you could just slap it right onto the back of each screen, or each set of screens, and have that screen Coca-Cola-ready, you know? And so you can sell it that way to a brand owner, versus having it at the live video mixing phase.
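In principle, a screen-level deployment like Erica describes could look something like the sketch below. The capture device index and the pass-through correction function are placeholders, not details from the ColorNet project:

```python
# Sketch of the "small box on the back of the screen" idea: read frames from
# a capture device, run the correction, hand the result to the display.
import cv2
import numpy as np

def correct_frame(frame: np.ndarray) -> np.ndarray:
    # Placeholder for the trained model's inference; returns input unchanged.
    return frame

cap = cv2.VideoCapture(0)  # e.g. an HDMI capture dongle exposed as a camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("brand-corrected feed", correct_frame(frame))
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```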

Do you sense the addressable market for this has a whole bunch of brands who are that color-conscious, or is it a subset that really cares, and others who, you know, say “our brand color’s blue” and that’s all?

Erica: Actually, coming from my background, I was steeped in brand from a print perspective. And from a print perspective, the tolerance for brand colors on your box or bottle or flexible packaging is very small. It’s measured in Delta E, and they’ll say it’s 2 Delta E.

Most companies don’t want you to be any further off the brand color spec than 2 Delta E. And that’s basically just a measurement saying: this is as close as we are willing to accept to purchase the product. If it goes over 2 Delta E, we don’t want your printed product. And coming from that background, all of the big brands care. All of them want their color to be correct.
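For the curious, the simplest version of that measurement, CIE76, is just Euclidean distance in CIELAB color space. The Lab values below are hypothetical, not any brand’s actual specification:

```python
# CIE76 Delta E: plain Euclidean distance in CIELAB space.
def delta_e_cie76(lab1, lab2):
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    return ((L1 - L2) ** 2 + (a1 - a2) ** 2 + (b1 - b2) ** 2) ** 0.5

brand_spec = (60.0, 55.0, 70.0)  # hypothetical brand orange in CIELAB
sample     = (60.8, 54.2, 69.0)  # hypothetical measured print sample

de = delta_e_cie76(brand_spec, sample)
print(f"Delta E = {de:.2f} -> {'accept' if de <= 2.0 else 'reject'}")
# Delta E is about 1.51 here, so this sample would pass a 2 Delta E tolerance.
```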

I know there’s an argument going on right now, which might have stemmed out of a recent NAB or SID type conversation from Display Week: this idea that screens are actually changing our tolerance for brand colors. At some point, are we not going to care so much about brand colors, because we’re willing to accept them further from the brand spec, given the screen differences that we see?

I still think that brands are willing to put money, time, and effort behind their branding in general, and that they are going to care whether their product looks correct, because it is as much a part of their identity as any other part of their business.

Yeah, that would make sense. I’m sure there’d be some reticence around spending thousands upon thousands of dollars per site to do that, but if it’s, as you say, a Raspberry Pi device that could just plug in via the HDMI feed or whatever into the display, then yeah, maybe they’d be happy clients to do that.

Erica: Yeah, especially for those big brands. I’ve never sat in the branding room for Coca-Cola, but both Coca-Cola and Pepsi use a color of red, right? I bet you their branding teams would go to battle over making sure that all of their products are the correct color of red, so that there is no confusion at the customer level about which product you’re actually looking at.

Yeah, well, I’ve certainly heard those stories in the past when it comes to digital signage and Coca-Cola red and a few other colors – that the Coca-Cola people flip out if it’s not right. They had some big problems with early-stage video walls and things like that, and there was a particular product that they really liked because of the saturation levels and everything, that gave them as close to the print brand as they wanted to see. I don’t know if it was that 2 Delta E measurement you were talking about, but it was good.

Erica: Yeah, and you know, some companies will have different Pantone colors for their print products compared to their screens. So for instance, Clemson has two different oranges. When it comes down to it, the Pantone they’ve chosen for screen and the Pantone they’ve chosen for print products – the difference between RGB and CMYK – those two oranges look the same.

So it comes down to this perceptual thing. It’s not always about hitting the same Pantone; it’s about the perceptual brand recognition of that orange, whether it’s on a car, on a screen, on a jersey, and so on.

Okay. So this is a combination product or initiative of a couple of professors, and I think four students, is that accurate? 

Erica: Yeah, that’s correct. We had four students, and then we actually just added a new student this semester. So obviously the great part about students is that they have wonderful, fresh ideas coming into a project. The sad part is that they do graduate and go away, like Emma graduates in December. 

And so, there is kind of this rotation of students who have worked on the project over time. 

So where does it go from here? At some point, does this become a company, or does it get licensed? Or is that just so far off that it’s hard to really rationalize?

Erica: Certainly from our perspective, our goals align a lot more with the research end and sharing what we find. But at the university level, we are involved with the university research foundation, and their job is to help connect us with potential manufacturers or companies or lines of products that would benefit from this.

And so at the university level, there is a lot of interest in that. I’m not opposed to a company, or to partnering with an existing company. But certainly, the students getting experience out of this, and our personal research goals, are primary.

And the conversations with companies provide a lot of opportunities to get funding, to expand, and to come up with new ideas for how this technology could perhaps be implemented.

Is there an application as well for things like medical imaging and seismic imaging where life and death decisions or very expensive decisions are made based on the color of some high-resolution image?

Erica: Absolutely. We’ve been looking at expanding this out into some different applications, and you really hit the nail on the head. One of the ideas our team has bounced around is: what if this could be used to emphasize a lifeboat or something like that that’s lost at sea? How could we make it really fast and really easy to spot, despite all the reflections that waves make? And we’ve looked at it for agriculture, where it’s emphasizing whether there are healthy plants or whether there are weeds. So it really could be modified and used in a lot of different contexts, just like you’re saying.

So what came out of SID and that presentation that you did? Did you have companies or other really smart people coming up or contacting you?

Erica: Not so many from virtual conferences, we’ve found. Unfortunately, SID was not in person this year, and I was super excited about that audience.

But when we have presented in person, it has led to lots of conversations with different companies and ideas of how it could benefit them and their customers. 

Okay, so if there are people listening to this who actually understand it fully, how would they track you down and how do they sort of get involved in this in some way, or get some questions answered?

Erica: We would love to hear from people. Again, it’s so exploratory still at this phase, and so we want to hear from real companies with real customers: what they need, what their pain point is, and how we could consider ColorNet as a potential solver of that pain point. Just reach out to us. My email is eblack4@clemson.edu.

Okay, and you guys have a football team, right? 

Erica: (Laughter) We hope we have a good one again, fingers crossed. 

Is it a challenge because people think so much about Clemson as, you know, a big sports school, a football school, when this is a totally gearhead kind of science project with AI coming out of Clemson? Do they go, “Oh really, you guys do that too?”

Erica: Well, we’re hoping that we actually solve a problem for our athletic department. So fingers crossed, we’ve proved it out that it can be done. And right now we’re just kind of taking a back seat to whatever Coronavirus brings for this coming season.

But our original intent was to be up and operational for our athletic department this fall, which we’re capable of doing, but again, we’re just kind of taking a back seat to all the decisions that they’re having to make to keep their student-athletes safe and the fans and all of that. 

Which is a moving target right now. And broadcast may be more important than ever for the next few months.

Erica: I agree. There’s no telling where all this is going to go, but we have our first football game on Saturday, and so fingers crossed, everybody stays healthy and well, and we can get that type of normalcy back for Saturdays. 

All right, Erica and Emma, thank you so much for spending some time with me. I really appreciate it.

Erica: This was a lot of fun. Thanks for inviting us.
