US Lobby Group Warns Ultra HD Screens Can Be Energy Hogs


Via Display Daily

The emergence of LED-lit LCD displays and the disappearance of plasma displays would probably, reasonably, make end-users think that energy consumption is no longer much of an issue in digital signage deployments. LEDs, after all, are far more energy efficient than the older technologies that lit up flat panel displays.

Turns out there is a new issue with energy usage – 4K panels.

The American environmental action group Natural Resources Defense Council suggests, in a new report, that 4K displays consume 30% more energy than regular HD displays.

As reported in Display Daily, the NRDC’s key findings are:

  • There are 300 million installed TVs in America. Without additional efficiency improvements, a national switch from HD televisions with 36-inch and larger screens to UHD TVs, alone, would cause America’s annual electricity use to jump by 8 billion kilowatt hours – three times the amount consumed by all the homes in San Francisco in a year and as much electricity as is generated by 2.5 large (500-megawatt) power plants. 
  • The switch to UHD also would create an additional 5 million metric tons of carbon pollution from generating the extra electricity required.
  • One-third of all new televisions sold today have screens 50 inches or greater. TV power use often increases with screen size, and NRDC’s analysis showed that some of the really large, least efficient models used as much annual energy as a new refrigerator.
  • The new High Dynamic Range, or HDR, feature that provides brighter colors and deeper shadows could significantly increase national TV energy consumption. Our testing showed the HDR version of a movie used 47 percent more power than the same title in 4K format. More attention is needed to understand HDR energy use and reduce it.
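The power-plant equivalence in the first bullet checks out on the back of an envelope. A minimal sketch, assuming a typical baseload capacity factor of about 0.73 (my assumption, not a figure from the NRDC report):

```python
# Sanity check on the NRDC figure quoted above: does 8 billion kWh/yr
# really correspond to roughly 2.5 large 500 MW power plants?
# The capacity factor is an illustrative assumption, not from the report.
EXTRA_KWH = 8e9          # NRDC's projected extra annual electricity use
PLANT_MW = 500           # "large" plant size cited in the report
CAPACITY_FACTOR = 0.73   # assumed fraction of the year at full output

kwh_per_plant = PLANT_MW * 1000 * 24 * 365 * CAPACITY_FACTOR
plants_needed = EXTRA_KWH / kwh_per_plant
print(f"Plants needed: {plants_needed:.1f}")  # -> Plants needed: 2.5
```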

The report suggests consumers can cut several hundred dollars off the lifetime energy costs of a new UHD TV by buying models with the ENERGY STAR label, turning Automatic Brightness Control on, and avoiding features – like the quick-start option on Internet-connected televisions – that waste power sitting in standby mode.

The report is about consumer TVs, not commercial panels, but presumably some of the energy implications are similar for the 4K panels going into signage deployments.

I did a quick check of commercial displays and found a manufacturer’s 46-inch UHD had a typical energy consumption of 87 watts, whereas a 48-inch 1080p panel (lesser resolution, but 700 nits versus 500 – so brighter) typically needed 44 watts, or roughly half of the 4K display’s needs. It’s hard to get apples-to-apples comparisons, so my comparison is admittedly way less than scholarly.
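Those two panel figures translate into a meaningful operating-cost gap over a year. A rough sketch, assuming a 24/7 signage duty cycle and a $0.12/kWh electricity rate – both illustrative assumptions, not figures from the report or the spec sheets:

```python
# Back-of-envelope: annual energy gap between the two panels above.
# Duty cycle (24/7) and electricity rate are illustrative assumptions.
UHD_WATTS = 87        # 46-inch UHD panel, typical draw
HD_WATTS = 44         # 48-inch 1080p panel, typical draw
HOURS_PER_YEAR = 24 * 365
RATE_PER_KWH = 0.12   # assumed commercial rate, USD

extra_kwh = (UHD_WATTS - HD_WATTS) * HOURS_PER_YEAR / 1000
extra_cost = extra_kwh * RATE_PER_KWH
print(f"Extra energy: {extra_kwh:.0f} kWh/yr, extra cost: ${extra_cost:.2f}/yr")
# -> Extra energy: 377 kWh/yr, extra cost: $45.20/yr
```

For a signage network of any size, multiplying that per-screen figure across hundreds of panels is where the 4K energy question starts to matter.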

The NRDC makes the full report and an executive summary available online.

Dave Haynes

Editor/Founder at Sixteen:Nine
Dave Haynes is the founder and editor of Sixteen:Nine, an online publication that has followed the digital signage industry for more than a decade. Dave does strategic advisory consulting work for many end-users and vendors, and also writes for many of them. He's based near Toronto.