To 1080P, or not 1080P
April 2, 2007 by Dave Haynes
I usually feel like I need a little cheat sheet on an armband when I get ensnared in some acronym-riddled chat with video or PC nerds. I still don’t have all those SXGA, SVGA, WXGA and whatever GA things all sorted in my head, and now I am having to deal with people who want to know whether 720P or 1080I or P is the way to go for HD.
I’m still pulling a few prospective clients over the hump from just using standard def TV, never mind sorting out the merits of one HD standard over another. So I am definitely no authority. When in doubt, I research. And in this case, I see a wide range of opinions.
The common thread seems to be that for running video, and using these things as displays as we do in this industry, there’s not a whole pile of benefit in going all the way to 1080. But for monitors – for people who want to sear their retinas sitting up close and personal to the screens – then 1080P has benefit.
In looking around, I found tech guru Walt Mossberg, of the Wall Street Journal, has a pretty good explanation of where things sit.
Q: You didn’t mention a burning issue in the HDTV arena right now: whether to spend the extra money to get a set that can handle the highest resolution, called “1080p”. What’s your view on this issue?
A: I didn’t mention it because I don’t think it’s an important factor at all. Most HDTV sets max out at a resolution called “1080i,” which is gorgeous and is used by several of the TV networks currently broadcasting in HD. Theoretically, a resolution called “1080p” is even better (I won’t go into the boring techie details of the difference) and it will be used by some new gadgets, like certain game consoles and players that handle the new disk formats battling to succeed DVD. It can also be used by PCs connected to a TV set.
But there is no TV network using 1080p, or planning to use it, anytime soon. Plus, most people can’t tell the difference between 1080i and 1080p, especially at the distances at which people typically sit to view large-screen TVs.
So, unless you are a techie, or a hard-core gamer or videophile — or you plan to use your HDTV mainly as a PC screen — I see no reason to spend a penny extra, or wait a day more, just to get a set capable of handling 1080p. If you like a set for other reasons, and it happens to have 1080p capability, think of it as a bonus. But I wouldn’t make 1080p a major criterion for choosing a set.
When it comes to 1080i versus 720p, the difference appears to be negligible.
With 720p video you get up to 60 full frames per second, versus an effective 30 for 1080i (its 60 interlaced fields pair up into 30 full frames). So it is as good or better for moving images. For static images, however, 1080i will give you higher quality visuals. Whether anyone other than professional photographers will be able to see the quality improvement is another matter.
Oh, you’d see the difference. 1080p is decidedly better than 1080i/720p.
The latter two take similar bandwidth; as you say, a good rule of thumb is “720p better for moving, 1080i better for static images”.
The point most people forget is: YOU would see the difference, if you are the customer, excitedly staring from 12 inches away at his new Digital Signage investment playing on a 50″ LCD. Unless your signage includes flight delay details, no-one else will notice the difference; certainly not the viewer at whom this is all aimed.
And keep in mind: 1080p is double the bandwidth/data traffic cost of 720p/1080i. And while you can buy cheap 720p cameras, and there is 720p content available, there’s not much 1080p.
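That "double the bandwidth" claim checks out on the back of an envelope. Here's a quick sketch comparing raw, uncompressed pixel throughput for the three formats (nominal figures only; real encoded bitrates depend on the codec and the content):

```python
# Compare raw pixel throughput for the three common HD formats.
# Interlaced formats scan half the lines per field, so 1080i at
# 60 fields/s moves roughly the same pixels as 720p at 60 frames/s.

formats = {
    # name: (width, height, frames_or_fields_per_second, interlaced)
    "720p60":  (1280, 720, 60, False),
    "1080i60": (1920, 1080, 60, True),   # 60 fields/s = 30 full frames/s
    "1080p60": (1920, 1080, 60, False),
}

for name, (w, h, rate, interlaced) in formats.items():
    lines = h // 2 if interlaced else h  # interlacing halves lines per pass
    mpix_per_sec = w * lines * rate / 1e6
    print(f"{name}: {mpix_per_sec:.1f} Mpixels/s")

# 720p60:  55.3 Mpixels/s
# 1080i60: 62.2 Mpixels/s
# 1080p60: 124.4 Mpixels/s  -- roughly double the other two
```

So 720p and 1080i really do sit in the same ballpark, and full 1080p costs about twice as much to move around, before compression even enters the picture.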