Talk:720p

This article is part of WikiProject Television, an attempt to build a comprehensive and detailed guide to television programs and related subjects on Wikipedia. If you would like to participate, you can edit the article attached to this page, or visit the project page, where you can join the project and/or contribute to the discussion.
This article has not yet received a rating on the quality scale.

This is the talk page for discussing improvements to the 720p article.

"NBC uses the tagline "the nation's finest high-definition standard" in advertising its 720p programming."

But the NBC website claims that all NBC HD broadcasts are in 1080i:
"NBC broadcasts HDTV in the 1080i format, which provides the highest possible resolution to our audience." [1]
Lee M 17:24, 12 Dec 2004 (UTC)
I see that the reference to NBC has now been changed to Fox. It's all a bit academic to me anyway because I'm in the UK... Lee M 17:08, 17 Dec 2004 (UTC)

Which one is better, 720p or 1080i? Cheers.

There is a consensus building that 720p is better for moving images, whereas 1080i is better for video with a lot of still or near-still footage. Thus, movies, nature, and drama content are better in 1080i, like on PBS-HD and NBC. Sports are better in 720p, like on ABC, Fox, and ESPN-HD. Obviously, 1080p60 would be the best of both worlds, but too bad, we can't have it. --Locarno 22:56, 9 Feb 2005 (UTC)
1080p60 just requires more efficient data compression. They'll get there eventually, and then they'll start working on even higher definition systems.
Interesting that the ATSC saw the two systems as content-specific, but the networks, possibly for financial reasons, have each opted for just one. It'll be interesting to see what happens when Sky HD launches in the UK... incidentally (or perhaps not), Britain will use 1080i25 and 720p50, meaning that despite having the same picture resolution as the US there'll still be frame-rate conversion issues to contend with. Not to mention the whole thorny question of up- and down-converting from PAL and NTSC. Lee M 02:50, 8 Mar 2005 (UTC)
Sorry, but what Locarno says is only true when comparing, say, 720p/60 with 1080i/30. If that is the case he should have stated so. If no refresh rates are specified one is forced to assume they are the same, in which case the opposite of what he wrote is true. Motion portrayal is smoother when using interlaced scanning, for a given refresh rate. 83.104.249.240 (talk) 05:25, 13 March 2008 (UTC)
No, that's not a sensible comparison, because now you have 1080i at its maximum temporal resolution against 720p with its bandwidth cut in half for no stated reason. In a "which is better" comparison, you have to assume that both contenders are working at their maximum potential. Anyone who is interested in motion portrayal would use the maximum frame rate available for a given system, so 720p would have the advantage of less line crawl on moving objects. Algr (talk) 15:11, 4 June 2008 (UTC)


Television resolution chart

How does one edit the Television resolution chart on this page? It contains incorrect references to "240i" and "288i". In fact, these are progressive (240p), even when displayed on NTSC and PAL sets. Algr 05:01, 22 February 2006 (UTC)


Fox Revert

Fox uses 720p, but it is possible that some of their affiliates convert this to 1080i before broadcast. Many sets do a poor job with native 720p, and look much better if such a signal is converted to 1080i by something else first. Algr 05:49, 30 April 2006 (UTC)


citation needed?

Some U.S. broadcasters use 720p60 as their primary high-definition format; others use the 1080i standard.

We already have references showing that both 720p and 1080i are in use. Algr 07:05, 13 August 2006 (UTC)
What Algr says is certainly true; however, I would say that the 720p material is almost entirely captured at 24 frames per second (or the NTSC-friendly 23.976 Hz variant). The small amount that isn't is almost certainly captured at 29.97 frames per second in the U.S. The assertion that "some U.S. broadcasters use 720p60 as their primary high-definition format" therefore does require citation. 83.104.249.240 (talk) 05:14, 13 March 2008 (UTC)

Interlace is misunderstood ...

These 2 sentences contain gross misunderstandings ...

Progressive scanning reduces the need to prevent flicker by filtering out fine details, so spatial (sharpness) resolution is much closer to 1080i than the number of scan lines would suggest.

and

The main tradeoff between the two is that 1080i may show more detail than 720p for a stationary shot of a subject at the expense of a lower effective refresh rate and the introduction of interlace artifacts during motion.

Interlace doesn't "filter out fine details". Some interlace systems may low-pass filter vertical detail (only) to a very slight degree, in order to eliminate the interline twitter that occurs when the subject includes narrow horizontal stripes approaching the vertical field resolution. And a horizontal resolution of 1920 pixels is definitely greater than 1280.

Interlace doesn't introduce artifacts during motion. Interlaced video is quite good at capturing and portraying motion. Interlace artifacts are only introduced when interlaced video is converted to progressive scan.

That is completely incorrect. Interlacing can introduce spatio-temporal aliasing (edge flicker, line crawl, etc.) unless the video is properly low-pass filtered in the vertical dimension. Interlacing samples space-time at half the rate of progressive scanning.

Don - both of us are guilty of not signing our posts. Clearly you know what you are talking about... but I don't completely agree with your statements. As you know, aliasing can occur when you digitally sample something that has information at a frequency greater than half the sampling frequency (the Nyquist frequency). Interlace has half of the vertical resolution per field, but the same vertical resolution per frame as progressive scan. If the picture being captured by the video camera contains information with a lot of detail, artifacts can occur. However, there are many reasons why this is not normally a problem. The camera would have to be in focus on the object; otherwise the lack of focus serves as a low-pass filter. If the object or camera is in motion, the motion blur low-pass filters the detail. The accuracy of the lens and the sensor (CCD, CMOS) can low-pass filter the detail. In general, most scenes being shot don't have the type of image detail that causes visible artifacts. Very high-end cameras filter the vertical resolution in order to reduce interline twitter, but most cameras don't... it isn't necessary for most situations. Finally, I would have to say that interlacing samples space-time at the same rate as progressive scanning, to be fair. Space is sampled at half the vertical resolution at twice the frequency... but the trick of offsetting the samples between frames by one line (spatially) gives the eye a moving average of the detail and the motion. If the object is moving, your eye gets updated information at a better refresh rate. If the object isn't moving, your eye still gets all of the detail of the full frame (due to the way the human visual system works). I've got nothing against 720p... it's great stuff. Progressive scan has advantages over interlace, particularly when it comes to editing (like slow-motion detail), or display on progressive-scan displays (which are all the rage for TVs these days, plus they are a part of every PC). But interlace is a technique that works, even if it isn't well understood. Tvaughan1 23:52, 26 October 2006 (UTC)

It is difficult to work out who wrote what in the discussion above as only the last contribution is signed but it seems to me that both contributors are making assumptions without expressing them. Tvaughan1 is assuming the same frame rate between the two scanning modes, such as 1080i/25 and 1080p/25. The former refers to a system with 1080 lines of vertical resolution interlace scanned at 50 fields per second, giving a refresh rate of 25Hz. The latter refers to a system with the same number of lines of vertical resolution progressively scanned at 25 frames per second, also giving a refresh rate of 25Hz. This is the correct way of making comparisons between the two scanning modes, the result being that for static images the two modes give identical results whereas for moving pictures the progressively scanned system gives a better perception of vertical resolution at the expense of the quality of the motion portrayal, for a given refresh rate. The other contributor (Don?) is mistakenly assuming that moving from interlace scanning to progressive scanning necessarily involves doubling the frame rate. He is therefore comparing, say, 1080i/25 with 1080p/50 without explicitly stating so. This is an unfair comparison because the latter uses twice the bandwidth of the former and should naturally look better to the viewer—though I would say not better by a factor of two. 83.104.249.240 (talk) 05:01, 13 March 2008 (UTC)
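
As a concrete illustration of the interline-twitter point discussed above, here is a minimal Python sketch (the frame height and stripe pattern are invented purely for illustration): a test pattern of one-pixel-high horizontal stripes lands entirely in one field, so an interlaced display flashes between bright and dark at the field rate unless the detail is filtered out first.

    # Toy model of interline twitter: detail that alternates on every
    # single line ends up concentrated in one field of the frame.
    HEIGHT = 8  # illustrative frame height

    # Alternate dark (0) and bright (1) lines down the frame.
    frame = [i % 2 for i in range(HEIGHT)]

    even_field = frame[0::2]  # lines 0, 2, 4, ... shown in one field
    odd_field = frame[1::2]   # lines 1, 3, 5, ... shown in the other field

    print("full frame:", frame)       # the detail is present in the frame...
    print("even field:", even_field)  # ...but this field is all dark
    print("odd field: ", odd_field)   # ...and this one is all bright, so the
                                      # area flickers at the field rate
    # Averaging adjacent lines (a vertical low-pass filter) before
    # interlacing removes the stripes and the flicker, at the cost of the
    # fine vertical detail, which is the trade-off described above.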

720p vs. 1080i

Just a suggestion, but I think it would be better to have a separate article for 1080i vs. 720p, and on that page include a table of what broadcaster uses what standard. Maybe even a separate page for that, updating as each one goes along? It would take some clutter out of this article, and would probably make a more sensible link on the 1080i page. Uagent 10:41, 5 February 2007 (UTC)

In order to compare the quality of motion captured and displayed at 720p with 1080i the frame rate of each must be specified or the results are meaningless. This section as it stands doesn't make sense. Is the author comparing 1080i/30 with 720p/30 or with 720p/60, or even with the much more common 720p/24? 83.104.249.240 (talk) 03:47, 13 March 2008 (UTC)

768

Almost all LCD televisions in the 20"-40" range have resolutions of 1366x768. (Plasmas seem to be 1024x768, which is 4:3, but they have rectangular pixels.) Wouldn't this provide a worse image than an LCD with only 720 lines? With LCD computer monitors, you have to set your desktop resolution to the native resolution of your monitor or you'll get blurry images. Maybe because computer monitors deal with text, the problem is more pronounced on them than on televisions.

Why does 720p even exist?

Does anyone know how this, or any of the other HD standard resolutions, came about? The numbers simply don't make sense to me. Can someone who really knows this stuff comment on my stream of consciousness below?

The maximum theoretical resolution of NTSC is 525 lines, and of these, 486 lines are visible. When they introduced DVDs, the obvious solution was to round this off to 480 lines and provide the rest of the 525 "on the fly" in the players. So far, so good.

And then comes HDTV. This is where I get really confused. The painfully obvious selection for HDTV would be 480 x 2 = 960. This is very close to the existing 1080 standard, but because it's an even multiple it makes the scaling a whole lot simpler. So why isn't 1080 actually 960? OK, so maybe you want a standard that can handle NTSC, PAL and SECAM, but that suggests you'd want to find a common factor of 480 and 576, which suggests a number based on a multiple of 96... like 960.

The history section notes that 720p was originally an analog system that was later converted to digital. OK, but how is it that they selected this format? I seem to recall the Japanese standard was something close to 1050 (which makes perfect sense given they used 525 at the time), so it seems like they simply selected this number "out of thin air". Does anyone know why it's 720 specifically?

And how was it that this selection of two incompatible standards has managed to survive to this day? It's one thing to pick a lower resolution standard for an analog system, but since everything went digital the cost of the extra resolution is pretty small. Why didn't it get eliminated somewhere along the line? Or, if the article is correct and the resolution of 720p is actually as good as 1080i, then how is it that 1080i survived? How did we end up in this ridiculous situation of having two standards that don't even have a common multiple?

Maury (talk) 21:57, 9 January 2008 (UTC)

I'm not sure about the exact numbers, but it takes about the same analog bandwidth to transmit 720p as 1080i:
720x1280x60=55,296,000
1080x1920x30=62,208,000
In the early '90s the computer industry experimented with interlaced computer displays, and consumers absolutely hated them, so they wanted to make sure that progressive displays were standard. For things like reading text on a screen, a 720p CRT would look much better than 1080i. Algr (talk) 00:07, 26 February 2008 (UTC)
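
For what it's worth, the arithmetic above can be checked with a few lines of Python (counting active luma pixels only and ignoring blanking, chroma and compression; the helper function is purely illustrative):

    def pixels_per_second(width, lines_per_update, updates_per_second):
        # Active pixels delivered per second, counting one progressive frame
        # or one interlaced field as a single "update".
        return width * lines_per_update * updates_per_second

    p720_60 = pixels_per_second(1280, 720, 60)   # 60 full 720-line frames per second
    i1080_30 = pixels_per_second(1920, 540, 60)  # 60 fields of 540 lines = 30 frames per second

    print(f"720p/60 : {p720_60:,} pixels per second")   # 55,296,000
    print(f"1080i/30: {i1080_30:,} pixels per second")  # 62,208,000
    print(f"ratio   : {i1080_30 / p720_60:.3f}")        # about 1.125, i.e. roughly comparable
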
The above assumes that the 720p signal runs at twice the frame rate of the 1080i signal, in this case 60 Hz and 30 Hz respectively. However, that is by no means universally the case. 720p/24 is in fact much more common.
I believe that a vertical resolution of 1080 pixels is derived from the number of active lines used by the NHK 1125-line analogue HiVision system. The 1080-line standard was designed with interlace in mind, and with progressive scanning as an option. The 720-line standard was designed only for progressive scanning from the outset, and the 33% reduction in line count was felt to give a similar perception of vertical resolution as the 1080-line interlaced standard, at a given refresh rate. Ideally both standards would have the same number of horizontal pixels, because interlace has no effect on horizontal resolution, but maintaining an aspect ratio of 16:9 with square pixels means that the 720-line standard needs 1280 of them while the 1080-line standard needs 1920.
So, while it can be argued that 720p gives a similar perception of vertical resolution to 1080i at the same frame rate, 1080i has potentially a significantly higher horizontal resolution. I say "potentially" because 1920 pixels of horizontal resolution are not often achieved in practice. Until quite recently HD television cameras had CCDs with only 1440 (rectangular) pixels horizontally. The most common HD videotape recording formats (HDV, DVCPro-HD and HDCAM) down-sample each line and record only 1440 anamorphic samples. In the case of DVCPro-HD working at 1080i/29.97 it actually records only 1280 samples per line, levelling the playing field a little, but not fully, because at 720p/24 it records only 960 samples per line. The only two videotape formats in common use that record the full horizontal resolution of 1920 samples per line are HD-D5 and HDCAM SR. Then again, in order to save bandwidth, broadcasters often reduce the horizontal resolution of the 1080i signal to 1440 anamorphic pixels when coding it up for transmission.
How many people have a 1920×1080 television display, anyway? I know the number is growing, but most displays branded "HD Ready" have only around 1360×760 pixels. Even if you've paid the huge premium and invested in a real 1920×1080 display, are you sure you're getting 1:1 pixel mapping, and can you see the difference anyway? It really is a minefield. 83.104.249.240 (talk) 04:36, 13 March 2008 (UTC)
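
A short sketch of the numbers in the comment above may be useful; it assumes square pixels for the broadcast rasters and treats the 1440-sample recording formats as anamorphic (the variable names are mine, not from any standard):

    from fractions import Fraction

    # With square pixels, a 16:9 raster needs width = height * 16 / 9.
    for height in (720, 1080):
        width = height * 16 // 9
        print(f"{height} lines at 16:9 with square pixels -> {width} x {height}")
    # 720 -> 1280 x 720 and 1080 -> 1920 x 1080, matching the figures above.

    # Formats that record only 1440 samples per line of a 1080-line picture
    # are stretched back to 1920 on playback, i.e. they use anamorphic pixels.
    pixel_aspect = Fraction(1920, 1440)
    print(f"1440-sample recording: pixel aspect ratio {pixel_aspect}")  # 4/3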

720p vs 1080p - Screen Size

If a TV's resolution is only, let's say, 1366 by 768 pixels, is there a noticeable difference between 720p and 1080p on a screen of this size? --Just James T/C 14:45, 3 June 2008 (UTC)

Because 720 and 768 are so close, there is a fair amount of loss when images are scaled - every 15 lines in the signal has to be stretched into 16 lines on the display, resulting in a lot of detail falling between the cracks. A true 720p set would look much better, given a 720p signal. The 1080p signal would suffer less from this effect, but would not look as good as a hypothetical 1366 by 768 image. (Such as if the set were being used as a computer monitor.) Algr (talk) 15:21, 4 June 2008 (UTC)
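
The scaling factors behind this can be written out in a couple of lines of Python (the panel and source line counts are taken from the discussion above; the 1080 case is included for comparison):

    from fractions import Fraction

    panel_lines = 768  # typical "HD Ready" LCD panel height
    for source_lines in (720, 1080):
        scale = Fraction(panel_lines, source_lines)
        print(f"{source_lines} -> {panel_lines} lines: scale factor {scale} ({float(scale):.4f})")
    # 720  -> 768: 16/15, i.e. every 15 source lines become 16 panel lines
    # 1080 -> 768: 32/45, a downscale in which detail is discarded rather
    #              than stretched into existence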