Talk:HDTV blur
Wrong article title?
I can't help noticing that of the six factors itemized in the opening of this article, only one really has much of a connection to HDTV. LCDs, 3:2 pulldown, telecine lag, and intentional CGI motion blur all predate HDTV. Algr (talk) 07:41, 23 December 2007 (UTC)
I wouldn't disagree with these points. "HDTV blur" connotes the various artifacts more generally than any other single term. Also, this informative page is meant to ultimately bubble up to the top of a Google search for "HDTV blur" and serve as an explanation for people who don't understand where the blurring on their HDTV set is coming from.
Many less tech-savvy consumers who are buying high-definition sets to replace their tubes are noticing the blurriness and wondering "why is my HDTV blurry?" Even though the subject matter touches a multiplicity of technologies and topics, that is the phrase I felt the average person would most commonly search for when trying to understand the problem.
There are similar resources that predate this wiki page, such as "HDTV lag", that inspired me to create this page. Ahigh (talk) 07:58, 7 January 2008 (UTC)
Half Motion Blur
I removed the references to doubling the frame rate halving the blur. The claim was unreferenced and likely rests on the false assumption that the perception of blur *must* be linearly related to the refresh time. Without a reference or expert knowledge that this is the case (it could be, but it needs to be tested, not assumed), please don't add this claim again. Logicnazi 21:50, 19 August 2007 (UTC)
Understand retinal blurring before you jump on your high horse, nazi boy
I appreciate the contributions (or deletions, as it were). However, if you read through the references (especially the Poynton reference) and actually understand how retinal blurring works, you'd realize that 120 Hz doesn't solve retinal blurring; strobing does. 120 Hz reduces retinal blur by 50% versus 60 Hz. That is all.
I've been making video games now for over 20 years, including an arcade racer from Atari Games coin-op, San Francisco Rush 2049 (my name is Aaron Hightower). I know what I'm talking about, and I created this page to educate people.
Obviously, it's open to editing by anyone who knows more than me, but you can't say that I don't know what I'm talking about when I say 120 Hz cuts retinal blurring in half. It does. I know, because I know how it works, and you clearly don't. User:Ahigh 8:37pm, September 17, 2007. —Preceding signed but undated comment was added at 03:39, 17 September 2007 (UTC)
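For readers trying to picture where the 50% figure comes from, here is a minimal sketch of the geometric argument: on a sample-and-hold display, the eye keeps tracking a moving object while each frame stays lit, so the image smears across the retina by roughly the tracking speed multiplied by the hold time. The pan speed and the 2 ms strobe flash below are arbitrary illustrative assumptions, not figures taken from Poynton.

    # Sketch of the smear-width arithmetic behind the sample-and-hold argument.
    # The pan speed and strobe flash duration are assumed example values.
    def smear_width_px(pan_speed_px_per_s, hold_time_s):
        """Approximate retinal smear width, in source-image pixels."""
        return pan_speed_px_per_s * hold_time_s

    speed = 960.0  # assumed horizontal pan of 960 px/s

    cases = [
        ("60 Hz sample-and-hold", 1 / 60),         # frame lit for the full ~16.7 ms
        ("120 Hz sample-and-hold", 1 / 120),       # lit for ~8.3 ms -> half the smear
        ("strobed backlight, 2 ms flash", 0.002),  # hold time set by the flash, not the frame
    ]
    for label, hold_time in cases:
        print(f"{label}: ~{smear_width_px(speed, hold_time):.1f} px of smear")

With these numbers the 120 Hz case shows half the smear of the 60 Hz case (~8 px versus ~16 px), while the short strobe flash cuts it much further, which is the distinction the comment above is drawing.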
- Where is a source? Poynton does NOT say what you want. As far as I could see, he makes no claims at all about a quantitative amount by which perceived blurring is reduced. He makes claims about 50% changes in some other things, but NOT about the actual apparent perceptual blur. No one doubts that going to 120 Hz versus 60 Hz reduces retinal blur; the question is whether it reduces it by 50%. Hell, it's not even clear what it *means* to reduce retinal blur by 50%. My best guess at an interpretation is that if we sat down a bunch of people, showed them a bunch of videos (not telling them which was which), and asked them to estimate the relative amounts of blur, they would say one has half as much blur as the other, but I didn't see any suggestion of this in Poynton or anywhere else.
- Remember, just because something reduces the effects that *cause* blurring by 50% doesn't mean that it reduces the perceived blur by 50%. For instance, doubling the intensity of a sound wave doesn't double the perceived volume, and the same effect could be at play here. Besides, at some frame rate there is going to be NO perceived blurring, so it can't always be the case that doubling the frame rate halves perceptual blurring. In any case, I'm fine with the article as it stands now (saying it reduces blurring), but I would like you to point out where Poynton contradicts me before putting back the 50% figure. Logicnazi 16:01, 23 September 2007 (UTC)
50 Hz
Are the problems greater or lesser at 50 Hz? (Well, obviously there isn't the 3:2 pulldown problem - but what about the other factors?) zoney ♣ talk 15:14, 20 September 2007 (UTC)
I don't have as much experience with 50 Hz, but all the issues are the same in general, except, as you mention, the 3:2 issue. There really aren't any significant differences besides that. I do know that the flicker of 50 Hz is much more noticeable than 60 Hz on a CRT. Another difference is that games can sometimes achieve a solid 50 Hz performance more easily than a solid 60 Hz performance, and therefore may actually look smoother on a CRT at 50 Hz compared to another CRT at 60 Hz.
The strobe effect on LCDs (LED backlighting) can be designed to flicker less than a CRT, trading off a slight bit of retinal blurring. I have yet to do my own personal tests on the Aptura and LED Motion Plus technologies. I do know that Aptura sets are more popular in the 50 Hz format. Ahigh 03:45, 7 November 2007 (UTC)
Retinal blurring
It's not "effects" that cause retinal blurring. It is sample-and-hold display technology that causes retinal blurring.
Keeping sample and hold and doubling the update frequency halves the retinal blurring caused by the eye tracking moving objects. It's just that simple. Again, prove to me that you understand retinal blurring, and we can move on. You said Poynton doesn't make claims about a linear relationship between hold time and the amount of retinal blurring due to eye tracking, but he explains it very clearly. And in fact, keeping sample and hold and doubling the frame rate halves retinal blurring.
Since you're the logic nazi, let's assume that I am wrong, and I will demonstrate that the result is absurd.
Retinal blurring is the result of the pixels staying lit while your eye sweeps past the image.
If there isn't a linear relationship between the hold time and the amount of retinal blur, then the relationship would have to be nonlinear. So what nonlinear relationship would it be? The smear on the retina is simply the distance the eye travels while the pixels are lit, so it is absurd to believe that the reduction in retinal blur would be nonlinear in the hold time; therefore it is proven.
Is that enough of a proof for you? I suspect the problem is that you simply don't understand retinal blurring, or you simply refuse to believe it exists. It's a very subtle effect and something that very few people understand. But that doesn't negate its existence, or comprehension of its existence by people who understand it. Ahigh 03:45, 7 November 2007 (UTC)
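To make the linearity claim concrete, here is a small numerical sketch (my own illustration, not taken from Poynton) of the eye-tracking model described above: the display holds each frame at its sampled position while the eye keeps tracking the real motion, and the smear is the range of offsets between the two over one hold period. The pan speed is an assumed example value.

    # Numerical check of the claim that eye-tracking smear on a sample-and-hold
    # display scales linearly with hold time. Pan speed is an assumed example.
    def retinal_smear_px(pan_speed_px_per_s, hold_time_s, steps=1000):
        offsets = []
        for i in range(steps):
            t = hold_time_s * i / steps       # time within one held frame
            displayed = 0.0                   # image frozen at its sampled position
            eye = pan_speed_px_per_s * t      # eye keeps tracking the real motion
            offsets.append(displayed - eye)   # where that light lands on the retina
        return max(offsets) - min(offsets)    # width of the accumulated smear

    speed = 960.0  # assumed pan speed in px/s
    for hz in (60, 120, 240):
        print(f"{hz} Hz hold time: smear ~ {retinal_smear_px(speed, 1 / hz):.1f} px")

Each doubling of the refresh rate halves the computed smear width (~16 px, ~8 px, ~4 px with these numbers), which is the geometric relationship being argued for here; whether the *perceived* blur also halves is the separate question raised in the previous section.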
120 Hz adds to lag?
I was wondering if there was a source that mentioned why 120 Hz technology adds to lag. I assume it's simply because it has to process an extra frame, every frame. It doesn't just duplicate the original frame? So a 120 Hz TV would actually be a bad idea when attempting to eliminate lag? A lot of clueless salespeople would try to say the opposite. Someone looking for a TV for video games should stay away from 120 Hz TVs? There is an extraordinary amount of confusion about this, and Wikipedia could clear things up. --SkiDragon 05:37, 14 October 2007 (UTC)
On a 60 Hz signal, the frame period is 16.66 ms. Half of that is 8.33 ms, which is the 120 Hz period. In order to generate the in-between frames, the circuitry needs access to both the previous and the next frame, and waiting for the next frame introduces one frame of latency to the signal. This does not include any additional latency due to signal processing. I do not know a typical latency for the processing that generates the interpolated imagery itself, but I would not be surprised by very large numbers (for example 50 ms). Most players can notice 200 ms of lag or more. When the total lag is less than one frame period, this is the ideal case. One frame of lag is the typical lag inserted by a good LCD. 120 Hz sets can theoretically do just as well as typical LCDs. But in reality you will get the typical lag for an average LCD plus the lag associated with the image processing, which will be at least half a frame period of the input source (e.g. 8.3 ms). Let me know if this is clear. Ahigh 05:19, 29 October 2007 (UTC)
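As a rough illustration of that budget, here is a minimal sketch that just adds up the numbers from the comment above; the 50 ms processing figure is the guess mentioned there, not a measured value, and real sets will vary.

    # Back-of-the-envelope lag budget for 120 Hz motion interpolation of a 60 Hz
    # source, using the figures discussed above (the processing cost is a guess).
    input_frame_period_ms = 1000 / 60     # ~16.7 ms per source frame

    lookahead_ms = input_frame_period_ms  # interpolation must wait for the *next* source frame
    panel_lag_ms = input_frame_period_ms  # ~1 frame of lag from a typical LCD on its own
    processing_ms = 50.0                  # assumed cost of generating the in-between frames

    total_ms = lookahead_ms + panel_lag_ms + processing_ms
    print(f"Typical LCD alone:          ~{panel_lag_ms:.0f} ms")
    print(f"With interpolation enabled: ~{total_ms:.0f} ms")

Under those assumptions the interpolation path adds roughly 65-70 ms on top of the panel's own frame of lag, which is the reason given above for expecting 120 Hz interpolation to hurt rather than help when lag matters.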

