Talk:Flicker fusion threshold
Request for more info.
There is just a small blurb about autistics being able to see the flicker at apparently a much higher rate. Does anyone have any more information on that? The thing that brought me to this article was that the other day my son (who is autistic) and I were walking next to a wooden fence with the sun on the other side, so as we walked the sun "flickered." My son said, "Look mommy, I'm watching TV." I asked him if TV flickered like that and he said "yes." I asked "all the time?" and he said "yes." I'm curious to understand the ways in which he perceives his world, so if anyone has any more information on this I'd be interested. I'll continue my search and if I find anything I'll bring it back here.
Request for clarification
In the article it says:
-
- "In actual practice, movies are recorded at 24 frames per second, and TV cameras operate at 25 or 30 frames per second, depending on the TV system used. Even though motion may seem to be continuous at 25 or 30 fps, the brightness may still seem to flicker objectionably. By showing each frame twice in cinema projection (48 Hz), and using interlace in television (50 or 60 Hz), a reasonable margin for error or unusual viewing conditions is achieved in minimising subjective flicker effects."
Questions:
- If the human threshold is perceived as 16, why would someone use a 24 fps frame rate?
- What is the difference between chemical film and digital imagery that makes the rate jump from 24 to 30?
- If you show each of the 24 frames twice (each frame for the same duration as in normal 24 fps), then you are actually showing the film more slowly, at half the speed (each frame being static on the screen for twice the time). This makes no sense.
Thank you. --Procrastinating@talk2me 14:33, 30 May 2006 (UTC)
Answers: In response to your last question, you must understand the concept of a shutter. A cinema film projector's shutter operates at 48 Hz, meaning an image flashes on screen 48 times a second. Since there are 24 frames per second being projected from the film, each frame is flashed onto the screen twice in a row, giving the illusion that there are 48 different frames being projected, even if half the frames are the same. It's just to trick the mind. 24 frames shown per second is actually very jerky; 48 is more aesthetically pleasing.
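To make the shutter arithmetic above concrete, here is a minimal Python sketch of the double-flash scheme described in that answer (the 24 fps and two-flashes-per-frame figures are the ones from this thread). The key point is that doubling the flashes changes the flicker rate, not the playback speed:

    # Double-shutter arithmetic for cinema projection.
    FRAMES_PER_SECOND = 24      # distinct images recorded on the film
    FLASHES_PER_FRAME = 2       # the shutter opens twice for each frame

    flicker_rate_hz = FRAMES_PER_SECOND * FLASHES_PER_FRAME   # 48 flashes of light per second
    frame_hold_s = 1.0 / FRAMES_PER_SECOND                    # each frame still occupies 1/24 s of screen time

    print(flicker_rate_hz)   # 48 -> flicker is pushed above most viewers' fusion threshold
    print(frame_hold_s)      # ~0.0417 s -> the film does not play any slower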
In response to your second-to-last question, 30 fps versus 24 fps dates back to the early 1940s, when the NTSC standard was set by the National Television System Committee. It has to do with the analog bandwidth allowed by the technology of that time. Digital has nothing to do with it.
- To expand on this point, NTSC's 30 fps rate (actually 29.97 fps) was based on the 60 Hz alternating current of North American electricity systems. The AC provides a built-in oscillation that set the field rate for NTSC (60 half-frame fields per second = 30 full frames per second). Europe, which had a 50 Hz AC system, chose 25 fps television for a similar reason.--NWoolridge 13:48, 6 June 2006 (UTC)
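A small Python sketch of the field/frame arithmetic above (the 1000/1001 adjustment that produces 29.97 is standard colour-NTSC practice, stated here from general knowledge rather than from this thread):

    # Mains-locked television field and frame rates.
    FIELDS_PER_SECOND_NTSC = 60    # locked to 60 Hz North American AC
    FIELDS_PER_SECOND_PAL = 50     # locked to 50 Hz European AC

    frames_ntsc = FIELDS_PER_SECOND_NTSC / 2   # 30 full frames per second (two interlaced fields each)
    frames_pal = FIELDS_PER_SECOND_PAL / 2     # 25 full frames per second

    # Colour NTSC was later slowed by a factor of 1000/1001, giving the familiar 29.97.
    frames_ntsc_colour = 30 * 1000 / 1001
    print(frames_ntsc, frames_pal, round(frames_ntsc_colour, 2))   # 30.0 25.0 29.97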
To the first question: why would one want to make a film at 16 fps? While a human may be able to perceive it as fluid motion, it may be very uncomfortable to watch. Filmmakers wanted the rate low so they would use less film stock and not run out as fast (today, a roll of Panavision film stock lasts only 3 minutes, even at 24 fps -- imagine how much less it would last at 30 or higher!). But they discovered that as you go below 24, it becomes uncomfortable to watch. 24 became the standard.
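The film-stock trade-off scales inversely with frame rate. A quick Python sketch, taking the 3-minute figure quoted above at face value:

    # Runtime of a fixed length of film stock at different frame rates.
    # If a roll lasts `minutes_at_24` minutes at 24 fps, the same footage holds
    # 24 * 60 * minutes_at_24 frames, so at another rate it lasts proportionally less or more.
    def runtime_minutes(minutes_at_24, fps):
        return minutes_at_24 * 24.0 / fps

    for fps in (16, 24, 30, 48):
        print(fps, round(runtime_minutes(3.0, fps), 2), "minutes")
    # 16 -> 4.5, 24 -> 3.0, 30 -> 2.4, 48 -> 1.5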
- There are two concepts being conflated in the article and we should probably try to disentangle them. One concept is how often a light has to flash before humans can no longer perceive any flickering of the light. That number varies a lot depending upon at least the following factors:
- The particular subject you're testing (you and I probably vary in our sensitivity to flicker)
- How fatigued the subject is (flicker sensitivity goes up as you get more tired)
- No, it goes down.
- How bright the light source is (flicker is seen more easily with brighter sources)
- Whether the subject is viewing the flicker with their central or peripheral vision (peripheral vision is much more sensitive to flicker)
- Whether the object is stationary or moving across your visual field, since the stroboscopic effect can be detected at much higher frequencies than the flicker fusion threshold. —Preceding unsigned comment added by Mdrejhon (talk • contribs) 18:29, 22 April 2008 (UTC)
- Many folks can see flicker up to about 75 Hz under typical "viewing the computer monitor" conditions. Monitors tend to be bright, viewed at least partially with your peripheral vision, and used by tired users, so they tend to get the worst of all worlds, leading people to choose refresh (flicker) rates of 75 or even 85 Hz. Television, by comparison, tends to be viewed with your central vision so 60 Hz was deemed to be "good enough" and there were other good technical reasons to choose 50 Hz in parts of the world with 50 Hz power, even though nearly everyone can perceive the 50 Hz flicker.
- The other concept is how many discrete images must be presented each second before we perceive them as representing continuous motion. This number is a lot lower than flickers/second. Most people see 24 frame/second movies as being "continuous" and many cartoons were only drawn at 12 frames/second. TV, at 25 or 30 frames per second (50 or 60 half-frame fields), is seen as fine, and if you've ever seen 50 frame/second movies, you'd say "Wow! That's great!*" And even though a movie is only showing you 24 discrete images per second, the projector shutter is arranged so the light source flashes 2 or 3 times per image, leading to a 48 or 72 Hz flicker rate.
- So maybe we need to re-edit the article to make this all clear.
- What does everyone else think?
- Atlant 15:54, 30 May 2006 (UTC)
- *This leads to a third confounding factor. Computer gamers tend to talk about "frame rates" as a bragging point of how studly their computer hardware is. It's perceived as "better" if their computer hardware can generate, say, 150 frames/second rather than 37 frames per second and, up to a point, they're correct: as with cartoons, movies, and TV, more discrete images per second leads you to perceive motion as being smoother. But there's a technological limit: once they exceed the refresh (flicker) rate of their monitors, the additional frames they're generating aren't even displayed, so all those "go fasts" just go to waste. If the monitor is running at 85 Hz, there's no point generating more than 85 discrete images per second because it just means portions of the additional images will be thrown away.
- Even if entire frames are never displayed, there are still reasons why very high framerates are desirable. Most of them could be seen as rooted in suboptimal-to-bad programming, e.g. the various "magic framerates" in id Software engines, or the higher responsiveness at higher framerates. Running at a consistent 150 fps or even 333 fps can give you a competitive edge in some games; it isn't necessarily done for bragging rights.
- Actually, I agree: 150 fps and 333 fps provide a competitive advantage. The advantage doesn't come from the flicker fusion threshold, or from competition gamers being able to tell the difference in fluidity; it just means that the pictures arrive more freshly rendered at the monitor's next refresh. An image rendered only 1/333 of a second ago displayed at the monitor's next refresh, versus an image rendered 1/150 of a second ago, gives a competition gamer a few milliseconds' advantage in reaction time -- just like the 100 meter sprint, where milliseconds matter at the starting line. Even if two competition gamers have exactly the same reaction time and cannot tell anything faster than 60 Hz or 85 Hz apart, the gamer with 333 fps will beat the gamer with 150 fps, because the image arrives more freshly rendered at the monitor by a few milliseconds. Yes, it results in a lot of wasted images that never reach the monitor, but it ensures that what does get displayed is the freshest possible frame, rendered (i.e. 3D generated) the smallest possible fraction of a second before being displayed at each refresh. True, this is not the only factor; there are other complexities such as monitor refresh lag, since lots of LCD monitors have built-in frame buffering, which is one reason many competition gamers still prefer CRTs. —Preceding unsigned comment added by Mdrejhon (talk • contribs) 18:17, 22 April 2008 (UTC)
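A rough Python sketch of the "freshness" argument above (a sketch only; it ignores monitor input lag and v-sync, and assumes the frame shown at each refresh was completed at most one render interval earlier):

    # Worst-case age of the newest completed frame at the moment the monitor refreshes,
    # assuming rendering and refresh are not synchronised.
    def worst_case_frame_age_ms(render_fps):
        return 1000.0 / render_fps

    for fps in (60, 85, 150, 333):
        print(fps, round(worst_case_frame_age_ms(fps), 1), "ms")
    # 60 -> 16.7, 85 -> 11.8, 150 -> 6.7, 333 -> 3.0
    # 333 fps hands the monitor an image a few milliseconds fresher than 150 fps,
    # even though most of the rendered frames are never displayed.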
Great reply, please revise this article
Thank you Atlant and others for your wonderful replies and for putting things in order! It is SO much easier to read them annotated by points instead of the clumsy, ambiguous text present now in the article. Please do feel free to add your information, and possibly rewrite those sections. I still do not understand:
- why cartoons use such a low rate,
- why we are using peripheral vision with computer screens (which we sit closer to),
- why a 48 Hz shutter makes a smoother image, when all it does when projecting a 24 fps film is add more dark periods while the shutter is closed.
I've put a re-write tag on the article. Procrastinating@talk2me
- Answers:
-
- Cartoons: Remember, every frame needed to be drawn, so doubling the frame rate (roughly) doubled the labor costs, at least in the days before computer animation. 12 frames/second was "good enough" to be an acceptable trade-off between the cost and the visual appearance of the finished product. Nowadays, computers can certainly tween between the hand-drawn animation frames.
-
- By sitting closer to the computer screen, portions of the screen fall into your "peripheral vision" rather than your central (foveal) vision. The parts of the screen that fall into your peripheral vision are seen to flicker more than the parts observed by your central vision. I've always assumed this was because, in your retina, the rod cells (more common at the periphery of your retina) are more sensitive to flicker than the cone cells (more common in your central-vision area).
-
- 48/72 Hz is actually seen as two/three flashes per frame, so it doesn't stimulate your "flicker" response as much, and your brain doesn't really notice that it's the same image two or three times in a row. Meanwhile, modern projectors have enough brightness that they can afford to waste the light that doesn't come through during those periods when the shutter is closed.
- Atlant 14:15, 12 June 2006 (UTC)
- Meanwhile, modern projectors have enough brightness that they can afford to waste the light that doesn't come through during those periods when the shutter is closed.
-
- Why don't they just leave each frame on for a longer percentage of time? — Omegatron 19:13, 12 December 2006 (UTC)
-
- Nevermind. It's because we're talking about film. :-) — Omegatron 19:14, 12 December 2006 (UTC)
CRT vs. LCD
Article states:
"LCD flat panels do not seem to flicker at all as the backlight of the screen operates at a very high frequency of nearly 200 Hz, and each pixel is changed on a scan rather than briefly turning on and then off as in CRT displays."
This is completely backwards. CRT monitors refresh in a scan, while LCD monitors' images change by turning each pixel on and off rapidly, changing the color and brightness accordingly. Draknfyre 01:11, 16 July 2007 (UTC)
- Actually, it works a little differently than either explanation. The backlight and the LCD are completely separate systems in an LCD display, and not all LCDs flicker at all (some are backlit by non-flickering continuous light sources such as incandescent bulbs or LEDs, and some have no backlights at all, being lit by ambient light). The article is mostly correct insofar as MOST LCDs are backlit by high-frequency fluorescent lamps. As the backlight just makes light, it has absolutely nothing whatsoever to do with the images being displayed and is never adjusted for any reason other than total display brightness.
- As for the LCD (which is solely responsible for the images you see), what it does to display images is act as a shutter, adjusting the brightness (i.e. luminance, luma) of the light passing through it. LCDs are incapable of adjusting color (i.e. chrominance, chroma) at all (this would require them to flicker trillions of times a second!), and it is for this reason that "color" LCDs actually operate by placing a tinted sheet of something transparent (usually red, green, or blue, and also incapable of altering its color for obvious reasons) in front of or behind each liquid crystal shutter.
- Aside from color, I don't recall the shutters adjusting opacity by flickering, but by actually varying their opacity. I could be wrong on that last count.
- This whole article (and the related articles on framerate and video/film) is riddled with moronic nonsense though, and I'm definitely going to rewrite it whenever I find the time to look up some good material for the poor thing. 207.177.231.9 16:09, 31 July 2007 (UTC)
Physiological Effects
I personally have a high flicker fusion threshold and there is evidence that this has always been so (as a child I would immediately become irritable when taken into environments lit only by fluorescent tubes, such as supermarkets). More recently I used to go round offices telling people the frequencies at which CRTs were refreshing; I was always correct up to 85 Hz, at which point my accuracy dropped. I can see flicker at 100 Hz: fluorescent tubes, sodium lighting and 60 W bulbs, and often 100 W bulbs. Some environments I can't stand (these combined with multiple repetitive sound sources); I have Stemetil (prochlorperazine maleate), which seems to help, as does having naps. The effects are worst in winter, as there is little natural light, making exposure to artificial sources almost continuous. The flicker causes tiredness, so sufferers may believe they suffer from SAD. I have had problems finding information on the topic and would like to know more; any help would be welcome.
Charleskenyon (talk) 13:08, 19 November 2007 (UTC)
For all people, I believe signals of up to something like 150 Hz do get down the optic nerve and can be seen superimposed on the brain's EEG readings. It's just that conscious awareness of this varies from person to person. In my case I can't actually see a 75 Hz CRT as an actual flicker, but I can "feel" it: I get the feeling that the display is sort of unstable or ephemeral, whereas 85 Hz looks rock solid, the same as a backlit slide (unless I'm tired, when I can sense up to about 100 Hz).
What you should do is use compact fluorescents; they typically operate around 30 kHz, which is imperceptible, and the hum is also inaudible, being above human hearing. Quite a few people get headaches from 100 Hz fluorescents, though, and I personally would not use a monitor below 85 Hz. I've not met anyone who could perceive the flicker of an incandescent bulb before, but they can certainly be used as very weak strobe lighting in certain situations, so they do flicker a little, though it's a bright/less-bright thing, not an on/off thing. Samatarou (talk) 20:15, 24 November 2007 (UTC)
There appears to be a discrepancy between this article and the article on Rod Cells.
Here we have the claim:
“the rod cells of the human eye have a faster response time than the cone cells, so flicker can be sensed in peripheral vision at higher frequencies than in foveal vision.”
But in the rod cell’s article we have:
“Rod cells also respond more slowly to light than cones do, so stimuli they receive are added over about 100 milliseconds. While this makes rods more sensitive to smaller amounts of light, it also means that their ability to sense temporal changes, such as quickly changing images, is less accurate than that of cones.”
I believe the latter is correct. —Preceding unsigned comment added by 87.194.193.70 (talk) 10:22, 13 February 2008 (UTC)
Link
The link to "The Flicker Fusion Factor Why we can't drive safely at high speed" should be removed. The article is inaccurate and does not contribute to this article. 72.87.188.240 (talk) 23:37, 23 February 2008 (UTC)
Stroboscopic Effect versus Flicker Fusion Threshold
I agree that it is important to distinguish the stroboscopic effect from the flicker fusion threshold. When an object is in motion, it is much easier to determine that it's non-continuous. The rainbow effect of DLP projectors is an excellent example: 6X colorwheels result in the equivalent of a color sweep of about 360 Hz, yet some people are still able to detect it -- NOT because of the flicker fusion threshold, but because of the stroboscopic effect. For example, it is possible to detect 500 Hz and even 1000 Hz flicker if it is zooming past you: a 1 kHz strobe light moving at 1000 inches per second across your field of vision leaves a dotted trail of flashes 1 inch apart. Photography agrees with this; even when you use a slow shutter that would otherwise not detect this flicker, moving high-speed-flashing objects show up as a dotted line rather than a continuously blurred line. Some scientific references need to be added about this 'indirect' method of determining flicker. Mdrejhon (talk) 18:36, 22 April 2008 (UTC)
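The dotted-trail arithmetic from the example above, as a small Python sketch (the numbers are the ones used in the comment):

    # Spatial spacing of the flashes left by a moving, flashing light source.
    def dot_spacing_inches(speed_inches_per_s, flash_rate_hz):
        return speed_inches_per_s / flash_rate_hz

    print(dot_spacing_inches(1000, 1000))   # 1.0 inch between flashes: a 1 kHz strobe at 1000 in/s
    print(dot_spacing_inches(1000, 500))    # 2.0 inches at 500 Hz, so the trail is even easier to resolve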

