- From: David Woolley <david@djwhome.demon.co.uk>
- Date: Fri, 27 Jul 2001 08:09:15 +0100 (BST)
- To: w3c-wai-ig@w3.org
> > Interesting, the 20Hz example [2] seems to be "going faster" than the
> > 59hz - at least to the eyes and brains of a few folks around here.

You cannot simulate high flicker rates accurately on a monitor; you will get
a strobe effect with a net rate that is the difference between the frame rate
and the attempted flicker rate.

> I also checked my hardware and software display driver properties. It's
> currently set at 60Hz. which seems very close to 55 to me.

It appears that 50 is the UK mains frequency and TV frame rate, and 60 is the
US one. These were chosen to be around the lowest at which flicker was not
objectionable (ignoring epilepsy triggering), and the flicker in question was
from electric lighting, not CRT displays. TV and film actually use half these
rates, but film flashes the same image twice and TV sends half the scan lines
at a time to increase the flicker frequency. High-resolution modes on older
computer monitors also use the TV trick (interlacing) and will flicker badly
if alternate lines are dark and light (something that I've seen at least twice
on real sites).

My guess is that the reason for quoting figures as high as 55 or 59 Hz for
flicker rates is not that these rates are a problem in themselves, but that
when strobed against typical frame rates they produce difference frequencies
in the 5 to 10 Hz range.
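
[Editor's sketch, not part of the original message: the difference-frequency
arithmetic above can be illustrated with a few lines of Python. The refresh
rates and flicker rates below are assumed, illustrative values.]

```python
# Rough sketch of the strobe (beat) effect described above: a flicker drawn
# at `attempted_hz` on a monitor refreshing at `refresh_hz` is sampled once
# per frame, so the visible strobe rate is roughly the difference of the two.

def apparent_strobe_hz(attempted_hz: float, refresh_hz: float) -> float:
    """Approximate rate of the strobe effect seen on screen."""
    return abs(refresh_hz - attempted_hz)

if __name__ == "__main__":
    for refresh_hz in (60.0, 65.0, 70.0):     # assumed typical CRT refresh rates
        for attempted_hz in (55.0, 59.0):     # upper-bound figures mentioned above
            beat = apparent_strobe_hz(attempted_hz, refresh_hz)
            print(f"{attempted_hz:.0f} Hz flicker on a {refresh_hz:.0f} Hz "
                  f"display -> ~{beat:.0f} Hz strobe")
```

With these assumed values, several of the combinations land in the 5 to 10 Hz
range the post refers to.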
Received on Friday, 27 July 2001 04:10:08 UTC