flicker fusion rate - Computer Definition
Also called "flicker fusion frequency," it is the rate at which successive frames must be displayed for movie film and video to appear as smooth, flicker-free motion. Early movies were typically shot at 16 frames per second (fps), and the flicker was very noticeable. Today's movies are typically shot at 24 fps, but theater projectors double the display rate to 48 Hz by showing each frame twice. Broadcast TV in the U.S. is shot at 60 half frames per second.

Lighting and Action Contribute
The brighter the viewing environment, the higher the frequency required to eliminate flicker. Movie theaters are dark, and the projector's doubling to 48 Hz is sufficient. TV is often viewed in lit rooms, which is why 60 half frames per second are required (see NTSC). The highest HDTV rate is 60 full frames per second (see HDTV). The speed of movement within the frames also contributes to flicker and judder (shaking). Directors plan high-speed action scenes carefully; when they must pan across the field of view, they often keep the background out of focus to eliminate artifacts.
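The arithmetic behind frame doubling can be sketched in a few lines of Python; the function name `display_rate_hz` is an illustration of the idea, not part of any standard API:

```python
def display_rate_hz(capture_fps: float, repeats_per_frame: int = 1) -> float:
    """Effective on-screen refresh rate when each captured frame
    is flashed repeats_per_frame times, as a double-shuttered
    projector does with 24 fps film."""
    return capture_fps * repeats_per_frame

# 24 fps film, each frame shown twice -> 48 Hz on screen
print(display_rate_hz(24, 2))   # 48
# Early 16 fps film with no doubling stays at 16 Hz -> visible flicker
print(display_rate_hz(16, 1))   # 16
```

The film itself still advances at 24 fps; only the shutter flashes each frame twice, so motion resolution is unchanged while the flicker rate rises above the fusion threshold for a dark theater.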