January 4th 04, 07:56 AM
Roger Halstead
 

On Sat, 03 Jan 2004 10:38:53 -0800, hnmm wrote:


I think the question was a bit different from what the answers I've
been seeing are addressing.

IF I understand the question, he wants to know why a monitor flickers
when it's shown on TV, but not when you actually look at the monitor.

When videotaped, TV sets AND computer monitors will flicker. It's
due to the difference between the scan rate of the monitor and that
of the camera.

Even when they are running at the same frequency, you'll usually see
either a rolling image or a dark bar, because the camera and monitor
scans are not perfectly in sync. If the frequencies match exactly,
the dark bar on the screen will not move; if they differ slightly, it
rolls. It's rare to see a setup where the monitor and camera sync
pulses are ... well ... in sync. When that happens the TV screen
looks normal.
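
To put rough numbers on that rolling bar, here's a little Python
sketch. It's my own illustration, not anything from the original
poster; the 29.97 fps camera rate and 60 Hz refresh are just typical
NTSC-era ballpark figures.

def rolling_bar_period(monitor_hz, camera_fps):
    """Seconds for the dark bar to drift once across the screen.

    The bar cycles at the 'beat' between the monitor's refresh rate
    and the nearest whole multiple of the camera's capture rate.
    """
    k = round(monitor_hz / camera_fps)          # refreshes per captured frame
    beat_hz = abs(monitor_hz - k * camera_fps)  # drift rate of the bar
    return float("inf") if beat_hz < 1e-9 else 1.0 / beat_hz

# Exactly locked: the bar stands still and the screen looks normal.
print(rolling_bar_period(59.94, 29.97))  # inf
# Slightly off: 60 Hz vs 29.97 fps gives a 0.06 Hz beat, ~17 s per roll.
print(rolling_bar_period(60.0, 29.97))   # ~16.7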

The image on a computer monitor or a TV screen consists of a bunch of
nearly horizontal lines. With a TV set the scan starts at the top
and draws every other line, one at a time. When it reaches the
bottom it has completed one field. It then returns to the top and
draws in the lines that were skipped on the first pass; the two
interlaced fields together make up one frame. This noticeably
reduces the perceived flicker.

Computer screens may use the same method, or they may draw the entire
image in one pass (progressive scan). Their much higher sweep rates
let them do so without noticeable flicker.
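
If it helps to see the two draw orders side by side, here's a toy
Python sketch of a 10-line screen. Again, this is my own
illustration; a real NTSC set scans 525 lines at 60 fields, and thus
30 frames, per second.

def interlaced_fields(lines):
    """Two passes: every other line first (one field), then the
    skipped lines (the second field). Two fields make one frame."""
    return list(range(0, lines, 2)), list(range(1, lines, 2))

def progressive_frame(lines):
    """One pass drawing every line in order, as many PC monitors do."""
    return list(range(lines))

first, second = interlaced_fields(10)
print(first)                  # [0, 2, 4, 6, 8]   first field
print(second)                 # [1, 3, 5, 7, 9]   second field
print(progressive_frame(10))  # [0, 1, 2, ..., 9] one full sweep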

As to regular monitor flicker, I only notice it occasionally in my
peripheral vision, never when I'm looking at the screen directly
(peripheral vision is more sensitive to flicker). Part of that is
due to the persistence of the phosphor, but with TV sets and monitors
the persistence is very short. OTOH, viewing a TV screen under
fluorescent lighting can really accentuate the flicker, since the
light itself pulses at twice the power-line frequency.

You should see a moving image on a longer-persistence phosphor. They
smear....
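
A crude way to see why short persistence flickers and long
persistence smears is to model the phosphor as an exponential decay.
This is my own back-of-the-envelope sketch; the time constants are
made-up round numbers, not measured values.

import math

def brightness_left(ms, tau_ms):
    """Fraction of a spot's brightness remaining ms after it is drawn."""
    return math.exp(-ms / tau_ms)

refresh_ms = 1000 / 60  # ~16.7 ms between sweeps at 60 Hz

# Short persistence (tau = 2 ms): the spot is long gone before the
# next sweep, so the eye can catch the gap -- flicker, worst off-axis.
print(brightness_left(refresh_ms, 2.0))   # ~0.0002
# Long persistence (tau = 50 ms): most of the glow is still there, so
# a moving image drags a trail behind it -- the smear described above.
print(brightness_left(refresh_ms, 50.0))  # ~0.72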


Roger Halstead (K8RI & ARRL life member)
(N833R, S# CD-2 Worlds oldest Debonair)
www.rogerhalstead.com
