February 9th 04, 08:43 PM
Steve Nosko
 

"Avery Fineman" wrote in message
...
In article , Paul Keinanen
writes:
On 05 Feb 2004 17:35:31 GMT, (Avery Fineman)
wrote:
Signal to noise ratio changes as the _square_root_ of bandwidth
change.

[...snip...]


My apologies for not stating a reference. Was thinking in terms of
voltage, a bad habit picked up with vacuum-state electronics many
moons ago. [...snip...]


Whew! Thanks... I'm glad to see this post. I was about to post a
question about this, but you cleared it up, Avery (or is it Paul... or,
er, Len...)
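
For anyone keeping score on the power-vs.-voltage bookkeeping, here's a
rough back-of-the-envelope sketch in Python (the signal level and the
50 ohm impedance are made-up numbers just for illustration; the only
real physics in it is thermal noise power N = kTB):

import math

k = 1.38e-23          # Boltzmann's constant, J/K
T = 290.0             # reference temperature, K
S = 1e-12             # assumed signal power in watts (made up for the example)

for B in (6e3, 3e3):  # compare a 6 kHz and a 3 kHz bandwidth
    N = k * T * B                    # noise POWER in watts
    snr_db = 10 * math.log10(S / N)  # S/N as a power ratio, in dB
    v_noise = math.sqrt(N * 50)      # noise VOLTAGE across an assumed 50 ohms
    print(f"B = {B/1e3:.0f} kHz: S/N = {snr_db:.1f} dB, "
          f"noise voltage = {v_noise*1e9:.1f} nV")

Halving the bandwidth halves the noise power (a 3 dB improvement in
S/N), which is the same thing as knocking the noise voltage down by
sqrt(2); that's where the "square root of bandwidth" wording comes from
when you think in voltage terms.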

Digital TV has an almost infinite S:N ratio down to the lower limit of
its processing capability, none of the noise "snow" effects seen with
early days of analog TV and distant TV stations "down in the mud."


If it is like digital cellular, it will have really good noise down to
some level _at which_ an analog signal would still be watchable (with
noise), then just DIE completely - where the analog signal would me
watchable depending upon the viewer's noise tolerance.
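
To put a rough number on that cliff, here's a toy Python comparison.
The digital side is assumed to be plain uncoded QPSK (real DTV and
cellular use heavy error-correction coding, which makes the cliff even
sharper and moves it around), and the "analog" column is just a
reminder that snow gets gradually worse rather than failing at a
threshold:

import math

def qpsk_ber(ebno_db):
    # Bit error rate of uncoded QPSK: BER = 0.5 * erfc(sqrt(Eb/N0))
    ebno = 10 ** (ebno_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebno))

for snr_db in (4, 6, 8, 10, 12, 14):
    ber = qpsk_ber(snr_db)
    print(f"Eb/N0 = {snr_db:2d} dB: digital BER ~ {ber:.1e}  |  "
          f"analog: progressively less snow, viewer decides")

The digital error rate swings from hopeless to essentially perfect over
only a few dB, while across that same range the analog picture just
gets gradually cleaner; that's exactly the "watchable with noise"
vs. "just DIES" behavior above.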
--
Steve N, K,9;d, c. i My email has no u's.