September 23rd 03, 05:36 PM
Michael Ransburg
 
Theoretical Question

Hello!

Can someone point out any mistakes in the following conclusion? I'm
just trying to find out if my understanding of stuff is correct.

If I transmit a signal at a frequency of 2.4 GHz (2 400 000 000 Hz),
the wavelength of that signal is ~12.5 cm (speed of light / frequency
in Hz). Suppose each wave cycle can carry one bit of information. In
one second the signal travels ~300 000 km (the speed of light), so I
get 2 400 000 000 waves per second, which translates to 2 400 000 000
bits per second. This works out to a "raw" throughput of ~2289
MBit/s (binary megabits).
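The arithmetic above can be sketched in a few lines of Python. The one-bit-per-cycle throughput is just the premise of the question restated as code, not a claim about how real radios work:

```python
# Sketch of the arithmetic above: wavelength at 2.4 GHz, and the
# naive "one bit per wave cycle" raw throughput (the assumption
# being questioned, not an actual modulation scheme).
C = 299_792_458           # speed of light in m/s (~300 000 km/s)
freq_hz = 2_400_000_000   # 2.4 GHz

wavelength_m = C / freq_hz           # ~0.125 m, i.e. ~12.5 cm
raw_bits_per_s = freq_hz             # one bit per cycle -> 2.4e9 bit/s
raw_mbit_s = raw_bits_per_s / 2**20  # binary megabits -> ~2289 MBit/s

print(f"wavelength: {wavelength_m * 100:.1f} cm")
print(f"raw throughput: {raw_mbit_s:.0f} MBit/s")
```

Running it reproduces both numbers from the paragraph above: a 12.5 cm wavelength and ~2289 MBit/s.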

This seems a bit high compared to current gear on the market, but I
guess a lot gets lost to error detection/recovery and such. Anyway,
can any of you pros take a quick look at it and point out any
mistakes I made?

cheers
Mike