"running dogg" wrote
If the internet lines
were shut down, that would explain why BBC online was unreachable.
It is more likely that they were simply overwhelmed.
Hey, I was just throwing out a theory. Your idea is better, though: that
ubiquitous gremlin, "net congestion". Net congestion happens all the time
on normal days. If your connection suffers from it, your feed will
abruptly cut off for "buffering", or it will gradually degrade until it
starts sounding like BBs rattling around in a soup can. In the worst case,
the feed simply shuts down and refuses to function at all. Over-the-air
radio doesn't have these problems, of course.
It may not necessarily have been net congestion, although it likely
contributed.
The server may simply have run out of available "sockets" from so many
requests, in which case you won't be able to make any connection at all.
It is somewhat akin to repeatedly dialing a call-in telephone number that
has only 10 lines while 1000 people are calling: everyone can get a
dialtone and dial the number, but the other end doesn't have enough
incoming lines - busy signal.
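The busy-signal analogy can be sketched as a toy simulation. This is just an illustration, not how a real TCP stack works; the numbers (10 lines, 1000 callers all trying at once) are the ones from the analogy above.

```python
def place_calls(lines, callers):
    """Toy model: 'callers' all try at once against a fixed pool of
    'lines' (sockets). Returns (served, busy) counts."""
    in_use = 0
    served = busy = 0
    for _ in range(callers):
        if in_use < lines:
            in_use += 1   # a free line: connection accepted
            served += 1
        else:
            busy += 1     # pool exhausted: busy signal / refused connect
    return served, busy

print(place_calls(10, 1000))  # (10, 990): 990 callers get the busy signal
```

Same thing with sockets: the server's listen queue and descriptor pool are finite, so once they're full, new connection attempts are simply refused no matter how much bandwidth is free.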
The TCP/IP network is the most robust communications network in wide use
today. It was designed to automatically route traffic around "damaged"
fabric. If servers need to be "bomb proof", they are mirrored at
geographically diverse locations, so that if one server takes a hit the
other(s) will survive. That is the idea. Whether the now-"public",
non-military internet practices such mission-critical diversity is
another matter.
Your robust communications network may work fine overall, but if too
many people try to jam onto one site, that area quickly goes down.
The
network is only as robust as the servers which comprise it, and those
servers keep going down in times of greatest need. Radio Australia's
webfeed went down after the tsunami hit, and RA was playing cricket
matches as the horror unfolded.
See my "sockets" explanation above. It would take one hell of a lot of
users attempting a simple "connect" to jam up the internet itself. The
bandwidth required for a single user's browser to request a connection is
smaller than minuscule.
Consider NASA's web site on July 4th with 1 billion requests - a record!
The server survived and people were served. Plenty of internet bandwidth
and robustness. What matters is the server's capability and its subscribed
bandwidth to its internet provider. Once the "packets" reach the internet
backbone it's whoooooosh - light speed, baby!
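Back-of-envelope arithmetic on how tiny a bare connect request is: a TCP SYN is on the order of 60 bytes on the wire (20-byte IP header plus TCP header with options; the exact figure varies). The link below is just an example (a T1 line, 1.544 Mbit/s), not anything from the thread.

```python
# Rough size of one TCP SYN (IP header + TCP header with options).
SYN_BYTES = 60
# Example link: a T1 line. Swap in your own link speed.
T1_BITS_PER_SEC = 1_544_000

syns_per_second = T1_BITS_PER_SEC // (SYN_BYTES * 8)
print(syns_per_second)  # ~3200 connect attempts/sec fit on a single T1
```

So connection attempts barely dent the pipes; it is the server's socket pool and its own subscribed bandwidth that give out first.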