25 Mar 1987 08:08-EST
Never confuse bandwidth with timeliness and utility.
For example, take the highest-density storage medium you can think
of - say, optical storage disks at 10**10 bits per square inch -
put a stack of them in a knapsack, and walk from the east coast to
the west coast. A 14" disk has about pi*(5)**2 = 78.5 square inches
of usable recording area, and you can put about 100 of these disks
into the sack (at 1/2 pound each, that would be a reasonable 50
lbs). At an average speed of 3 mph for 10 hours a day, the trip
takes 100 days, or 100 * 86,400 seconds.
So you have roughly:
(22/7)*25*10**10 = 7.86 * 10**11 bits per disk
100 disks = 7.86 * 10**13 bits
At 3 mph for 10 hrs/day it takes 100 days to go 3000 miles.
100 days at 86,400 seconds/day = 8.64 * 10**6 seconds
So the coast-to-coast data rate for Johnny Appleseed is:
(7.86 * 10**13) / (8.64 * 10**6) = about 9.1 * 10**6 bits/s, or
roughly 9 Mb/s, which is over 160 times the bandwidth you can get
out of an unloaded ARPANET with today's 56 kb/s backbone.
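The arithmetic above is easy to check mechanically. A minimal sketch, using the same assumed figures (10**10 bits per square inch, a 5" usable radius, 100 disks, 3 mph for 10 hours a day over 3000 miles):

```python
from math import pi

# Assumed figures from the text above.
BITS_PER_SQ_INCH = 1e10          # assumed optical-disk areal density
DISK_RADIUS_IN = 5               # usable radius of a 14" platter, per the text
DISKS = 100                      # one 50 lb knapsack at 1/2 lb per disk

bits_per_disk = BITS_PER_SQ_INCH * pi * DISK_RADIUS_IN**2   # ~7.86e11 bits
total_bits = DISKS * bits_per_disk                          # ~7.86e13 bits

MILES = 3000
MPH, HOURS_PER_DAY = 3, 10
days = MILES / (MPH * HOURS_PER_DAY)                        # 100 days
seconds = days * 86_400                                     # ~8.64e6 seconds

rate_bps = total_bits / seconds
print(f"{rate_bps / 1e6:.1f} Mb/s")                         # ~9.1 Mb/s
```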
But few applications can deal with a 100-day transmission/propagation delay.
If people want to pursue this line of reasoning, I suggest that
we invent a new unit of transmission: the Appleseed, measured in
Megabytes per Fortnight. Since a Megabyte per Fortnight is a
Megabyte in 14 days, which works out to close to 1 byte per
second, it is easy to see that the optical disk/Adidas method
yields roughly 1.4 megaAppleseeds.
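The unit conversion can be sketched the same way, dividing exactly rather than rounding an Appleseed to a full byte per second (the ~9.1 Mb/s knapsack rate is taken from the calculation above):

```python
# One Appleseed = 1 megabyte per fortnight; check that it is about
# 1 byte/s, then express the knapsack rate in Appleseeds.
FORTNIGHT_S = 14 * 86_400                  # 1,209,600 seconds
appleseed_Bps = 1e6 / FORTNIGHT_S          # ~0.83 bytes/second

rate_bps = 9.1e6                           # knapsack rate from above, bits/s
rate_Bps = rate_bps / 8                    # ~1.1e6 bytes/s

appleseeds = rate_Bps / appleseed_Bps
print(f"{appleseeds / 1e6:.1f} megaAppleseeds")   # ~1.4
```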