Tait Cyrus (firstname.lastname@example.org)
1 Apr 88 20:29:04 GMT
I am looking for an equation that roughly computes the theoretical
load on a network given only the following information:
1) time over which the load is to be computed,
2) the number of packets seen in that time,
3) the number of bytes seen in that time.
Currently I am using something like:
num_bytes / (MBYTES_PER_SEC * time)
where I have arbitrarily picked MBYTES_PER_SEC to be 1 MByte/sec.
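As a sanity check, here is that estimate written out as a small Python sketch. The function name and the 1 MByte/sec constant are just placeholders for the idea above, not a definitive value:

```python
# Rough network-load estimate: bytes seen vs. bytes the channel
# could have carried in the same interval.
MBYTES_PER_SEC = 1.0  # assumed usable capacity (the "magic number")

def network_load(num_bytes, seconds, capacity_mbytes=MBYTES_PER_SEC):
    """Return fractional load over the interval (1.0 = fully loaded)."""
    return num_bytes / (capacity_mbytes * 1e6 * seconds)

# Example: 500,000 bytes observed in one second -> 0.5 (50% load)
print(network_load(500_000, 1.0))
```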
Given a 10 MBit (1.25 MByte) channel, subtracting overhead for preambles,
CRC bytes, collisions & the minimum time between packets, I came up
with the VERY approximate figure of 1 MByte/sec.
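For what it's worth, a sketch of how the per-packet overheads eat into the raw 1.25 MByte/sec. The constants below are my assumptions from the standard 10 Mbit Ethernet figures (8-byte preamble, 4-byte CRC, 9.6 microsecond interframe gap = 12 byte times), and collisions are not modeled at all:

```python
# Per-packet overheads on 10 Mbit Ethernet (assumed standard values):
PREAMBLE = 8   # preamble + start delimiter, bytes
CRC = 4        # frame check sequence, bytes
IFG = 12       # 9.6 us interframe gap = 12 byte times at 10 Mbit/s
RAW_BYTES_PER_SEC = 10e6 / 8  # 1.25 MBytes/sec raw channel rate

def effective_mbytes_per_sec(frame_bytes):
    """Usable rate (MBytes/sec) if every frame carries frame_bytes
    of counted data; overhead bytes ride along for free on the wire."""
    on_wire = frame_bytes + PREAMBLE + CRC + IFG
    return RAW_BYTES_PER_SEC * frame_bytes / on_wire / 1e6
```

With these numbers, 1500-byte frames give roughly 1.23 MBytes/sec, while 100-byte frames give close to 1.0 MBytes/sec, so the 1 MByte/sec figure is in the right ballpark for smallish average packets.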
I know that 1 Meg is a VERY magic number, because there are many
different media that the data can be traveling across, which will
affect this magic number.
So, back to the question. Is this a good approximation? What
values for MBYTES_PER_SEC should be used for the various media,
e.g. thick wire, thin wire, broadband, fiber, etc.?
I don't want to worry about distances between stations and the
media between them, just a rough approximation.
I have looked at the Blue Book to try to find a "nice" method
for computing network loads, but I am having a hard time trying
to come up with a better equation. Any help anyone could supply
(as far as clarifying what the Blue Book is saying and
how to use that information) would be greatly appreciated.
--
W. Tait Cyrus (505) 277-0806
University of New Mexico
Dept of Electrical & Computer Engineering
Parallel Processing Research Group (PPRG)
UNM/LANL Hypercube Project
Albuquerque, New Mexico 87131
e-mail: email@example.com
This archive was generated by hypermail 2.0b3 on Thu Mar 09 2000 - 14:41:54 GMT