I can already hear the groans, but I'd like to bring up the Gibit.
As you may know, a year ago the IEC published new prefixes which
recognize that what used to be called, for example, 1 Kbit is really
1024 bits. Some new terms are:

  1 Kibit (kibibit) = 2^10 bits = 1,024 bits
  1 Mibit (mebibit) = 2^20 bits
  1 Gibit (gibibit) = 2^30 bits
  1 Tibit (tebibit) = 2^40 bits

As you know, the difference between the conventional computer usage
and the actual powers of 1000 grows with each step; e.g. 1 Tibit is
about 10% larger than 1 Tbit. Sometimes these differences are
critical. Take, for example, the two data rates that the new "10 Gb"
Ethernet PAR is addressing. There is real potential for confusion
here. Besides, since many of our standards end up in IEC, it seems to
me a real possibility that IEC will some day want us to conform.
[By the way, IEC doesn't recognize the abbreviation "b" for bit; they
just use the word "bit".]
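
Just to put numbers on "grows with each step", here is a quick
back-of-the-envelope sketch (plain Python, purely illustrative, not tied
to any 802 document) of how far the binary prefixes drift from the
decimal ones:

    # How far apart the binary (IEC) and decimal (SI) prefixes drift
    # at each step.
    prefixes = [
        ("kilo/kibi", 10**3,  2**10),
        ("mega/mebi", 10**6,  2**20),
        ("giga/gibi", 10**9,  2**30),
        ("tera/tebi", 10**12, 2**40),
    ]

    for name, decimal, binary in prefixes:
        excess = (binary / decimal - 1) * 100
        print(f"{name}: binary is {excess:.1f}% larger than decimal")

    # Prints roughly: kilo/kibi 2.4%, mega/mebi 4.9%, giga/gibi 7.4%,
    # tera/tebi 10.0% -- the gap widens by about 2.5 percentage points
    # with each prefix step.
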
And don't let the pronunciation bother you; I just pronounce "10
Gibit/s" as if it were "10 Gibits". A lot easier than "10 Gigabits
So I think we need to start thinking about migrating from Gb/s to
Gibit/s within 802.
P.S. Here are some references:
(1) IEEE Standards supports the new definitions and thinks there are
significant inconsistencies with the conventional usage. For example,
did you know that a 1 GB disk drive has 1 billion bytes, not
1024*1024*1024 bytes? See:
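(For reference: a decimal gigabyte is 10^9 = 1,000,000,000 bytes, while
2^30 = 1,073,741,824 bytes, so the binary "gigabyte" is about 7.4%
larger than the one on the drive's spec sheet.)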
(2) NIST provides an excellent summary and a reference to the IEEE
(3) A few months ago, some of my comments on this topic were
included in Jeffrey Harrow's web-based column "The Rapidly Changing
Face of Computing." Part of my note became the title of the article
"Een schijf van 10 tebibyte" in the Science section of the Dutch
daily newspaper _Reformatorisch Dagblad_ (May 4, 1999). It's a good
article <http://www.rdnet.nl/weet/990504weet02.html>, if you read
Dutch.