Orders of magnitude (data)


The order of magnitude of data may be specified in strictly standards-conformant units of information, using decimal multiples of the bit and byte, or using a few multiplier prefixes in their historically common binary interpretation, a usage that was widespread in computing until dedicated binary prefixes were defined in the 1990s.

Units of measure

The byte has been a commonly used unit of measure for much of the information age to refer to a number of bits. In the early days of computing it denoted differing numbers of bits, depending on convention and hardware design, but today it means 8 bits. A more precise, though less common, name for 8 bits is the octet.
Commonly, a decimal SI metric prefix is used with bit and byte to express larger sizes. However, this usage is often inexact, because the prefixes denote powers of 1000, whereas binary hardware sizes are usually powers of 1024. Customarily, each metric prefix, 1000^n, is used to mean a close approximation of the corresponding binary multiple, 1024^n. This distinction is often left implicit, so the use of metric prefixes can lead to confusion. The IEC binary prefixes describe binary sizes exactly, but are not commonly used.
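The growing gap between the decimal and binary interpretations can be sketched as follows; the prefix tables are illustrative, following the SI and IEC 80000-13 multiples of the byte:

```python
# Decimal (SI) multiples of the byte versus binary (IEC) multiples.
si = {"kB": 1000**1, "MB": 1000**2, "GB": 1000**3, "TB": 1000**4}
iec = {"KiB": 1024**1, "MiB": 1024**2, "GiB": 1024**3, "TiB": 1024**4}

for (dn, dv), (bn, bv) in zip(si.items(), iec.items()):
    # The binary multiple exceeds its decimal approximation by a
    # margin that grows with each step: ~2.4%, ~4.9%, ~7.4%, ~10%.
    print(f"{bn} = {bv:>14} bytes; {dn} = {dv:>14} bytes; "
          f"ratio = {bv / dv:.4f}")
```

This is why a drive marketed as "1 TB" (10^12 bytes) reports roughly 0.91 TiB in an operating system that counts in binary multiples.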

Entropy

This page references two kinds of entropy, thermodynamic and information-theoretic, which are related but not entirely equivalent. For comparison, the Avogadro constant, historically defined as the number of atoms in 12 grams of the carbon-12 isotope, is about 6.022 × 10^23 entities per mole. See Entropy in thermodynamics and information theory.
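The two kinds of entropy can be placed on a common scale: one bit of information corresponds to k_B · ln 2 of thermodynamic entropy (the Landauer correspondence). A minimal sketch, using the exact 2019 SI values of the constants; the function name is illustrative:

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K (exact since 2019)
N_A = 6.02214076e23     # Avogadro constant, 1/mol (exact since 2019)

def thermo_entropy_to_bits(S):
    """Convert thermodynamic entropy S (in J/K) to information entropy in bits."""
    return S / (k_B * math.log(2))

# One mole of independent two-state entities at maximum entropy carries
# R * ln(2) J/K of thermodynamic entropy, where R = k_B * N_A is the
# gas constant; on the information scale that is exactly N_A bits.
R = k_B * N_A
mole_in_bits = thermo_entropy_to_bits(R * math.log(2))
print(f"{mole_in_bits:.6e} bits")
```

The conversion factor k_B · ln 2 ≈ 9.57 × 10^-24 J/K per bit is what makes everyday thermodynamic entropies correspond to astronomically large bit counts.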