From Wikipedia, the free encyclopedia
The megabyte is a multiple of the unit byte for digital information. Its recommended unit symbol is MB, but sometimes MByte is used. The unit prefix mega is a multiplier of 1,000,000 (10^{6}) in the International System of Units (SI).^{[1]} Therefore, one megabyte is one million bytes of information. This definition has been incorporated into the International System of Quantities.
However, in the computer and information technology fields, several other definitions are used that arose for historical reasons of convenience. A common usage has been to designate one megabyte as 1,048,576 bytes (2^{20}), a measurement that conveniently expresses the binary multiples inherent in digital computer memory architectures. However, most standards bodies have deprecated this usage in favor of a set of binary prefixes,^{[2]} in which this measurement is designated by the unit mebibyte (MiB). Less common is the use of megabyte to mean 1000×1024 (1,024,000) bytes, notably in stating the capacity of the "1.44 MB" 3.5-inch floppy disk.^{[2]}
The megabyte is commonly used to measure either 1000^{2} bytes or 1024^{2} bytes. The interpretation of using base 1024 originated as a compromise technical jargon for the byte multiples that needed to be expressed by the powers of 2 but lacked a convenient name. As 1024 (2^{10}) approximates 1000 (10^{3}), roughly corresponding to the SI prefix kilo, it began to be used for binary multiples as well. In 1998 the International Electrotechnical Commission (IEC) proposed standards for binary prefixes requiring the use of megabyte to strictly denote 1000^{2} bytes and mebibyte to denote 1024^{2} bytes. By the end of 2009, the IEC Standard had been adopted by the IEEE, EU, ISO and NIST. Nevertheless, the term megabyte continues to be widely used with different meanings:
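The divergence between these conventions can be made concrete with a short sketch in plain Python (the variable names are illustrative, not standard identifiers):

```python
# Three historical meanings of "megabyte", expressed in bytes.
MB_SI = 10**6           # SI / IEC decimal definition: 1 MB = 1,000,000 bytes
MIB = 2**20             # binary definition, now named mebibyte: 1,048,576 bytes
MB_MIXED = 1000 * 1024  # mixed definition: 1,024,000 bytes (floppy-disk usage)

# The binary unit is about 4.9% larger than the decimal one.
print(f"{MIB / MB_SI:.6f}")  # → 1.048576

# A drive sold as "500 GB" (decimal), reported by an OS in binary gibibytes:
capacity = 500 * 10**9
print(f"{capacity / 2**30:.1f} GiB")  # → 465.7 GiB
```

The gap widens with each prefix step: about 2.4% at kilo/kibi, 4.9% at mega/mebi, and 7.4% at giga/gibi, which is why "missing" disk capacity is a perennial source of user confusion.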
Semiconductor memory doubles in size for each address line added to an integrated circuit package, which favors counts that are powers of two. The capacity of a disk drive is the product of the sector size, number of sectors per track, number of tracks per side, and the number of disk platters in the drive. Changes in any of these factors would not usually double the size. Sector sizes were set as powers of two (most commonly 512 bytes or 4096 bytes) for convenience in processing. It was a natural extension to give the capacity of a disk drive in multiples of the sector size, giving a mix of decimal and binary multiples when expressing total disk capacity.
Depending on compression methods and file format, the amount of content represented by a megabyte of data can vary widely.
The human genome consists of DNA representing 800 MB of data. The parts that differentiate one person from another can be compressed to 4 MB.^{[5]}
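The 800 MB figure can be reproduced with simple arithmetic, assuming a genome of roughly 3.2 billion base pairs encoded at 2 bits per base (both round figures assumed here, not taken from the source):

```python
# Back-of-the-envelope size of an uncompressed human genome.
base_pairs = 3.2e9    # approximate length of the human genome
bits_per_base = 2     # four bases (A, C, G, T) fit in 2 bits each

total_bytes = base_pairs * bits_per_base / 8
print(f"{total_bytes / 10**6:.0f} MB")  # → 800 MB
```

The 4 MB figure for person-to-person differences reflects that any two genomes are nearly identical, so only the variants need to be stored.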
