From Wikipedia, the free encyclopedia
The gigabyte (/ˈɡɪɡəbaɪt/ GHIG-ə-byte or /ˈdʒɪɡəbaɪt/^{[1]}) is a multiple of the unit byte for digital information.
The prefix giga means 10^{9} in the International System of Units (SI); therefore, in this context, 1 gigabyte is 1,000,000,000 bytes. The unit symbol for the gigabyte is GB.
Historically, the term has also been used in some fields of computer science and information technology to denote the gibibyte, or 1,073,741,824 (1024^{3} or 2^{30}) bytes. For instance, the memory standards of JEDEC, a semiconductor trade and engineering society, define memory sizes in this way.
The usage of the unit gigabyte continues to depend on the context of usage. When referring to disk capacities, it usually means 10^{9} bytes, often stated explicitly on the manufacturer's product labels. This also applies to data transmission over telecommunication circuits, as the telecommunications and computer networking industries have always used the SI prefixes with their standards-based meaning. When referring to RAM sizes it most often has a binary interpretation of 1024^{3} bytes, i.e. as an alias for gibibyte. File systems and software often list file sizes or free space in some mixture of SI units and binary units; they sometimes use SI prefixes to refer to binary interpretation – that is, using a label of gigabyte or GB for a number computed in terms of gibibytes (GiB), continuing the confusion.
In order to eliminate the ambiguity, the International Electrotechnical Commission has standardized the use of the term gibibyte for the binary definition. This position is endorsed by other standards organizations including the IEEE, the International Committee for Weights and Measures (CIPM) and the U.S. National Institute of Standards and Technology (NIST), but the binary prefixes have seen limited acceptance. The JEDEC industry consortium continues to recommend the IEEE 100 nomenclature of using the metric prefixes kilo, mega and giga in their binary interpretation for memory manufacturing designations.
The term "gigabyte" is commonly used to mean either 1000^{3} bytes or 1024^{3} bytes. The latter usage originated as compromise technical jargon for byte multiples that needed to be expressed in powers of 2 but lacked a convenient name. As 1024 (2^{10}) is approximately 1000 (10^{3}), roughly corresponding to the SI multiple, it was used for binary multiples as well. In 1998 the International Electrotechnical Commission (IEC) published standards for binary prefixes, requiring that gigabyte strictly denote 1000^{3} bytes and gibibyte denote 1024^{3} bytes. By the end of 2007, the IEC standard had been adopted by the IEEE, EU, and NIST. Nevertheless, the term gigabyte continues to be widely used with the following two different meanings:
Base 10 (decimal): 1 GB = 1,000,000,000 bytes (10^{9}), the definition recommended by the SI and IEC, and used for disk capacities and data transmission rates.
Base 2 (binary): 1 GB = 1,073,741,824 bytes (1024^{3} or 2^{30}), the traditional usage for computer memory and in much software.
Since the early 2000s most consumer hard drive capacities are grouped in certain size classes measured in gigabytes. The exact capacity of a given drive is usually some number above or below the class designation. Although most manufacturers of hard disk drives and flash-memory disk devices^{[3]}^{[4]} define 1 gigabyte as 1,000,000,000 bytes, software like Microsoft Windows reports size in gigabytes by dividing the total capacity in bytes by 1,073,741,824 (2^{30} = 1 gibibyte), while still reporting the result with the symbol "GB". This practice causes confusion, as a hard disk with an advertised capacity of, for example, "400 GB" (meaning 400,000,000,000 bytes) might be reported by the operating system as only "372 GB" (meaning 372 GiB). Other software, like Mac OS X 10.6^{[5]} and some components of the Linux kernel,^{[6]} measures using the decimal units. The JEDEC memory standards use the IEEE 100 nomenclature, which defines a gigabyte as 1,073,741,824 bytes (2^{30} bytes).^{[7]}
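The discrepancy described above is easy to reproduce. The following Python sketch (the function name is illustrative, not taken from any operating system) converts a manufacturer-advertised capacity into the binary-based figure that software reporting in gibibytes would display:

```python
def reported_gib(n_bytes):
    """Truncate a byte count to whole gibibytes (2**30 bytes),
    mimicking how some operating systems report drive sizes."""
    return n_bytes // 2**30

advertised = 400 * 10**9  # a "400 GB" drive: 400,000,000,000 bytes
print(reported_gib(advertised))  # prints 372
```

The drive thus appears to have "lost" about 28 GB, even though the byte count is unchanged; only the divisor differs.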
The difference between units based on decimal and binary prefixes increases as a semi-logarithmic (linear-log) function—for example, the decimal kilobyte value is nearly 98% of the kibibyte, a megabyte is under 96% of a mebibyte, and a gigabyte is just over 93% of a gibibyte value. This means that a 300 GB (279 GiB) hard disk might be indicated variously as 300 GB, 279 GB or 279 GiB, depending on the operating system. As storage sizes increase and larger units are used, these differences become even more pronounced. Some legal challenges have been brought over this confusion, such as a lawsuit against Western Digital.^{[8]}^{[9]} Western Digital settled the challenge and added explicit disclaimers to products that the usable capacity may differ from the advertised capacity.^{[9]}
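The shrinking ratio between decimal and binary prefixes can be verified directly; this short Python loop computes the percentages quoted above:

```python
# Ratio of each SI prefix value to the corresponding binary prefix value.
prefixes = [("kilobyte", "kibibyte"), ("megabyte", "mebibyte"), ("gigabyte", "gibibyte")]
for power, (si_name, iec_name) in enumerate(prefixes, start=1):
    ratio = 1000**power / 1024**power
    print(f"1 {si_name} = {ratio:.1%} of 1 {iec_name}")
# 1 kilobyte = 97.7% of 1 kibibyte
# 1 megabyte = 95.4% of 1 mebibyte
# 1 gigabyte = 93.1% of 1 gibibyte
```

Each step multiplies the gap by another factor of 1000/1024, which is why the discrepancy grows with larger units.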
Because of its physical design, the capacity of modern computer internal memory devices such as DIMM modules is always a multiple of a power of 1024. It is thus convenient to use prefixes denoting powers of 1024, known as binary prefixes, in describing them. For example, a memory capacity of 1,073,741,824 bytes is conveniently expressed as 1 GiB rather than as 1.074 GB. The former specification is, however, almost always quoted as 1 GB when applied to internal memory.
Software allocates memory in varying degrees of granularity as needed to fulfill data structure requirements, and binary multiples are usually not required. Other computer measurements, like storage hardware size, data transfer rates, clock speeds, operations per second, etc., do not depend on an inherent base, and are usually presented in decimal units. For example, the manufacturer of a "300 GB" hard drive is claiming a capacity of 300,000,000,000 bytes, not 300 × 1024^{3} bytes (which would be 322,122,547,200 bytes).
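As a quick check of that arithmetic, comparing the decimal claim against a binary reading of the same "300 GB" label:

```python
decimal_bytes = 300 * 10**9   # 300 GB in the manufacturer's (SI) sense
binary_bytes = 300 * 1024**3  # what 300 "binary gigabytes" would be
print(decimal_bytes)                 # prints 300000000000
print(binary_bytes)                  # prints 322122547200
print(binary_bytes - decimal_bytes)  # prints 22122547200
```

The two interpretations of the same label differ by more than 22 billion bytes, which is the gap at the heart of the labeling disputes described above.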
