Byte meaning

bīt
A set of bits constituting the smallest unit of addressable memory in a given computer, typically eight bits.
noun
A unit of data equal to eight bits. Computer memory is often expressed in megabytes or gigabytes.
noun
The basic unit of storage capacity in a computer system.
noun
A sequence of adjacent bits operated on as a unit by a computer. A byte usually consists of eight bits. Amounts of computer memory are often expressed in terms of megabytes (1,048,576 bytes) or gigabytes (1,073,741,824 bytes).
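The binary values quoted above (1,048,576 and 1,073,741,824) are powers of two. A minimal Python sketch, for illustration only, verifying the arithmetic:

```python
# Binary interpretation of the storage units in the definition above.
# All are powers of two, not powers of ten.
KILOBYTE = 2 ** 10   # 1,024 bytes
MEGABYTE = 2 ** 20   # 1,048,576 bytes
GIGABYTE = 2 ** 30   # 1,073,741,824 bytes

print(MEGABYTE)   # 1048576
print(GIGABYTE)   # 1073741824
```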
(BinarY TablE) The common unit of computer storage, from desktop computer to mainframe. It is made up of eight binary digits (bits). A ninth bit may be used in the memory (RAM) circuits as a parity bit for error checking. See parity checking.

A byte holds one alphabetic character such as the letter A, a dollar sign or a decimal point. For numeric data, one byte holds one decimal digit (0-9), two "packed decimal" digits (00-99) or a binary number from 0 to 255. See space/time.

From Bite to Byte
IBM coined the term in the mid-1950s to mean the smallest addressable group of bits in a computer, which was originally not eight. The first spelling of the word was "bite," but the y was added to avoid confusion between "bit" and "bite."

Byte Specifications
Drives and memory (RAM) are rated in bytes. For example, a 512-gigabyte (512GB) drive stores 512 billion characters of program instructions and data permanently, while eight gigabytes (8GB) of RAM holds eight billion temporarily. The first hard drives in early personal computers held 5MB, and RAM was 64K. See memory and file size.
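The claim that one byte holds one character, or a binary number from 0 to 255, follows from its eight bits. A short Python sketch, for illustration only:

```python
# Eight bits give 2**8 = 256 distinct values, so one byte can hold
# a binary number from 0 to 255.
values_per_byte = 2 ** 8
assert values_per_byte == 256

# A single byte holding the alphabetic character "A" (code 65).
b = bytes([65])
print(b)                  # b'A'
print(b.decode("ascii"))  # A

# The largest number one byte can represent.
print(values_per_byte - 1)  # 255
```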
The definition of a byte is the basic memory unit on a computer, usually made up of a string of eight binary digits.

An example of a byte is a combination of bits used in computer coding to represent a letter in an alphabet.

noun
A character in a computer coding scheme.

A byte is a unique set of adjacent bits with a unique meaning in a computer coding scheme. A byte generally comprises eight (8) bits that represent a letter in an alphabet (e.g., a, A, z, or Z), a diacritical mark (e.g., ~ or `), a single-digit number (e.g., 0, 1, 2, or 3), a punctuation mark (e.g., ,, ., or !), or a control character (e.g., paragraph break, page break, or carriage return).

The size of a byte is specific to the coding scheme involved. For example, ASCII employs a coding scheme of seven (7) bits, so a byte is actually seven information bits, although the addition of a parity bit for error control results in a byte of eight (8) bits for storage and transmission purposes. An EBCDIC byte is truly eight (8) information bits. Unicode variously employs eight (8) and 16 bits. Computers create, store, manage, and output information in bytes.

The origin of the term is uncertain, although it is certain that Dr. Werner Buchholz originated the term byte in 1956 when working for IBM on the STRETCH computer. Some suggest that byte is an alteration and contraction of bit, referring to the basic unit of information, and bite, referring to a morsel or chunk of data consumable by the early eight-bit processors. Others suggest byte is an acronym formed from binary digit eight. Still others suggest that byte is short for binary term.

Computing and storage systems measure memory and storage capacities in bytes. For example, a kB (kiloByte) is actually 1,024 (2^10) bytes, since the measurement is based on a base 2, or binary, number system. The term kB comes from the fact that 1,024 is nominally, or approximately, 1,000. So, 64 kB of memory is actually 65,536 (64 × 1,024) bytes.
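The 7-bit ASCII range, the parity bit, and the binary kilobyte described above can be checked directly. A minimal Python sketch, for illustration only (the parity helper is a hypothetical name, not part of any standard library):

```python
# Every ASCII code fits in 7 information bits, i.e. below 2**7 = 128.
assert all(ord(c) < 2 ** 7 for c in "Hello, byte!")

def even_parity_bit(code: int) -> int:
    """Parity bit that makes the total count of 1-bits even."""
    return bin(code).count("1") % 2

# 'A' is 65 = 0b1000001, which has two 1-bits, so its parity bit is 0.
print(even_parity_bit(ord("A")))  # 0

# A kilobyte in the binary sense: 2**10 = 1,024 bytes.
print(2 ** 10)  # 1024
```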
A string of binary digits (bits), usually eight, operated on as a basic unit by a digital computer.
noun

Origin of byte

  • Alteration and blend of bit and bite

    From American Heritage Dictionary of the English Language, 5th Edition