A sequence of 8 bits (enough to represent one character of alphanumeric data) processed as a single unit of information
A sequence of adjacent bits (binary digits) that can be operated on as a unit by a computer; the smallest usable machine word; nearly always eight bits, which can represent an integer from 0 to 255 or a single character of text.
A unit of computing storage equal to eight bits
The word "hello" fits into five bytes of ASCII code.
Origin: Expansion of bit, coined by Dr. Werner Buchholz in July 1956, during the early design phase for the IBM Stretch computer.
The byte is a unit of digital information in computing and telecommunications that most commonly consists of eight bits. Historically, the byte was the number of bits used to encode a single character of text in a computer and for this reason it is the smallest addressable unit of memory in many computer architectures. The size of the byte has historically been hardware dependent and no definitive standards existed that mandated the size. The de facto standard of eight bits is a convenient power of two permitting the values 0 through 255 for one byte. The international standard ISO/IEC 80000-13 codified this common meaning. Many types of applications use information representable in eight or fewer bits and processor designers optimize for this common usage. The popularity of major commercial computing architectures has aided in the ubiquitous acceptance of the 8-bit size. The unit octet was defined to explicitly denote a sequence of 8 bits because of the ambiguity associated at the time with the byte.
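The following is a minimal Python sketch (not part of the original entry) illustrating the two points above: an 8-bit byte can take the values 0 through 255, and a short word such as "hello" occupies one byte per character when encoded as ASCII.

    # Illustrative sketch: 8-bit value range and per-character ASCII encoding.

    # An 8-bit byte can hold 2**8 = 256 distinct values, i.e. the integers 0..255.
    bits_per_byte = 8
    value_count = 2 ** bits_per_byte
    print(value_count)       # 256
    print(value_count - 1)   # 255, the largest value one byte can hold

    # Encoding the word "hello" as ASCII yields one byte per character: five bytes.
    word = "hello"
    encoded = word.encode("ascii")
    print(len(encoded))      # 5
    print(list(encoded))     # [104, 101, 108, 108, 111], each in the range 0..255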
The New Hacker's Dictionary
[techspeak] A unit of memory or data equal to the amount used to represent one character; on modern architectures this is invariably 8 bits. Some older architectures used byte for quantities of 6, 7, or (especially) 9 bits, and the PDP-10 supported bytes that were actually bitfields of 1 to 36 bits! These usages are now obsolete, killed off by universal adoption of power-of-2 word sizes.

Historical note: The term was coined by Werner Buchholz in 1956 during the early design phase for the IBM Stretch computer; originally it was described as 1 to 6 bits (typical I/O equipment of the period used 6-bit chunks of information). The move to an 8-bit byte happened in late 1956, and this size was later adopted and promulgated as a standard by the System/360. The word was coined by mutating the word ‘bite’ so it would not be accidentally misspelled as bit. See also nybble.
Translations for byte
- Greek: δυφιοοκτάδα, μπάιτ, δυφιοσυλλαβή
- Esperanto: bajto, bitoko
- Icelandic: bæti, tölvustafur
- Navajo: łaʼííłké
- Portuguese: byte, baite
- Russian: байт, октет
- Serbo-Croatian: бајт, bajt