Sailor, I could be mistaken but that’s the way I learned it lo these many years ago. I offer in support the following website (http://medic.bgu.ac.il/comp/course/defs/byte.html). Unfortunately, I can’t vouch for its accuracy but it does seem to support my original statement.
Nope, a byte can be an arbitrary number of bits, but it is almost always eight these days. See the Jargon File entry.
That site supports what I said. It says the size originally varied, but that usage is now obsolete and a byte has been 8 bits for some time now. Read it carefully.
>> The move to an 8-bit byte happened in late 1956, and this size was later adopted and promulgated as a standard
What year are we in?
BTW, I found a whole course about computers online. Here's the page about floppies:
http://viking.delmar.edu/courses/Cis312J/EBOOK/wrh11.htm
Very interesting and good reference material (all of it).
AFAIK, the size of a byte is arbitrary, and is determined by the specific hardware. For example, in modem communications, if you use a parity bit, the “byte” size is 9 bits. The byte size is completely arbitrary, but is most often 8 bits.
OK, I can find you a million gazillion documents online where a byte is used to mean 8 bits. Can you find me 8 or 10 where it is used to mean something else? I’d like to see them.
Every source I find credits the IBM guy for establishing the meaning of byte to be 8 bits in 1956.
The fact that with error checking you need 9 bits to transmit 8 bits of information does not change that. The information is 8 bits, not 9. With compression it would be even less. A byte is still 8 bits until you can show me it is commonly used in any other sense.
I think it’s quite safe to say that all modern computers use 8-bit bytes, and for all practical purposes a byte contains 8 bits of information. That’s not to say it’s necessarily encoded in 8 bits, though. When you’re calculating bandwidth in bytes, it’s the size of the encoding that matters, and that’s why I said that bytes can contain a variable number of bits. To send 8 bits’ worth of information (i.e. a byte), a modem has to send somewhere between 8 and 13 bits down the wire (those numbers might not be correct) depending on its settings (e.g. 8N1, 7E1, etc.). This is ignoring overhead from higher protocols, since the relative amount of overhead will vary even within a single connection.
The bottom line is that the only straightforward and honest way to report the bandwidth of a modem, T1 line, OC-768, or whatever is in bits per second.
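To put rough numbers on that, here is a minimal Python sketch (the 56,000 bps line rate is just an illustrative figure, and plain 8N1 framing is assumed):

```python
# Sketch: effective payload throughput of an async serial link using 8N1 framing.
# With 8N1, each 8-bit byte travels as 10 bits on the wire:
# 1 start bit + 8 data bits + 1 stop bit.
line_rate_bps = 56_000             # example line rate, in bits per second
bits_per_byte_on_wire = 1 + 8 + 1  # start + data + stop

payload_bytes_per_sec = line_rate_bps / bits_per_byte_on_wire
print(f"{payload_bytes_per_sec:.0f} bytes/s of payload")  # 5600, not 56000/8 = 7000
```

So the framing overhead alone costs you 20% of the naive bits-divided-by-eight figure, which is exactly why quoting the raw bit rate is the honest number.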
Right - a byte is defined as the minimum number of bits required to represent a character. In ASCII that’s seven, with modem parity checking it’s nine, etc. If you’re doing math on things, it’s always eight, since designing logic with eight bits per byte is easy.
>> a byte is defined as the minimum number of bits required to represent a character. In ASCII that’s seven, with modem parity checking it’s nine
No sir. You are making that up. Please offer some support. Please show me some sites that would use it that way. I cannot find even ONE. (Not that one would validate the rule).
In Unicode a character needs 16 bits. Show me where that is called a byte. Baudot code uses five bits. Please show me any place that uses 5 bits to mean a byte. A computer does much more than move characters around. It does math operations.
I have always seen “byte” used to mean 8 bits. Every instance I have found on the Net supports that. If you disagree, please show us some support. We already know your opinion, no need to repeat it. Just show us some convincing proof. Please.
Sailor is exactly right. A byte, by definition, is 8 bits. ALWAYS. Unless it’s eventually redefined. The computer world is full of standards, because to deal with somebody (or another computer) you need to use the same language to talk about the same things. This is one of them. A byte is 8 bits. A piece of data sent by a modem may not necessarily be 8 bits, but that’s not a byte. It’s a group of 10 bits used to encode 8 bits of data. Or 1 byte.
The number of bits needed to encode that byte is irrelevant. That doesn’t change the fact that you’re encoding a byte.
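As a sketch of what that 10-bit group looks like (assuming plain 8N1 framing with no parity; the helper name is my own, for illustration only):

```python
def frame_8n1(byte_value: int) -> list[int]:
    """Wrap one 8-bit byte in 8N1 async-serial framing (illustrative sketch).

    The byte itself is still exactly 8 bits; the start and stop bits are
    line overhead, not part of the data.
    """
    if not 0 <= byte_value <= 0xFF:
        raise ValueError("not a byte")
    data_bits = [(byte_value >> i) & 1 for i in range(8)]  # sent LSB first
    return [0] + data_bits + [1]  # start bit (0), 8 data bits, stop bit (1)

bits = frame_8n1(0x41)
print(len(bits), "bits on the wire for 1 byte of data:", bits)
```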
Processors and operating systems are often described as “16-bit,” “32-bit,” etc. How does this nomenclature relate to the definition of a byte as having 8 bits?
This is not at all related to the definition of a byte. It means that the CPU’s registers can hold 16 bits or 32 bits. (Or, in Intel’s new IA64 chip, 64 bits.)
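A quick way to see the distinction (a small Python sketch; reading the native pointer size is just one way to observe the word width, and the output depends on the machine it runs on):

```python
import struct

# A byte is 8 bits no matter what the machine's word size is.
# "32-bit" or "64-bit" describes the width of the CPU's registers/pointers.
pointer_size_bytes = struct.calcsize("P")  # size of a native pointer, in bytes
print(f"native pointer: {pointer_size_bytes} bytes = {pointer_size_bytes * 8} bits")
# Prints "4 bytes = 32 bits" on a 32-bit build, "8 bytes = 64 bits" on a 64-bit build.
```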
If you are interested in learning more about this, I would recommend getting a textbook (no more than 5 or 6 years old) on computer architecture or “computer organization”.
According to this site:
(bold mine)
By that definition, in the context of Unicode, a byte is indeed 16 bits.
According to this:
(bold mine)
And according to this:
And those were the first few results from a Google search for “byte definition.”
I agree that nowadays a byte is usually eight bits, but saying the definition of a byte is eight bits is wrong; the definition of a byte is the number of bits used to represent a character. 99.9% of everything operational now uses ASCII, though, so there is room for confusion. And here’s the Jargon File definition of byte again (the only definition that matters):
Friedo, you are going to have to do better than that to convince me (or others, I think). Your last quote says any other use is obsolete, and the others are vague and support my view more than yours.
You have not produced one single solitary example of the word byte being used to mean anything other than “8 bits”. Can you please provide a real-world example where the word byte was used to mean something else?
In Unicode they say two bytes are required for each character, NOT that a byte is now 16 bits.
Please supply some examples of usage as you describe it. I have searched the net and not found even ONE. And man, you can find stuff on the net today to support almost anything!
If your definition of byte is “however many bits it takes to represent a character” then the expression “multibyte character” is a contradiction in terms. Do a net search and you will find hundreds of instances. Like here.
quote: Character sets used in the early days of computing had only six, seven, or eight bits for each character: there was never a case where more than eight bits (one byte) were used to represent a single character …
The simplest character sets are one-byte character sets…
The ISO 2022 standard defines a mechanism for extended character sets where one character can be represented by more than one byte…
Any search engine will turn up thousands of examples of usage that support what I say. You have not supplied even ONE.
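To make “multibyte character” concrete, here is a small Python sketch (the encodings and sample characters are my own choices, purely for illustration):

```python
# A "multibyte character" is a single character that needs more than one
# 8-bit byte to encode. The byte itself is still 8 bits either way.
for ch in ("A", "é", "€"):
    utf8 = ch.encode("utf-8")
    utf16 = ch.encode("utf-16-be")
    print(f"{ch!r}: {len(utf8)} byte(s) in UTF-8, {len(utf16)} byte(s) in UTF-16")
# 'A' fits in 1 byte in UTF-8; 'é' takes 2 and '€' takes 3, yet each is one character.
```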
You’re right, I can’t find any current usage of the word “byte” that means anything other than eight bits. But I still maintain that it is incorrect to offer the definition of byte as “eight bits.” The fact that other definitions are obsolete does not make them incorrect, and they should not be excluded.
>> The fact that other definitions are obsolete does not make them incorrect
No, it just makes them … obsolete! Obsolete means they were correct usage in the past and are no longer used that way any more.
Here’s one that is not obsolete:
Main Entry: pig·head·ed
Pronunciation: 'pig-"he-d&d
Function: adjective
Date: 1620
: willfully or perversely unyielding : OBSTINATE
Bxxxt!
That one’s over the line, sailor. We do not insult each other in GQ.
Now that was a bit uncalled for, sailor. Let’s just say we agree to disagree.
>> We do not insult each other in GQ
Of course we don’t and I would not dream of doing it.
Or are you telling me you somehow have reasons to believe the word “obstinate” might fit anyone around here? If so, don’t blame me. I never said it. But anyone is allowed to try on the shoe for fit, and if they believe it fits don’t blame it on me.
BTW, Webster’s does not associate any offensive connotation with “pigheaded”; it just neutrally defines it as “willfully or perversely unyielding : OBSTINATE”.
I am trying to think of someone I might use as an example but right now I cannot think of anyone… can you?