Check Wikipedia if you like, but "byte" doesn't necessarily mean 8 bits - although it almost always does.
Recognizing non-8-bit bytes is a good way to show off your computer nerd skills. Chicks dig that, or so I tell myself.
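For what it's worth, C still refuses to promise 8-bit bytes: CHAR_BIT in <limits.h> only has to be at least 8. A minimal check (nothing here beyond the standard library):

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        /* CHAR_BIT is the number of bits in this platform's byte.
           The C standard only guarantees it is at least 8; some DSPs
           report 16 or 32 here. */
        printf("bits per byte: %d\n", CHAR_BIT);
        return 0;
    }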
Just like in my example. I very much object to the use of byte as a synonym for character. Besides ASCII characters, PDP-11s had a way of encoding three characters in 16 bits.
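Presumably that's DEC's RADIX-50: a 40-character set, so three characters give 40^3 = 64000 combinations, which is under 2^16 and fits in one 16-bit word. A rough C sketch of the packing (the pack_rad50 helper and its error handling are my own illustration, not DEC's code):

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* RADIX-50 character set: index 0 is space, 1-26 are A-Z,
       27 is $, 28 is ., 29 is unused (shown as %), 30-39 are 0-9. */
    static const char rad50[] = " ABCDEFGHIJKLMNOPQRSTUVWXYZ$.%0123456789";

    /* Pack three characters from the set into one 16-bit word;
       anything outside the set is treated as a space. */
    uint16_t pack_rad50(const char *s) {
        uint16_t word = 0;
        for (int i = 0; i < 3; i++) {
            const char *p = s[i] ? strchr(rad50, s[i]) : NULL;
            word = word * 40 + (uint16_t)(p ? p - rad50 : 0);
        }
        return word;
    }

    int main(void) {
        printf("\"ABC\" -> %u\n", pack_rad50("ABC")); /* 1*1600 + 2*40 + 3 = 1683 */
        return 0;
    }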
Bytes at their most fundamental level became standard once we got rid of processors whose word lengths weren't powers of 2. I'm not sure what "usable" data means - status and parity bits are damn well usable, if not by the end user directly. In any case, for many of these applications there is no good reason to have one basic data size shared across several applications and standards. There is in architecture, at least nowadays.