BCD makes more sense when you know it stands for ‘Binary-Coded Decimal’: each decimal digit (0-9) is encoded into 4 bits (a nybble). It was invented because dedicated circuitry can store, compare, and sort numbers in BCD form directly, without relying on a microprocessor; historically this was tied to punch card processing, where specialized sorting hardware handled the cards rather than wasting extremely expensive computer time on such a task.
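For concreteness, here's a minimal sketch in C of the encoding: one byte holds two decimal digits, tens in the high nybble and ones in the low nybble. The function names `to_bcd`/`from_bcd` are just illustrative, not from any particular library.

```c
#include <stdint.h>
#include <stdio.h>

/* Pack a two-digit decimal value (0-99) into one BCD byte:
   tens digit in the high nybble, ones digit in the low nybble. */
static uint8_t to_bcd(uint8_t dec)
{
    return (uint8_t)(((dec / 10) << 4) | (dec % 10));
}

/* Unpack a BCD byte back into its decimal value. */
static uint8_t from_bcd(uint8_t bcd)
{
    return (uint8_t)((bcd >> 4) * 10 + (bcd & 0x0F));
}

int main(void)
{
    uint8_t dec = 42;
    uint8_t bcd = to_bcd(dec);  /* 0x42 -- the hex representation reads like the decimal value */
    printf("%u -> 0x%02X -> %u\n", dec, bcd, from_bcd(bcd));
    return 0;
}
```

Note how the BCD byte for 42 prints as 0x42: each nybble is literally the digit it represents, which is why simple display hardware can drive a seven-segment digit straight from a nybble.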
I’ve used BCD a few times as an interface between instruments and computers. The advantage is that it’s easy to make the computer and the instrument’s display agree without any handshaking or conversion, since the digits on the front panel map one-to-one onto the nybbles sent over the wire. Beats having people call to ask why the two show different numbers.
But I haven’t used it in a few years.