A ternary computer?

Morse code is a binary encoding used in a communication protocol based on two states for the code and asynchronous timing. A third state is required to maintain the timing. Computers usually use synchronous timing, or contextual asynchronous timing, so two voltages can be used to represent the states. However, some asynchronous communication is also based on a third state, generally implemented as negative and positive voltages to distinguish between 1 and 0, and ground to represent the spacing (the sign of the voltage being relative to ground). With synchronous timing, Morse code could be purely binary, with an additional character used to represent an inter-word space, just as most electronic character codes such as ASCII do. Other contextual asynchronous timing modes could be used, but they would require more bandwidth to distinguish the changes in state. There is a difference between data encoding and communication protocols. Protocols may involve low-level timing considerations, such as the inter-character spacing in Morse code, or high-level packet concerns, such as the inter-word spacing and the code sequences used to start communication.
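For what it’s worth, here is a minimal sketch of the “purely binary once the timing is synchronous” point, assuming the conventional 1/3/7-unit Morse timing (the tiny code table and the function names are just my own illustration):

```python
# Sketch: with synchronous timing, Morse becomes a plain bitstream via on/off keying.
# Assumed convention: dit = 1 unit on, dah = 3 units on, 1 unit off inside a letter,
# 3 units off between letters, 7 units off between words.

MORSE = {"S": "...", "O": "---", "E": "."}  # tiny excerpt of the Morse table

def letter_to_bits(letter: str) -> str:
    """Key-down units as 1s, with a 1-unit key-up gap between the dits/dahs."""
    elements = ["1" if e == "." else "111" for e in MORSE[letter]]
    return "0".join(elements)

def message_to_bits(message: str) -> str:
    """Join letters with 3-unit gaps and words with 7-unit gaps."""
    words = ["000".join(letter_to_bits(c) for c in word)
             for word in message.upper().split()]
    return "0000000".join(words)

print(message_to_bits("SOS"))  # 101010001110111011100010101
```

On this view the inter-word space is just a longer run of 0s, i.e. one more code word in a binary alphabet, rather than a third signal state.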

If the bit strings turn constant at some point, aren’t there a finite number of results from the binary operations on any set of infinite strings? The infinite bits that remain constant sound like the infinite number of leading zeros in front of any number, then.

The infinite tail-end of constant bits actually is a lot like the infinite leading zeros in front of the digits of any natural number. Yet there are still infinitely many natural numbers, aren’t there? So I’m not sure what conclusion you are drawing here.

For example, {0000…, 10000…, 110000…, 1110000…, 11110000…, …} is an infinite set of eventually constant infinite bitstrings. But, if I were to apply, say, negation to it, I wouldn’t end up with just a finite set of results. No, I’d get the infinite set {1111…, 01111…, 001111…, 0001111…, 00001111…, …}.
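If it helps, here is that example in code, writing each eventually constant string as its finite varying prefix plus the bit it settles on (that pair representation is just my own shorthand):

```python
# An eventually constant infinite bitstring as (finite prefix, constant tail bit):
# ("", 0) is 0000…, ("1", 0) is 10000…, ("111", 0) is 1110000…, and so on.

def negate(prefix: str, tail: int):
    """Flip every bit: the prefix bitwise, and the constant tail."""
    flipped = "".join("1" if b == "0" else "0" for b in prefix)
    return flipped, 1 - tail

examples = [("", 0), ("1", 0), ("11", 0), ("111", 0), ("1111", 0)]
for p, t in examples:
    print(negate(p, t))
# ('', 1), ('0', 1), ('00', 1), ('000', 1), ('0000', 1)
# i.e. 1111…, 01111…, 001111…, 0001111…, 00001111… — still infinitely many distinct results.
```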

Does that answer your question about “aren’t there a finite number of results from the binary operations on any set of infinite strings”? I may have misinterpreted it.

I should have said ‘on any set of infinite strings that become constant at some point’. But I guess I was considering the strings to all go constant at the same finite point. The natural numbers (those are integers, right?) could still have an infinite number of bits before they become constant. Or something, but I think I understand what you are saying now. Thanks.

Right, it’s key that the point at which the bits go constant isn’t fixed, and can be different in different strings.

(And, yeah, the natural numbers are just the non-negative integers. I tend to not say “integers” unless I really want to talk about the negatives as well, though I suppose it’s often overcautious in casual conversation.)

Well, I think of everything as integers. Computers were the natural choice for me; I can only count to 1. So it’s hard to remember that other numbers aren’t just representations in bits, of which there are always a finite number. I’m going to get a headache from the answer I guess, but what can you do with those non-2^n Boolean ops?

The non-2^n Boolean algebras don’t have any different Boolean operations. They have all the same Boolean operations as the 2-element Boolean algebra; they’re just applying those operations to different things.
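To make that concrete with the eventually constant bitstrings from before (same prefix-plus-tail-bit shorthand as above, purely for illustration): the operations really are the familiar two-valued AND, OR and NOT, just applied position by position to a different carrier set.

```python
# Same truth tables as the 2-element Boolean algebra, applied pointwise to
# eventually constant bitstrings written as (finite prefix, constant tail bit).

def pad(prefix: str, tail: int, length: int) -> str:
    """Extend the prefix with the constant tail bit out to `length` positions."""
    return prefix + str(tail) * (length - len(prefix))

def pointwise(op, x, y):
    """Apply a two-valued Boolean op position by position; the result is again
    eventually constant, with tail op(x_tail, y_tail)."""
    n = max(len(x[0]), len(y[0]))
    a, b = pad(*x, n), pad(*y, n)
    prefix = "".join(str(op(int(p), int(q))) for p, q in zip(a, b))
    return prefix, op(x[1], y[1])

AND = lambda p, q: p & q
OR  = lambda p, q: p | q

x = ("1101", 0)   # 1101000…
y = ("10", 1)     # 101111…
print(pointwise(AND, x, y))  # ('1001', 0), i.e. 1001000…
print(pointwise(OR,  x, y))  # ('1111', 1), i.e. 1111111… — same ops, different carrier.
```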

Knowing what kinds of Boolean algebras exist only matters if you care about the relationships between different structures that happen to have the same theory applicable to them. If all you care about is what the operations and laws of Boolean logic are, then the 2-element Boolean algebra is perfectly sufficient…

General Boolean algebras arise when you’re interested in different ways of producing bits from unknown data, which you can combine using any finitary Boolean operations, but which you aren’t able to combine infinitarily.

So, for example, imagine we’re playing twenty questions; I’ve picked an item, any item out of the infinitely many things that I can think of in the world, and you’re allowed to ask me any yes/no question about it which you like in (a suitably idealized, finitary form of) English. What does the structure of all the queries you could make look like?

Well, you can combine questions using any Boolean operations (e.g., combining “Is it a person?” and “Is it in this room?” into “Is it both a person AND in this room?”), and some questions are guaranteed to amount to the same thing (e.g., “Is it a bat?” and “Is it a real-world flying animal AND a vertebrate AND (NOT (a bird or extinct))?” are guaranteed to come out the same). Furthermore, all the rules of Boolean logic hold (e.g., “Is it American AND (either (NOT American) OR a woman)?” is guaranteed to come out the same as “Is it American AND a woman?”, in accordance with the rule p & (~p | r) = p & r). So we can combine queries with Boolean operators, and query-equivalence satisfies the laws of Boolean logic. In this sense, the queries you can ask have the structure of a Boolean algebra; a non-2^n Boolean algebra at that. Is it a very important structure to study? Well, that’s up to you. But it’s there and has certain relationships with the 2-element Boolean algebra and other Boolean algebras which are sometimes worth thinking about.
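Here is a rough sketch of that query algebra in code, with items modelled as little dictionaries and queries as yes/no predicates on them (the specific attributes are made up purely for illustration):

```python
# Queries = yes/no functions on items. Combine them with NOT/AND/OR, and the
# laws of Boolean logic hold query-by-query.

def NOT(q):      return lambda item: not q(item)
def AND(q1, q2): return lambda item: q1(item) and q2(item)
def OR(q1, q2):  return lambda item: q1(item) or q2(item)

is_american = lambda item: item["american"]
is_woman    = lambda item: item["woman"]

lhs = AND(is_american, OR(NOT(is_american), is_woman))  # p & (~p | r)
rhs = AND(is_american, is_woman)                        # p & r

items = [{"american": a, "woman": w} for a in (True, False) for w in (True, False)]
print(all(lhs(item) == rhs(item) for item in items))    # True: the two queries are equivalent
```

Two queries count as the same element of the algebra exactly when they come out the same on every item, which is what that final check verifies for this pair.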