I don't know how to read a simple clock, please help me.

If you scroll down it tells you how to read it.

If you scroll up, the OP linked to that image. :stuck_out_tongue:

:smack: :smack:

He did indeed.

I have one, too, but mine’s one of the older ones that doesn’t have the pure-binary option. And like Scarlett67, I waste way too much time watching it form interesting patterns.

Having posted that, now I’m wondering if I could actually build it.

I have a clock that worked using rolling balls in accumulators, too. It dropped one ball in the top every minute, and when enough balls had rolled into a levered tray, most of them would roll back to the ball reservoir while one rolled on to the next tray. Unfortunately the motor burned out in a power surge, but while the motor worked, it kept time pretty well: it’d only drop about 2 or 3 balls a week. One of these days, I really ought to see about replacing the motor in it.
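The tray mechanism is really just a cascading counter. Here’s a rough sketch in Python; the tray capacities (4 one-minute balls, 11 five-minute balls, 11 hour balls) are an assumption based on the classic “ball clock” design and may not match the clock described above.

```python
# Sketch of the rolling-ball clock as a cascading counter.
# Tray capacities are assumptions from the classic ball-clock design;
# the actual clock described above may differ.

def drop_ball(trays, capacities):
    """Drop one ball into the first tray, cascading any overflows."""
    for i, cap in enumerate(capacities):
        trays[i] += 1
        if trays[i] <= cap:
            return  # tray not yet full; the ball stays on display
        # Tray tips: the displayed balls return to the reservoir,
        # and one ball rolls on to the next tray (next loop pass).
        trays[i] = 0
    # If the last tray tips, that final ball also returns.

trays = [0, 0, 0]           # one-minute, five-minute, hour trays
capacities = [4, 11, 11]
for _ in range(67):         # simulate 67 minutes of ball drops
    drop_ball(trays, capacities)
print(trays)                # [2, 1, 1] -> 1 hour, 1x5 min, 2 min = 1:07
```

Each tray “overflows” into the next exactly like a digit carrying, just in a mixed radix instead of base 10.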

BCD is commonly used for digital clocks. The internals of this binary clock are probably the same as a standard digital clock’s, except that the binary clock doesn’t need to map each BCD digit to the set of segments to light in a 7-segment display.
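A hedged sketch of the difference: both clocks store the time as six BCD digits; the binary clock wires each digit’s four bits straight to a column of LEDs, while a standard clock runs each digit through a segment lookup first. The segment table below is the conventional a–g encoding, used here purely for illustration.

```python
# Conventional 7-segment encoding (segments a-g) for each decimal digit.
SEGMENTS = {
    0: "abcdef", 1: "bc", 2: "abdeg", 3: "abcdg", 4: "bcfg",
    5: "acdfg", 6: "acdefg", 7: "abc", 8: "abcdefg", 9: "abcfg",
}

def to_bcd_digits(hh, mm, ss):
    """Split a time into six decimal digits, as a clock chip stores it."""
    return [hh // 10, hh % 10, mm // 10, mm % 10, ss // 10, ss % 10]

def binary_clock_columns(digits):
    # The binary clock skips any lookup: each digit's four bits light a
    # column of LEDs directly (bit 3 = "8" LED ... bit 0 = "1" LED).
    return [format(d, "04b") for d in digits]

def seven_segment(digits):
    # A standard digital clock maps each digit through the segment table.
    return [SEGMENTS[d] for d in digits]

digits = to_bcd_digits(12, 34, 56)
print(binary_clock_columns(digits))
# ['0001', '0010', '0011', '0100', '0101', '0110']
print(seven_segment(digits))
# ['bc', 'abdeg', 'abcdg', 'bcfg', 'acdfg', 'acdefg']
```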

Yep. I remember learning these instructions back in the day.

I had one of those. I think I ended up tossing it because it was so noisy. 1:00 was especially bad as I remember.

There are only 10 types of people in this world:

Those who understand binary numbers and those who don’t.

(Okay, old joke. Sorry.)

There are only 10 types of people in the world: Those who understand trinary, those who don’t, and those who don’t have any clue of what trinary is.

Here’s a real binary watch.

Yeah, I first tried to read it as real hex; I was stopped by the thought, “How can it have more than 64 minutes in the hour?” It’s BCD. So it’s only semi-geek.
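For anyone who wants to see the confusion in code: the same bits read as BCD and as one pure binary number give different values. Minute 59 in BCD is the byte 0x59, which reads as 89 if you treat it as a single binary number.

```python
# Why the watch is BCD, not "real hex"/pure binary: minute 59 is
# stored as two 4-bit decimal digits, not one binary number.

def bcd_bits(value):
    """Encode a two-digit value as two BCD nibbles (tens, ones)."""
    return format(value // 10, "04b") + format(value % 10, "04b")

minute = 59
as_bcd = bcd_bits(minute)         # '01011001' (nibbles 5 and 9)
as_pure_binary = int(as_bcd, 2)   # the same bits read as one number

print(as_bcd)          # 01011001
print(as_pure_binary)  # 89 -- nonsense as a minute count
```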

BTW - Even the early (ASCII) microprocessors had packed-decimal math operations. Useful if the chip was intended for calculators, or for business programming with arbitrary-sized numbers where decimal accuracy was necessary (COBOL COMP-3).

What’s an ASCII microprocessor? I know what ASCII and microprocessors are.

The earlier post implied that packed decimal (COMP-3) and decimal arithmetic operations were features of the IBM Mainframe-type EBCDIC-based computers.

Just pointing out that the early microprocessors, like the 8080-based ones that typically used ASCII in their operating systems, also had packed-decimal op-codes. It’s just that most software nowadays tends to use floating-point and pure binary integer math instead. Not sure whether the current Pentium-based processors retain packed-decimal opcodes; they were advantageous for business computers, especially ones running COBOL, that wanted to do pure decimal arithmetic.