A 'computer' method to determine the date of Adam and Eve

There’s a joke currently going around on Reddit about the oldest computer: https://www.reddit.com/r/Jokes/comments/6090zx/the_oldest_computer/?utm_content=title&utm_medium=hot&utm_source=reddit&utm_name=frontpage

It basically says the first computer had one byte and was found in the Garden of Eden.

So, for purely pointless reasons, can we use Moore’s Law, or some equivalent, to extrapolate back in time to when the first computer had 1 byte, and ‘scientifically prove’ that Adam and Eve lived (and, by extension, the universe began) xxx number of years ago?

While this might be fun, I’m stupid enough about computers to realize that this could be a complex question. I don’t know what 1 byte is, whether there is some ‘Moore’s Law’ equivalent that can be used, or whether ancient abacus calculations technically had more than 1 byte. I’d just be amused to be able to say that, based on the computing-power regression of this law, we know the age of the earth to be August 17th, 1836, or something like that.

So this is purely a bullshit ‘let’s drink and give it a shot’ exercise. I don’t want to hear about how this is a stupid idea or impossible to determine. If I wanted abuse, I’d go down the hall and take the first right into room 12.

This is nothing but a joke, but if you want to take it seriously, let’s suppose that Moore’s Law says the number of transistors on a chip doubles every two years. (There are various other ways of interpreting it. Some say the doubling happens over a slightly different time period; some say it has to do with the speed, or the amount of storage, or something else in a computer. I’m going to use the interpretation I’ve just given.) It appears that it was possible to fit 2,300 transistors on a chip in 1971. 2,300 is a little more than 2 to the 11th power, so that’s a little more than 11 doublings above a single transistor. That means it would have been possible to fit one transistor on a chip about 22 years before 1971, which would be 1949. So Adam and Eve ate the apple in 1949. O.K., maybe 1948, since 2,300 is a little more than 2 to the 11th power, which pushes the extrapolation back slightly more than 22 years.
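If you want to check the arithmetic, here’s a minimal sketch of that extrapolation, assuming the figures above (2,300 transistors in 1971, which happens to be the Intel 4004, and a two-year doubling period):

    import math

    # Moore's Law as interpreted above: transistor count doubles every two years.
    TRANSISTORS_1971 = 2300        # transistors on a 1971 chip (the Intel 4004)
    DOUBLING_PERIOD_YEARS = 2

    # Number of doublings between one transistor and 2,300 transistors.
    doublings = math.log2(TRANSISTORS_1971)          # about 11.17
    years_back = doublings * DOUBLING_PERIOD_YEARS   # about 22.3 years

    print(f"doublings: {doublings:.2f}")
    print(f"one-transistor 'chip' around: {1971 - years_back:.1f}")  # about 1948.7

Which lands a bit before 1949, hence the “O.K., maybe 1948.”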

A byte does not require any transistors. The famous computer that gave us the term “bug” stored data in relays, and the unbuilt Babbage engine stored data mechanically. The trick is to find a computer that has only one byte (eight bits, or a numeric value between 0 and 255) and is still capable of doing useful work with that much information.
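For concreteness, and assuming the now-standard 8-bit byte, here’s the range a single byte covers:

    BITS_PER_BYTE = 8
    print(2 ** BITS_PER_BYTE)      # 256 distinct values, i.e. 0 through 255
    print(format(255, "08b"))      # '11111111', the largest value one byte can hold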

If you can’t use Moore’s Law, there’s no way of determining the answer to this question.

That’s funny, because 1948 was about the time the transistor was invented, so one transistor was about all you could put “on a chip.” However, the first integrated circuit came around 1959 and had only about ten transistors, not enough for one byte; maybe enough for 2 bits.

Incidentally, “byte” is not a fixed size. I am aware of computers whose byte size was as small as 6 bits and as large as 9, before it settled down to 8. That makes the French word for byte, octet, somewhat misleading.