Alien data storage?

I read a book about paradoxes by Martin Gardner a long time ago. That’s where this comes from. I just can’t remember why he said it wouldn’t work (although I remember there was definitely some problem).

Alien data storage:
1) Assign a number to every letter and character in our language; for example, 1 = a, 2 = b … 345 = &, and so on. Maybe let 0 be a marker between numbers to keep clear where one number ends and another begins. Also assign a number to a single space and another number to the FINAL character.
2) Translate a text into this long string of numbers (much like binary, I assume).
3) Place a decimal point in front of this number.
4) Since any terminating decimal can be expressed as a fraction, change this long and complicated decimal into a long and complicated fraction.
*5) Using a super-duper-uberprecise machine, make a tiny mark on a metal rod which divides it exactly according to the fraction from #4.
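In code, steps 1–4 (plus a software stand-in for the step-5 reader) might look like the sketch below. The particular code table is my own toy assumption, with codes chosen to contain no digit 0 so the 0 separator stays unambiguous:

```python
from fractions import Fraction

# Toy code table (an assumption -- the post doesn't fix one). Codes are
# chosen to contain no digit 0, so 0 can safely act as the separator:
# a=1 ... i=9, j=11 ... r=19, s=21 ... z=28, space=29.
vals = [n for n in range(1, 30) if '0' not in str(n)]
CODE = dict(zip('abcdefghijklmnopqrstuvwxyz ', vals))
DECODE = {v: k for k, v in CODE.items()}

def encode(text):
    """Steps 1-4: text -> 0-separated digit string -> fraction in (0, 1)."""
    digits = '0'.join(str(CODE[c]) for c in text.lower())
    return Fraction(int(digits), 10 ** len(digits))   # i.e. 0.<digits>

def decode(frac):
    """The step-5 reader, in software: peel decimal digits back off."""
    out = []
    while frac:
        frac *= 10
        d = frac.numerator // frac.denominator   # next digit after the point
        out.append(str(d))
        frac -= d
    return ''.join(DECODE[int(chunk)] for chunk in ''.join(out).split('0'))

print(decode(encode('hi there')))   # hi there
```

Note that the scheme as literally stated has a wrinkle the no-digit-0 codes avoid: if plain 1, 2, 3 … numbering is used, codes like 10 or 20 contain the separator digit and decoding becomes ambiguous.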

Supposedly, this rod could contain the entire Encyclopedia Britannica, or for that matter all the works of mankind. Obviously, #5 is not within our technological means. And this device would require an equally accurate decoding device, but I seem to remember something more fundamentally wrong with such an idea.

Thoughts?

Unless it’s a very large rod, you’d be trying to scratch a mark two-thirds of the way across an electron; not just beyond our current technology, but actually impossible.

Anyway, I think you have the idea a little mixed up; using the string of numbers that you described, your mark would encode a single character - what you need is a string known to contain every possible combination of characters.

If you’re working in base 26 (assuming just capital letters), then you can turn the entire Britannica (capitals only) into a fraction as suggested, not just a single character. Each character would be one base-26 digit of the fraction. Your base-26 fraction can be converted to a base-10 fraction if you prefer to measure in base 10.

If you work in base 256, you can represent one character from the full 8-bit character set in each digit of the fraction, and so encode the entire Britannica, relevant punctuation included.
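The base-256 version is easy to sketch, since a text’s byte string read as one big integer is exactly the numerator of such a fraction. A minimal sketch, assuming plain ASCII input (function names are mine):

```python
from fractions import Fraction

def text_to_fraction(text):
    """Each byte is one base-256 'digit' after the point: 0.b1 b2 ... bn."""
    data = text.encode('ascii')
    num = int.from_bytes(data, 'big')        # the digits b1..bn as one integer
    return Fraction(num, 256 ** len(data))

def fraction_to_text(frac, n):
    """Invert, given the byte count n -- the 'final character' problem:
    the fraction alone can't say where trailing zero bytes end."""
    num = frac.numerator * 256 ** n // frac.denominator
    return num.to_bytes(n, 'big').decode('ascii')

f = text_to_fraction("Hello!")
print(fraction_to_text(f, 6))   # Hello!
```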

However, the point is moot – measuring 2/3 across an electron is still impossible as mentioned. :slight_smile:

If your rod was one meter long, you could only get about seven characters before your scratch mark would be smaller than an atom, which is clearly impossible. You couldn’t even encode “What’s up?”.
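The arithmetic behind that estimate, using my own round figures (an atom taken as about 1e-10 m, a 26-letter alphabet):

```python
import math

# Rough capacity estimate: positions = rod length / mark size;
# characters = log base 26 of the number of positions.
# The ~1e-10 m atom size is my own round figure.
def capacity_chars(rod_m, mark_m=1e-10, alphabet=26):
    positions = rod_m / mark_m
    return math.log(positions, alphabet)

print(round(capacity_chars(1.0)))   # about 7 characters for a 1 m rod
```

A one-meter rod gives about 10^10 atom-sized positions, and 26^7 is already about 8 × 10^9, hence the seven-character figure.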

Here’s last years take on the problem:
Incredible information in just one Scratch

Another possibility is to interpret the digits of, say, pi as alphabetic characters, then find the position where your encoded text naturally occurs within it. The drawback is that the time taken to search for the string increases way out of proportion to the increase in string length; you also end up encoding your string as a truly massive number describing the position, so I’m not sure there’s any saving to be had.
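For what it’s worth, here’s a toy version of that search. The pi-digit routine uses Machin’s formula, and the a=01 … z=26 encoding is my own assumption; as the post says, anything beyond a couple of letters would force you to search impractically far into pi:

```python
def pi_digits(n):
    """First n decimal digits of pi (as a string), via Machin's formula
    pi = 16*arccot(5) - 4*arccot(239) in pure integer arithmetic."""
    unity = 10 ** (n + 10)                    # 10 guard digits
    def arccot(x):
        total = term = unity // x
        k, sign = 3, -1
        while term:
            term //= x * x
            total += sign * (term // k)
            k, sign = k + 2, -sign
        return total
    return str(16 * arccot(5) - 4 * arccot(239))[:n]

# Encode a (two-letter) message as a=01 ... z=26 and hunt for it in pi.
msg = 'hi'
needle = ''.join(f'{ord(c) - 96:02d}' for c in msg)   # '0809'
pos = pi_digits(20000).find(needle)
print(pos)   # index of the first occurrence, or -1 if it isn't that early
```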

Perhaps a better idea would be to send a stopwatch and a million monkeys with typewriters with the instruction to start taking notice of what they typed after x seconds.

Thanks everyone for jogging my memory. I’ll stick to my tape drive for now.

Okay then, let’s go with something a little more prosaic. Make the “rod” bigger: have, for example, a mark with a resolution of, say, a millimeter in San Francisco and another in New York. The distance between them in thousandths of a mm is your number.

Or, with perhaps not-so-impossible measuring devices, millionths.

We can measure to the Moon and back with a resolution of centimeters; how about that distance as of X date and time, measured from X point to X reflector, artificially converted to nanometers?

Sure, we couldn’t encode any huge works, but maybe a “Congratulations on finding this!”, kind of like the gold record on Voyager.
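A quick back-of-envelope on the Moon idea, with my own rounded figures: at the stated centimeter resolution, the distance only distinguishes about 4 × 10^10 values, which is roughly seven capital letters’ worth:

```python
import math

moon_m = 3.84e8      # mean Earth-Moon distance in meters (my round figure)
cm_res = 1e-2        # the centimeter-level ranging resolution mentioned

distinguishable = moon_m / cm_res        # ~3.8e10 distinct readings
chars = math.log(distinguishable, 26)    # capital-letter capacity
print(round(chars))  # about 7 letters; "Congratulations on finding this!"
                     # needs 32
```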

I think you’re referring to Gardner’s “History of Logic Machines” or some similar title?

He takes us from early mechanical devices (i.e., clocks) to computers of the ’50s and the then-current tape drive “computers” that are capable of doing math.

He theorizes on the future of computers and their logic capabilities and what type of storage system would be required. Tape would be too slow and would not hold nearly enough data.

The biggest problem IIRC that he discusses is in the programming software. HOW TO MAKE THE MACHINE THINK, so it isn’t just a big calculator.

I don’t recall any talk of “alien technology”.

Not even. Taking a rod the length of the galaxy (30 kpc) and making a mark the radius of a hydrogen atom, you get 22 characters.

It would be very easy to encode a lot more than this on a rod. All you’d have to do is use more than one notch.

That doesn’t sound quite right; are you saying that the precise position of a hydrogen atom within the length of the galaxy requires only 22 (or 22 pairs of) digits to represent?

Mangetout: No, it takes the length of the galaxy to encode all possible permutations of 22 characters.

Doh! Assuming Achernar worked the numbers correctly.

I jumped. I shouldn’t have. Achernar and Mangetout are both right.:smack:

That doesn’t sound right to me either. It depends on what you mean by the “size” of an atom, but the Bohr radius is about 10[sup]-8[/sup] cm. This means that, given a 1-cm rod, there are 10[sup]8[/sup] places where you can make a distinct mark, so you can record an 8-digit number on this rod. If the bar is one light-year across, that’s about 10[sup]18[/sup] cm, so there are 10[sup]8[/sup]*10[sup]18[/sup]=10[sup]26[/sup] places for a mark, and you can record a 26-digit number. A galaxy is about 10[sup]5[/sup] light-years across, so you can encode 31 digits.
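Plugging in slightly less rounded figures (Bohr radius 0.53 × 10^-8 cm, one light-year as 9.46 × 10^17 cm, my numbers) gives much the same answer, and also recovers the 22-character count mentioned upthread:

```python
import math

bohr_cm = 0.53e-8    # Bohr radius in cm (roughly the 10^-8 cm stated)
galaxy_ly = 1e5      # rough galactic diameter in light-years
ly_cm = 9.46e17      # one light-year in centimeters

positions = galaxy_ly * ly_cm / bohr_cm    # distinct mark positions
digits = math.log10(positions)             # decimal digits of an address
chars = digits / math.log10(26)            # capital-letter characters
print(round(digits), round(chars))         # roughly 31 digits, 22 letters
```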

What about thermal expansion? If the temperature of the rod changes, the message will be garbled. You can keep the rod in a controlled environment, but I suspect the limit of your data storage will be more determined by the accuracy of stabilizing the temperature than the size of an individual atom.
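A rough sense of scale for that worry, assuming a steel rod (the expansion coefficient of about 1.2 × 10^-5 per kelvin is my figure):

```python
# How far does a 1 m steel rod's end move per kelvin, versus an atom?
alpha = 1.2e-5       # steel's linear expansion coefficient, per K (approx)
rod = 1.0            # rod length in meters
atom = 1e-10         # rough atomic diameter in meters

shift_per_K = alpha * rod            # 12 micrometers per kelvin
atoms_of_drift = shift_per_K / atom
print(atoms_of_drift)    # ~1.2e5 atom-widths per kelvin of change
```

So even a millikelvin of drift smears the mark across more than a hundred atoms, which supports the point that temperature control, not atomic size, sets the practical limit.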