So I read up a bit and found this from Shannon:
“The quantity which uniquely meets the natural requirements that one sets up for ‘information’ turns out to be exactly that which is known in thermodynamics as entropy.”
If that is true, can information really be destroyed? Do most information theorists think Shannon was right or wrong about this?
Can you put in a link for that? I thought the definition of entropy in information theory was different from the one in thermodynamics.
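For comparison, if I'm reading the textbook forms right, the two definitions are (Shannon on the left, Gibbs on the right):

H = -\sum_i p_i \log_2 p_i, \qquad S = -k_B \sum_i p_i \ln p_i,

so formally they differ only by the constant factor k_B \ln 2; whether that constant makes them "the same quantity" physically is the part people argue about.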
It’s funny that the Chaitin stuff comes with the twist that it’s about random strings and halting Turing machines, something like the problem the OP was asking about.
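In case it helps, the construction I had in mind (assuming a prefix-free universal machine U) is Chaitin's halting probability

\Omega_U = \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|},

an algorithmically random real whose bits encode the halting problem.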