Knowledge given or received (The Highroads Dictionary, Nelson, 1969)
But I’ve been reading Scientific American, and I am intrigued by the idea that information is actually the most important aspect of this universe, with matter and energy acting as supporting players.
How can this be, though, considering that information (in essence) is an abstract entity?
I don’t want to get into a philosophical debate about how ideas are also intangible, “but look how much they affect the world, man”. What I mean is how can information, by itself, affect the physical world so profoundly? What property of information allows it to do so?
Slight side Q: I know that the study of information is called Information Science, but it is my understanding that this relates primarily to information that is processed and stored via electronic means (i.e. Information Technology). So what is the subject that deals with information, with respect to the above question(s), called? I mean not just the path of information through electronic systems, but its path through all types of systems.
By itself I don’t think information is capable of anything. If there is no intelligence to organise it into a logical process and then make use of it, what good is information on its own?
But this probably doesn’t help answer your question…
The issue here is a number of different concepts sharing the same names. Probably it’s best to start with Claude Shannon.
Shannon founded “information theory” while working for Bell Labs, in the paper A Mathematical Theory of Communication[1]. In it he developed a notion he called “entropy”, named after the quantity in thermodynamics. It measured the extent to which one didn’t know the result of a random event beforehand. I’ll skip the technical details. Suffice it to say that it came to be measured in “bits” (this was the first use of the term), one of which was the entropy of the flip of a fair coin, and that it possessed certain properties, such as the fact that the entropy of an event consisting of two independent events was the sum of their entropies (flipping two fair coins has two bits of entropy), or that events with more possible outcomes (generally) had higher entropies.

“Information” was the amount of entropy by which a system was reduced when an observation was made. If you have two fair coin flips to consider (two bits) and flip one coin, you still have one bit of entropy, so that coin flip has given you one bit of information.

To say this paper’s influence on mathematics (and now computer science) was enormous is an understatement. Computer science as we know it literally could not exist without Shannon’s work.
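Shannon’s definition is simple enough to play with directly, if you’re curious. Here’s a minimal sketch (Python is my choice of language here, obviously nothing from the 1948 paper): the entropy of a probability distribution is H = -Σ p·log2(p) over its possible outcomes.

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum of p * log2(p) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))    # one fair coin: 1.0 bit
print(entropy_bits([0.25] * 4))    # two independent fair coins: 2.0 bits (entropies add)
print(entropy_bits([0.9, 0.1]))    # a biased coin is more predictable: ~0.47 bits
```

The fair coin comes out to exactly one bit, the two independent flips to two, and the biased coin to less than one, just as described above.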
Now, much more recently, some physicists have run with the name “entropy” and treated Shannon’s quantity as if it were the one from thermodynamics. No, I’m not saying they were misinformed or anything. They saw that “real” entropy is, in a sense, a measure of how much detail is missing from a description of a system. While there may be very few possible configurations for a crystal of carbon near absolute zero (low entropy), when it’s turned into a gas there are quite a few possible configurations with the same temperature and pressure. If that’s all you know, there’s a lot of entropy in the system. The physicists tried to adapt the theory of information to the thermodynamic picture with (as I see it) mixed results. Formally it goes through well enough, but I think the picture is unwieldy and much more difficult to work with.
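For what it’s worth, the formal bridge is basically a change of units (my summary, not a quote from anywhere): Gibbs’s statistical entropy is Shannon’s entropy measured in joules per kelvin rather than in bits,

$$S = -k_B \sum_i p_i \ln p_i = (k_B \ln 2)\,H, \qquad H = -\sum_i p_i \log_2 p_i,$$

and when all $\Omega$ microstates are equally likely this collapses to Boltzmann’s $S = k_B \ln \Omega$.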
Now, what you’re probably reading is the result of some physicists pushing this idea to its breaking point. In fact, there’s a mathematical theory of quantum information used in quantum computation, and some physicists (I believe at the Perimeter Institute) have developed a theory that the universe “is” a giant network of quantum computers and everything we experience is an “emergent property”. Whether this is true or not is really more of a metaphysical question; thus my scare quotes on “is”, since it’s really their model that is a quantum network.
As to your side question: What I see called “Information Science” is generally a watered-down version of computer science which looks more at the “big picture” and is almost more managerial than engineering in character. This is in contradistinction with information theory, which is Shannon’s opus.
[1] C. E. Shannon, “A Mathematical Theory of Communication”, Bell System Technical Journal, vol. 27, pp. 379–423 and 623–656, July and October 1948.
What you might be looking for is theoretical computer science.
Computer science has a misleading name (although that’s not really the field’s fault). Theoretical CS, at least, has nothing to do with electronic information technology. There are far more general concepts of computation used for theoretical purposes, but those are not what people think of when they hear “computer”.
There’s a newer notion of information dealing with bit strings (any sequence of 0s and 1s). Basically, the information contained in a string is the length of the shortest program with no input that will print out that string. “Kolmogorov complexity” is a good search term to start with, although you may have difficulty finding an explanation that’s comprehensible without some knowledge of computation theory.
This is not related to the article mentioned in the OP, but still worth mentioning.
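If you want a hands-on feel for the idea, off-the-shelf compression makes a crude stand-in (emphasis on crude: Kolmogorov complexity proper is defined in terms of programs, not compressors, and it isn’t computable at all). A patterned string squeezes down to almost nothing, while a random one barely shrinks. A quick Python sketch:

```python
import os
import zlib

def compressed_length(data: bytes) -> int:
    """zlib-compressed size of data -- only a rough upper-bound proxy
    for Kolmogorov complexity, which has no computable exact value."""
    return len(zlib.compress(data, 9))

patterned = b"01" * 500        # 1000 bytes with an obvious short description
random_ish = os.urandom(1000)  # 1000 bytes with (almost surely) no pattern to exploit

print(len(patterned), compressed_length(patterned))      # 1000 -> a few dozen bytes
print(len(random_ish), compressed_length(random_ish))    # 1000 -> roughly 1000 bytes
```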
This sort of thing has always bugged me. In what sense do they mean “program”? I can see that it’s shorter to say “print 1, then 0 alternately” than “print <sufficiently long string of random bits>”, but surely some language must be picked as an arena for testing, even just to compare lengths. Otherwise the words (or commands) used to describe a given pattern might make a given string have a shorter “program” than a certain random string in one language, but a longer one in a different language. If you want to measure any of these “lengths”, you certainly need a specific language picked out. Which one?
As long as you fix a language and stick with it, it doesn’t really matter which one. Generally, you’d use the standard encoding of a Turing machine, but there’s no requirement.
Since the information content of a bit string is generally not computable, it doesn’t really matter.
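For the record, the result that makes “it doesn’t really matter” precise is the invariance theorem (my gloss, not a quote from anywhere in this thread): for any two universal languages $U$ and $V$ there is a constant $c_{U,V}$, depending only on that pair and not on the string, such that for every string $x$

$$K_U(x) \le K_V(x) + c_{U,V}.$$

So switching languages shifts every “shortest program” length by at most a fixed constant, which washes out for long strings.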
“We want information… information… You won’t get it!”
(Sorry, sorry. Back to the point.)
Physicists and computer scientists may share common hairstyles, but they don’t quite agree on the meaning of “information”. To a computer scientist studying information theory, the atom of all information is a single binary digit. The more bits there are, the more information. What those bits might represent — a work of Shakespeare, an image, a sound waveform — is a separate concept. Likewise, the physical representation of those bits — whether as magnetic particles on a hard disk, or little tiny pits on an optical disc — is not considered important to the discussion.
I think to a physicist, information means those physical effects that can be measurably distinguished from one another. I’ve usually seen the term brought up when discussing quantum mechanics phenomena. For example, two photons with the same polarization angle leave in opposite directions from the event that created them. This angle is indeterminate until you measure one of the photons, at which point you instantly know the other one’s angle, even though it’s now far far away. Although this “information” about the state of a remote object seems to travel faster than light, there is no real information (physical effect) traveling at that speed.
So when a Scientific American article is discussing information, it’s most likely using the term in this sense — the propagation of physical effects through space and over time.
And then of course there’s this everyday meaning of the word “information”, which probably can’t be pegged down any more precisely than that (seems to me), as it depends on the nature of human consciousness. Physicists and computer scientists don’t like to complicate their models with such fuzzy and subjective notions.
Oh, but plenty of Dopers do. Post it, and they will come…
Well, as I said, there needs to be a requirement as I can construct pathologies when there isn’t one. I’ll take your word that there is a “standard” encoding of Turing machines.
So, what’s the use of this concept if you can’t compute it? I’m hardly a concretist, but I can’t see what one could do with these if one is never allowed to compute them.
Again, the only requirement is that the language be fixed. The entry at DADS fixes it as input to a UTM, but there’s nothing wrong with using C or Lisp.
Just because you can’t compute it (in general; the information content of 0[sup]123[/sup], for instance, is computable) doesn’t mean you can’t prove theorems about it. There are whole big books on the subject out there. I’m really only familiar with the basics, but it’s supposed to be a pretty important concept in theoretical CS.
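To make the 0[sup]123[/sup] remark concrete (with Python as my arbitrary choice of fixed language, nothing canonical about it):

```python
# Writing the string out literally takes 123 characters...
s = "0" * 123

# ...but this 16-character program reproduces it exactly, so relative to
# Python the string's complexity is far below its length -- roughly the
# cost of writing down the number 123, plus a constant.
program = 'print("0" * 123)'
print(len(s), len(program))   # 123 16
```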
It is my understanding that “information” in physics refers to the states of matter. For example, you might describe a particle by spin, isospin, quantum number, etc. If you had complete information about all aspects of a system, you could in theory determine its prior state by a reversal of the time direction.
This leads to the so-called “information problem” with black holes, because it seems as though information about the matter falling into a black hole is irretrievably lost, so you could not “run the tape backwards” to find the state of the matter before it entered the black hole. Some physicists feel that black hole evaporation may be a mechanism to recover the “lost” information.
I cannot think of a good definition of “information” and, in fact, the word would have different definitions depending on how it is being used, but a general description would be that it is anything which enters through the senses and increases knowledge.
Information is a concept that requires the understanding that something means or represents something else. It requires the intelligence to understand and process that. Information is meaningless in a world with no intelligence, which describes most of the universe.
Information implies a certain level of abstraction and therefore a need for intelligence to interpret it. Some sound waves hit my ears and those of my friend. We both received the same stimulus, and yet we received very different information. All I can figure out is that the guy talking is speaking Chinese, while my friend can actually understand what is being said. The information is independent of the medium used to carry it; they are separate and distinct things. The door slamming and the noise of the door slamming are different things. When I hear the door slam shut I think of the door slamming shut, not of the noise itself.
Actually, in my experience, even within the technical literature on the subject, physicists tend to use fairly informal notions of “information” in discussing the impossibility of FTL transmission. Greene seems squarely within this tradition.
I say again: there are two different ideas here with the same name. There is the informal notion of “description about a system” like you say and the technical notion of “loss of entropy” as espoused by Shannon. This is what the OP was referring to as a proposed building block of the universe.
The “information loss” in a black hole as generally discussed is mostly the first sort of information, but on a very technical level it has something to do with the second as well. Telling whether words like “information” and “entropy” are meant in their casual, information-theoretic, or thermodynamic senses can get to be a delicate matter.
Information is more than all these things. It has mass. Or, perhaps better to say, information is what mass is ultimately made of: about 10^65 bits per kilogram, IIRC. This picture has been growing over the last twenty years. Of course, for a century now we’ve known that mass and energy are different sides of the same underlying thing, so now information is yet another side of that same thing.
There was a great article about it in Scientific American perhaps a year ago.