Einstein was a determinist. Michio Kaku (a Japanese-American theoretical physicist) is an indeterminist.
I am personally, and very strongly, a determinist. I have many reasons for this and would like to discuss it with some smarter people who listen and reply, not fight.
Any truth has to be independent of human intelligence and of our ability to become aware of natural laws, let alone explain them.
That’s why Mr. Kaku’s statements are closer to reality than any “determinist’s”.
Don’t forget: Einstein improved the accuracy of Newton’s gravity laws. Someone will improve on Einstein’s gravity laws too… some time in the future.
The outlook of people like Mr. Kaku allows for the improvement of human knowledge. I’m not saying Einstein’s didn’t, but since you call him a “determinist,” then by definition he must be wrong.
Have to agree with the other fellow — it sounds as if you are staking out a philosophical position. You’re welcome to your opinion, but it doesn’t have much to do with physics.
Einstein formed his views when quantum mechanics was in its infancy. He had great success applying his intuition to certain problems, and was understandably reluctant to accept something as counterintuitive as some results of modern QM and QFT without very strong evidence. He died over 50 years ago, and overwhelming evidence for indeterminacy has been mounting up since even before that, so his views (at least on that subject) have pretty much been refuted.
can you lay out your rationale for why you are strongly a determinist?
i’ve been called a determinist (usually in a pejorative sense) but i don’t think i am per se.
then again, i’ve been called a marxist as well. i certainly don’t think i am that, either.
anyway, i think it’s a nice starting point to hear how you came to this strong conclusion…
I’m always quick to point out I’m not very smart, so bear with me.
But as I understand it, Einstein had a difficult time accepting a lot of QM (or QT) because of how much it stepped on Newtonian physics.
The way my lower-ape brain grasps the two is this: one is a group of theoretical models for understanding the largely unobservable quantum scale, a world where chaos abounds. But things such as quantum indeterminacy, when scaled up to any size that matters (the atomic scale), stop being so random and chaotic and start behaving more predictably, so that at the atomic level QT starts being impractical.
Am I just way off…? My quantum understanding comes from crash research and late-night conversations. My statement that “the universe is ordered, not random” was rebutted with “oh yeah? Just ask quantum indeterminacy about it!” Now that I’ve done some research into it, I guess I totally agree. But ultimately it doesn’t affect things on a practical level.
So all that to say this: Einstein’s aversion to sub-atomic physics probably played into his deterministic view…but on what I probably ignorantly call “the practical scale,” aka the atomic scale, being deterministic doesn’t really conflict with perceivable reality…?
Before we go any further, I have to ask the OP: How familiar are you with Bell’s inequality? If you’re not, then that’s absolutely the first thing that needs to be addressed in this discussion.
It depends how you define “practical level.” If you’re playing pool, then no, the balls are not going to take off at random angles. But if you’re working on the next generation of Intel microprocessors, trying to cram more transistors into a smaller area, then quantum effects create very real problems.
Another perspective is to look not at size, but time. When people thought that Newtonian mechanics was the final answer, philosophers pounced on that as meaning that with perfect knowledge of every particle’s position and velocity at a given instant, the entire future could be deduced. Obviously, perfect knowledge is unattainable, but in principle everything was determined by its previous state an instant ago, and that state was determined by the previous instant, and so on back as far as you want. The logical consequence of that is that our future is predetermined (making the additional assumption that our thoughts are merely the result of molecular interactions in our brains).
With quantum indeterminacy, one instant does not foretell the next. The difference between the predicted and actual state from one instant to the next will be small, but they will add up, so that the future quickly becomes unpredictable.
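To see how fast “small differences add up” can bite, here’s a toy sketch. It’s classical chaos rather than quantum indeterminacy, and the map and numbers are arbitrary choices of mine, but the compounding is the same flavor:

```python
# Toy illustration of how tiny state differences compound.
# (Classical chaos, not quantum mechanics; r = 3.9 and the starting
# values are arbitrary picks that make the divergence visible.)
x = 0.400000000000   # the "actual" state
y = 0.400000000001   # the "predicted" state, off by one part in a trillion

for step in range(1, 61):
    x = 3.9 * x * (1 - x)   # logistic map: a simple deterministic rule
    y = 3.9 * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: |actual - predicted| = {abs(x - y):.2e}")
# By around step 50 the prediction carries no information about the
# actual state, even though every single step was fully deterministic.
```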
I am also a determinist. People have an aversion to it for religious reasons, and they justify it by grasping at whatever scientific theory will back them up. They don’t like knowing they have no control.
On most of this stuff Einstein’s instincts were correct.
The only reason we can’t predict things on a very small scale is a combination of not understanding what’s going on and the observer effect. We change the outcome by recording it in some way we don’t realize.
On a large scale the “randomness” balances out, there is no observer effect, and we can predict exactly how many years it will take a comet to reach us and, if we did our math right, whether it’s gonna run into us or not.
If we knew everything about a region of time and space, even one involving humans, and were able to separate ourselves and not interfere, we could then predict exactly what was going to happen in the future.
I predict that the narcissistic ape called “human” will tend not to like this however and refute it emphatically as a threat to their sense of self and security.
It’s a popular (and occasionally, professional) misconception that quantum mechanics necessarily implies indeterminism. In fact, it hinges on the interpretation: collapse theories, like the Copenhagen interpretation, introduce an aspect of indeterminism; theories in which no collapse occurs, like the various flavors of Everett interpretations, are completely deterministic. Indeed, one could argue that taking the theory seriously, one arrives at a deterministic point of view: the Schrödinger equation, which details the usual linear evolution of the quantum state, is completely deterministic. It is only when you break that evolution, by manually inserting a collapse dynamics into the theory, that indeterminism emerges. So all of modern physics is in fact consistent with determinism.
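For concreteness, this is the evolution in question: given the state at $t = 0$, the Schrödinger equation fixes the state at every later time (for a time-independent Hamiltonian), with no probabilities anywhere in sight:

$$ i\hbar\,\frac{\partial}{\partial t}\,|\psi(t)\rangle = \hat{H}\,|\psi(t)\rangle \quad\Longrightarrow\quad |\psi(t)\rangle = e^{-i\hat{H}t/\hbar}\,|\psi(0)\rangle $$

Probabilities only appear once you bolt on a collapse postulate, i.e. the Born rule applied at a “measurement”.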
Personally, I would argue that determinism is the view that should be preferred from a philosophical point of view. I’ve just written up my views on the interpretation of quantum mechanics (somewhat at length…) here, so I’ll just lazily quote myself:
Another thing to consider that’s often missed in this discussion is that special relativity seems to straightforwardly imply determinism: there’s no notion of simultaneity anymore, which entails the breakdown of an absolute distinction between past and future. This is the grounds on which the Rietdijk-Putnam argument works: what to one observer is in the future, to another, is in the past; but if the past of that second observer is fixed, then so must the future of the first one be.
Bell’s inequality, together with the experiments that test it, basically says that local hidden variables have been ruled out. I’ll explain what this means through a great analogy I read on some physics forums I frequent.
Imagine a machine that spits out two scratch cards at a time. Each card has three areas you can scratch to reveal what’s underneath – the A square, B square, and C square. When you scratch a square, you’ll get either + or - underneath. You can only scratch one square per card.
So the machine spits out two cards. You and a friend take the cards, scratch, compare notes, then repeat the entire process over again.
You notice that whenever you and your friend choose the same box, you ALWAYS get the same results.
Easy, you think. The machine is just printing identical cards out.
IF this were true, what should you expect, statistically, if you choose DIFFERENT boxes? Consider a scenario where both cards are A+, B-, C+. Then:
I choose A, you choose B: + and -, DIFFERENT
I choose A, you choose C: + and +, SAME
I choose B, you choose A: - and +, DIFFERENT
I choose B, you choose C: - and +, DIFFERENT
I choose C, you choose A: + and +, SAME
I choose C, you choose B: + and -, DIFFERENT
So out of the 6 possible combinations, we expect that 2/6 = 1/3 of the time, if we intentionally choose different boxes, our results should match, assuming both cards are identical. (And for cards printed all + or all -, the results match every time, so whatever mix of cards the machine prints, the match rate on different boxes should be at least 1/3.)
Except this isn’t what we see. In reality, the probability winds up being 1/4, or 25%. So, somehow, you and your friend get 100% consistent results when you check the same box on your cards, and yet the proof is undeniable: the machine isn’t spitting out identical cards for you both. Even if you and your friend take your cards and separate by an untold number of light-years, the experiment still holds.
This would be a violation of Bell’s inequality, and it’s what we see when we measure the spin of entangled particles along one of three possible axes.
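If you want to check the arithmetic yourself, here’s a toy Python sketch I put together (my own illustration, not anyone’s published code): it enumerates every possible pre-printed card to show the different-box match rate can never drop below 1/3, then samples from the quantum statistics for entangled spins measured along axes 120° apart, where P(same) = cos²(θ/2) = 1/4.

```python
import itertools, math, random

boxes = "ABC"
pairs = [(x, y) for x in boxes for y in boxes if x != y]   # the 6 different-box choices

# --- Local hidden variables: enumerate every possible pre-printed card ---
print("match rate on DIFFERENT boxes, for each possible card:")
for card in itertools.product("+-", repeat=3):
    values = dict(zip(boxes, card))
    rate = sum(values[x] == values[y] for x, y in pairs) / len(pairs)
    print(f"  {''.join(card)}: {rate:.3f}")
# Mixed cards (e.g. +-+) give exactly 1/3; all-same cards give 1.
# So no pre-printed strategy can match less than 1/3 of the time.

# --- Quantum statistics: entangled spins, axes 120 degrees apart ---
# We sample directly from the known quantum joint distribution,
# P(same result) = cos^2(theta/2), with one side's signs flipped so
# that same-box choices always agree, as in the analogy above.
angle = {"A": 0.0, "B": 2 * math.pi / 3, "C": 4 * math.pi / 3}
rng = random.Random(0)

def trial():
    mine, yours = rng.choice(pairs)                            # deliberately different boxes
    p_same = math.cos((angle[mine] - angle[yours]) / 2) ** 2   # = 1/4 for these axes
    a = rng.choice("+-")
    b = a if rng.random() < p_same else ("-" if a == "+" else "+")
    return a == b

n = 200_000
print(f"\nquantum match rate on different boxes: "
      f"{sum(trial() for _ in range(n)) / n:.3f}  (hidden-variable bound: >= 1/3)")
```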
So, to get back on topic about determinism and randomness: the honest answer is that it’s unclear.
There are a few ways you could interpret these results. None of them give any better measurements or predictions than the others. There’s Copenhagen, MWI, Bohmian Mechanics, etc… or simply “shut up and calculate!”
I like this analogy, but would it be closer to the EPR experiment to say “whenever you and your friend choose the same box, you ALWAYS get opposite results”, then conclude “the machine is printing out opposite cards”?
Either way the analogy IMO is a good way to explain Bell’s theorem to a non-professional.
Yes. I just figured it’d be easier to explain to a layman by talking about the possibility of a machine printing two of the same card (as opposed to two opposite cards). It feels easier to intuit, at least to me. The idea is just to teach the counterintuitive statistical nature of quantum mechanics but on a macro scale that makes sense to people.
In practice, though, yes, it’s all about opposite spins.
Occam’s Razor… Say you’ve got a big pile of Uranium atoms, and you watch them decay. The distribution of decaying atoms certainly seems random. But…
Suppose, deep within each atom, is a mechanism, a little timer, slowly counting down until the atom decays. The problem with this is, to match the observed distribution of decaying atoms over time, this mechanism would have to be incredibly complex! It would have to have tens of thousands of “moving parts,” in order to store the information needed to count down to the instant of decay. Even if it were as simple as a “burning fuse,” it would still have to be able to count time to tiny fractions of microseconds, and thus would have to come in many billions of different “lengths.”
This doesn’t mean it isn’t possible, but, egads, what a grave set of extra complications! The random model is vastly more parsimonious.
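To make the parsimony point concrete, here’s a throwaway sketch (the decay probability and pile size are made-up numbers, not real uranium data): a memoryless rule with a single parameter reproduces the observed exponential curve, no timers required.

```python
import numpy as np

# One-parameter memoryless "random model": at every tick, each
# surviving atom decays with the same fixed probability p.
# (p and the pile size are made-up numbers for illustration.)
rng = np.random.default_rng(0)
p = 0.001
atoms = 100_000

survivors = atoms
for tick in range(1, 5001):
    survivors -= rng.binomial(survivors, p)    # no timers, no memory, no moving parts
    if tick % 1000 == 0:
        predicted = atoms * np.exp(-p * tick)  # exponential decay law N0 * exp(-p*t)
        print(f"tick {tick}: simulated {survivors:6d}, predicted {predicted:8.0f}")
# The simulated counts track the exponential curve to within noise.
# A deterministic fuse would need a separate pre-set countdown per
# atom just to fake this one-number distribution.
```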
Besides, you could construct an experiment where entanglement led to getting the same results, anyway. It’d be slightly more complicated, but only slightly. For instance, you could just have opposite definitions of + and -.