Particle Colliders

After reading a bunch of articles on the LHC, several questions came to mind.

When you smash two particles together at a specified energy, are the results predictable? If you have an energy budget of X eV from the collision, what determines the number and types of particles that are produced?

At the energy levels that the LHC is operating at, how do they prevent the detectors from being destroyed by radiation? Is neutron activation a problem?

From the CERN website:

(my bolding)

Help me here. How can something pick up mass rather than speed? If a particle were in a perfect vacuum, how could it “pick up” mass if nothing is around it? Where would the additional mass come from? And if a particle could indeed reach the speed of light and it picked up mass rather than speed, then is it presumed that no particle of matter can ever surpass the speed of light? If it could, then it could not, right? The faster it went past the speed of light, the more mass it would pick up, not the more speed.

Cartooniverse

  • I am aware these are simple questions about complex problems, but they’re what have me curious.

They are predictable in a probabilistic sense. Different things can happen, but physicists know on average how often each type of thing should happen. They know this both from theory and from having seen it before. In the case of the new CERN collider, they know only from theory, since no experiments have been done before at those energies. And different theories make different predictions. That is part of what they’re testing: which theories may be right (or, more correctly, which are not rejected by the evidence).
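To make “predictable in a probabilistic sense” concrete, here’s a minimal Python sketch. The outcome names and probabilities are made-up toy numbers, not real cross-sections:

```python
import random

# Toy outcome types and probabilities -- illustrative numbers only,
# not real cross-sections.
OUTCOMES = {"elastic scatter": 0.60, "jet production": 0.35, "rare event": 0.05}

def simulate(n_collisions):
    """Count how often each outcome occurs over many collisions."""
    counts = {name: 0 for name in OUTCOMES}
    for _ in range(n_collisions):
        r = random.random()
        cumulative = 0.0
        for name, p in OUTCOMES.items():
            cumulative += p
            if r < cumulative:
                counts[name] += 1
                break
    return counts

print(simulate(100_000))
# Any single collision is unpredictable, but the long-run frequencies
# converge on the probabilities the theory predicts.
```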

The energy levels they are operating at are only very high by particle physics standards. They are very, very low by everyday standards: much less than the energy in an ordinary match, for example. So nothing macro-sized like the detectors is going to be damaged outright, although I’m sure parts must be replaced over time.
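For scale, a quick back-of-the-envelope conversion (14 TeV is the LHC’s design collision energy; the match comparison is from the post above):

```python
EV_TO_JOULES = 1.602e-19  # one electron-volt in joules

collision_energy_ev = 14e12  # 14 TeV, the LHC's design collision energy
print(f"{collision_energy_ev * EV_TO_JOULES:.2e} J per collision")
# ~2.2e-06 J, roughly the kinetic energy of a flying mosquito (CERN's own
# comparison); a burning match releases on the order of a kilojoule,
# hundreds of millions of times more.
```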

The quote is simply wrong. The particle doesn’t reach the speed of light and then switch to picking up mass rather than speed. Not only can no particle go faster than the speed of light, no particle with rest mass can ever reach the speed of light. (Photons, which are the particles of light, and a few other things that have no rest mass, always travel at the speed of light.)

What happens is that as the particle travels faster and faster its mass increases. The mass of a particle with rest mass m moving at velocity v is m/sqrt(1 - v^2/c^2), where c is the speed of light. As v approaches c, the denominator goes to zero and the mass goes to infinity.
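To see how fast that factor blows up, here’s a minimal Python sketch that just tabulates it (the sample velocities are arbitrary):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def mass_factor(v):
    """gamma = 1 / sqrt(1 - v^2/c^2): the factor the rest mass is multiplied by."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

for fraction in (0.5, 0.9, 0.99, 0.999, 0.999999):
    print(f"v = {fraction} c  ->  mass factor = {mass_factor(fraction * C):,.1f}")
# The factor diverges as v approaches c, which is why no particle with
# rest mass can ever reach the speed of light.
```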

The particle doesn’t “pick up mass” from its surroundings, so it doesn’t matter whether it’s in a vacuum. What happens is that the energy used to accelerate it partially makes it go faster and is partially “converted” into mass. Recall that, by Einstein’s equation E = mc^2, energy and mass are equivalent, or rather two aspects of the same thing. As the particle goes faster and faster, it gets more and more massive. That makes it harder to accelerate, so more and more of the energy goes into making it even more massive rather than faster.
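Putting numbers on “more energy goes into mass than speed”: this sketch computes the speed of a proton at a given total energy, using E = gamma*m*c^2 and the proton’s rest energy of about 938 MeV (7000 GeV is the LHC’s design beam energy):

```python
import math

PROTON_REST_ENERGY_GEV = 0.938  # proton rest energy, about 938 MeV

def speed_fraction(total_energy_gev):
    """Given total energy E = gamma*m*c^2, return v/c = sqrt(1 - 1/gamma^2)."""
    gamma = total_energy_gev / PROTON_REST_ENERGY_GEV
    return math.sqrt(1.0 - 1.0 / gamma**2)

for e in (1.0, 10.0, 450.0, 7000.0):  # GeV; 7000 GeV = 7 TeV, LHC design beam energy
    print(f"E = {e:7.1f} GeV  ->  v/c = {speed_fraction(e):.12f}")
# Going from 450 GeV to 7000 GeV multiplies the energy (and the
# relativistic mass) by ~15, but barely nudges the speed closer to c.
```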

Their word choice is questionable. Nothing can be accelerated to the speed of light. However, the closer an object gets to the speed of light, the more difficult it becomes to accelerate it. From the physics formula F = m * a, where F is the force, m is the mass of the object, and a is the acceleration of that object by that force, it would then stand that m, somehow, is getting larger as the velocity of that object gets larger.
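One way to see that numerically is to integrate dp/dt = F with the relativistic momentum p = gamma*m*v and watch the speed flatten out below c. A rough sketch, with unit values for m, F, and c chosen purely for illustration:

```python
import math

def constant_force(m=1.0, force=1.0, c=1.0, dt=0.01, steps=2001):
    """Euler-integrate dp/dt = F using the relativistic momentum p = gamma*m*v."""
    p = 0.0
    for step in range(steps):
        # Invert p = gamma*m*v  ->  v = p / sqrt(m^2 + (p/c)^2)
        v = p / math.sqrt(m**2 + (p / c) ** 2)
        if step % 500 == 0:
            print(f"t = {step * dt:5.1f}   p = {p:5.1f}   v/c = {v / c:.4f}")
        p += force * dt

constant_force()
# The momentum grows without bound, but v levels off below c: the same
# force buys less and less acceleration, which is what F = m*a "sees"
# as a growing m.
```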

Regarding what gets produced…
In the case of electron/positron colliders, all of the energy of the colliding particles goes into the particles produced in the collision. Since you know how much energy you gave the incoming particles, you know how much is available to make stuff, and theory tells you what you can and can’t make. Of course, theory could be wrong, especially as you move into uncharted waters, so folks also look for things that don’t fit expectation.
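As a toy illustration of that bookkeeping: in a symmetric e+e- collider the available energy is 2 × (beam energy), so producing a particle (or pair) is only possible above its rest energy. The masses below are the familiar measured values; the 5 GeV beam energy is just an example:

```python
# Rest energies in GeV (standard measured values)
REST_ENERGY_GEV = {"muon pair": 2 * 0.1057, "tau pair": 2 * 1.777, "Z boson": 91.19}

def can_produce(beam_energy_gev, particle):
    """In a symmetric e+e- collider, the energy available is 2 * E_beam."""
    return 2 * beam_energy_gev >= REST_ENERGY_GEV[particle]

for name in REST_ENERGY_GEV:
    print(f"{name}: possible with 5 GeV beams? {can_produce(5.0, name)}")
# 2 * 5 GeV = 10 GeV: enough to make muon or tau pairs, nowhere near a Z.
```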

With a proton/proton collider, however, it’s a little messier. You give the protons a fixed amount of energy, but that energy is shared among the protons’ constituent parts (namely, quarks and gluons – collectively, “partons”). It’s actually individual partons that collide, and at any given moment there’s no telling how much energy a given parton will have. So, you end up with collisions where some of the energy is available to produce stuff, and the rest of the energy/quarks/gluons are just debris making a mess of things.
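Here’s a toy sketch of that messiness: each colliding parton carries a random fraction of its proton’s energy, so the useful collision energy varies event to event. The 1/x-ish sampling below is a made-up stand-in, not a real parton distribution function:

```python
import math
import random

BEAM_ENERGY_GEV = 7000.0  # per proton, so the protons collide at 14 TeV

def toy_parton_fraction():
    """Sample a momentum fraction x in (0.001, 1], roughly 1/x-distributed.
    Real parton distribution functions are measured; this is a stand-in."""
    return 0.001 ** random.random()

for _ in range(5):
    x1, x2 = toy_parton_fraction(), toy_parton_fraction()
    parton_energy = 2 * BEAM_ENERGY_GEV * math.sqrt(x1 * x2)
    print(f"x1 = {x1:.3f}, x2 = {x2:.3f}  ->  ~{parton_energy:7.1f} GeV available")
# Only a random fraction of the 14 TeV goes into the hard collision;
# the rest of the protons' contents flies on as debris.
```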

Despite the messier outcome with proton/proton collisions, the basic idea holds: theory tells you (via simulations of the collisions) what to expect. You try to rely on the simulation only when necessary, and in many cases you can base expectations firmly on decades of real-life measurements.

Regarding detector damage…
Absolutely, the detectors get pwn3d. A lot of R&D goes into developing “radiation hard” components, but it’s still not a very nice place to be. The worst case scenario is if the beam shoots through a piece of the detector. There are plenty of safeguards in place to keep that from happening, but if it did, it would ruin whatever it hit.

Under normal operation, the more delicate components will eventually need to be replaced. Particularly sensitive are the silicon tracking chips. These are basically electronic chips on circuit boards that can tell you precisely where a particle went through. After a while, the silicon wafer gets too beaten up to work well.

Neutron activation is a problem, although that’s more of an issue for so-called “fixed-target” experiments. A fixed-target experiment is one where a particle beam is directed at, well, a fixed target of something (say, carbon) with a detector downstream to do physics with the produced particles. Such targets can become brittle from radiation damage, and in many cases cannot be approached by people because they are too radioactively hot.

To follow up on the mass sub-thread: The mass only gets larger if you insist on Newtonian expressions. We’ve known for a century that Newtonian physics is only an approximation. When you use the more general description of nature (namely, special relativity), mass is constant. The former variable-mass terminology is long outmoded. It’s to be expected from an article that also says the particles reach the speed of light.
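That constancy is easy to check: with E = gamma*m*c^2 and p = gamma*m*v, the combination sqrt(E^2 - (pc)^2)/c^2 gives back the same rest mass at any speed. A minimal sketch (c = 1 units):

```python
import math

def invariant_mass(m, v, c=1.0):
    """sqrt(E^2 - (p*c)^2) / c^2 for a particle of rest mass m at speed v."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    energy = gamma * m * c**2
    momentum = gamma * m * v
    return math.sqrt(energy**2 - (momentum * c) ** 2) / c**2

for v in (0.0, 0.5, 0.9, 0.999):
    print(f"v = {v:5.3f} c  ->  invariant mass = {invariant_mass(1.0, v):.6f}")
# Prints 1.000000 at every speed: energy and momentum grow, the mass doesn't.
```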

So… <just a thought path my mind has wandered down>

Assuming the Standard Model is correct, a particle with mass interacts with the Higgs field as it goes, giving it what we call mass. That mass at rest is a measure of the interaction of the particle with the Higgs field. Trying to accelerate it to close to c gives the particle more energy, but it traverses more of the Higgs field per unit time, and thus interacts more strongly, and thus gains mass. Eventually the interaction with the Higgs field grows so strong that the effective mass grows towards infinity.

Am I right, or is this too simplistic an approach? <don’t answer that>

Si

To elaborate further: In familiar Newtonian physics, the momentum of a particle is given by P = mv (momentum is mass times velocity), but this is only an approximation. The correct formula is P = m*gamma*v, where gamma = 1/sqrt(1-v[sup]2[/sup]/c[sup]2[/sup]) and m is what’s sometimes called the “rest mass”. In many older physics books, this is interpreted as meaning that the “real mass” is given by the rest mass times gamma, and so the “real” formula for momentum is still P = mv.

You can work with this, but ultimately it’s a lot clearer if you instead say that the gamma is associated with the velocity: P = m*u, where u = gamma*v is what’s called the “proper velocity”. In this interpretation, the mass is absolutely invariant, the same no matter what the speed is. It’s a lot easier to work with, too, since u behaves in much the way you would expect a velocity to: there’s no limit on how large u can get, and momentum keeps its familiar additive form, P[sub]total[/sub] = P[sub]1[/sub] + P[sub]2[/sub].
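A short sketch contrasting the two bookkeepings; same physics, different labels (c = 1 units):

```python
import math

def kinematics(m, v, c=1.0):
    """Return (ordinary velocity, proper velocity u = gamma*v, momentum m*u)."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    u = gamma * v
    return v, u, m * u

for v in (0.1, 0.9, 0.99, 0.9999):
    v_, u, p = kinematics(1.0, v)
    print(f"v = {v_:6.4f} c   u = {u:9.3f}   p = m*u = {p:9.3f}")
# v is capped at c, but the proper velocity u (and with it p = m*u)
# grows without bound, while the mass m never changes.
```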