Question about particle accelerators

Hi, I’ve been earnestly poring over the wiki pages on the LHC and particle accelerators in general, but I have a few questions I couldn’t find answers to.

First, I want to see if I have it straight: At the LHC, in their search for the Higgs boson, they have been aligning two beams precisely to collide with one another. These beams consist of a stream of (usually) protons, emitted one at a time but in incredibly rapid succession; the result is a series of direct proton-to-proton collisions, each of which is photographed individually, in its entirety, separately from the preceding and following collisions.

Each of these photographs (I understand that they may not be photographs in the ordinary sense) is then examined by computer to look for certain predicted anomalies or characteristics that would indicate the presence of what they are looking for.

Grossly oversimplified, but is that basically it?

My questions would then be:

  1. Although each of these collisions is, in theory, a separate thing, if the particles are so close together as to constitute a beam, is there really time for each separate collision to happen and fade away completely before the next collision occurs?

  2. Is there any reason to think that the preceding and subsequent collisions may affect a given collision in some way? In other words, if you took just two protons and collided them, might the resulting photograph look different than the pictures of individual collisions in two beams?

  3. Did they start out just looking at single collisions, or have they always used beams?

I do apologize for any major misconceptions I may have. But I really want to know. Thanks!

Even with a strong beam, most of the protons end up whizzing right past each other, without colliding. The events are discrete enough that there’s almost no chance of interference between them. And if you tried to do just one pair at a time, you’d have to wait a very long time for the results, since they’d almost always miss.

Now, you can do work with individual particles if you’re shooting at a piece of metal, or the like, instead of just another particle: It’s a much larger target and so easier to hit.

Ignorance fought! Thank you, Chronos.

Your description of the beam was close, but not quite right. Imagine that the beam is made up of a bunch of buckets, which are spaced equally apart. In your description, each bucket holds one proton, so there’s a constant spacing between protons. How it actually works is that each bucket holds many protons, so what you have is regions of very high proton density, followed by open space and so on.

At design conditions for the LHC, each bucket holds roughly 10^11 protons (called a “bunch”) and the buckets are spaced ~8 m apart. Currently only every 3rd or so bucket is filled, although the filled buckets do have the full number of protons per bunch. This means that when people talk about a “collision,” what they’re really talking about is two bunches passing through each other, which can result in any number of collisions. Currently, there are ~10 “interactions” per bunch crossing, where an interaction is a collision between two protons that’s strong enough to break them apart.
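A quick back-of-envelope check (my arithmetic, not from the thread) shows how the ~8 m bucket spacing and the filled-bucket fraction fit together, using the nominal 25 ns bucket-to-bucket interval:

```python
# Back-of-envelope arithmetic for LHC bucket spacing (illustrative values).
# Assumes the nominal 25 ns interval between adjacent buckets; the ~8 m
# figure quoted above is just c * 25 ns, rounded up.

C_LIGHT = 2.998e8          # speed of light, m/s (the protons move at ~c)
BUCKET_SPACING_NS = 25.0   # nominal time between adjacent buckets

spacing_m = C_LIGHT * BUCKET_SPACING_NS * 1e-9
print(f"bucket spacing ~ {spacing_m:.1f} m")   # ~7.5 m, i.e. the "~8 m" above

# With only every 3rd bucket filled, crossings are 3x farther apart in time:
crossing_interval_ns = 3 * BUCKET_SPACING_NS
print(f"time between bunch crossings ~ {crossing_interval_ns:.0f} ns")
```

The 75 ns crossing interval this gives is the same figure quoted later in the thread for the detector electronics problem.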

From the physics perspective, each individual proton-proton collision is completely separate; the collisions don’t really influence each other at all. In fact, these events are simulated by literally simulating a bunch of collisions independently and stacking them on top of each other.
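As a toy illustration of that “stacking” idea (everything here is made up for the sketch; a real event generator like Pythia does the actual physics), you draw a Poisson-distributed number of independent collisions and overlay their products:

```python
# Toy sketch of "stacking" independent collisions into one bunch crossing.
# simulate_collision() is a stand-in for a real event generator; it just
# returns a fake list of outgoing-particle energies.
import math
import random

def poisson_sample(mu, rng):
    # Knuth's algorithm for sampling a Poisson count (fine for modest mu).
    L = math.exp(-mu)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_collision(rng):
    # Stand-in: a random number of "particles" with random energies (GeV).
    return [rng.uniform(1.0, 100.0) for _ in range(rng.randint(2, 10))]

def simulate_bunch_crossing(mean_interactions, rng):
    # The number of hard interactions per crossing is Poisson-distributed;
    # each one is simulated independently and the products are overlaid.
    n = poisson_sample(mean_interactions, rng)
    event = []
    for _ in range(n):
        event.extend(simulate_collision(rng))  # "stack" them together
    return n, event

rng = random.Random(42)
n, event = simulate_bunch_crossing(10.0, rng)
print(f"{n} interactions, {len(event)} particles overlaid")
```

The key point the sketch captures is that nothing in one `simulate_collision` call depends on any other: independence is baked into the simulation.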

Where it gets interesting, though, is from the detector angle. Collisions from a single bunch crossing are very difficult to separate from each other: sometimes you can tell because they occur at physically different points in space, but a lot of the time you just have to accept uncertainty in your measurement. Even worse, the spacing in time between two bunch crossings at the LHC is ~75 ns, and often the electronics can’t respond that quickly. In terms of your photograph analogy, it’s like the “fuzz” of the previous image is still there when you take the next one. Needless to say, this also makes things more difficult. These two issues are called in-time and out-of-time pileup, respectively.

I think I answered this above, but the answer is that it doesn’t affect the physics, but does very much affect the experimental detection of what happened.

A lot of particle physics is still done using particles that are incoming from space (either from the Sun or further out in the galaxy), which can have very different rates. Accelerator physics is only one part of particle physics.

For a graphical view, see this image, which shows a single bunch crossing in which 13 proton-proton collisions have been identified. The particles produced in each independent collision trace back to a single point, so you can pick them apart. The imaginary line passing through the 13 vertices is the axis along which the protons are traveling.

Huh, I would have thought they’d keep the bunches small enough that they’d only have approximately one event per bunch crossing.

You are limited by the RF structure of the accelerator. Protons can only orbit stably at one phase of the RF system, and there are only so many RF cycles in the ring’s circumference. The LHC RF system runs at 400 MHz, which translates into about 36,000 proton buckets around the ring. But you can’t fill all of them, because you need gaps for magnet switching (which is slow and which can’t happen when protons are passing through, lest you irradiate everything nearby). So the LHC has 2,808 (eventually) usable buckets. The only way to add to the data rate is to add protons to these buckets, up to the point where you can’t pick the collisions apart anymore.
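For the curious, the “about 36,000 buckets” figure falls straight out of the numbers (my arithmetic, using the published ~26.7 km ring circumference; the 400 MHz is from the post above):

```python
# Where "~36,000 buckets" comes from: one bucket per RF cycle per orbit.
C_LIGHT = 2.998e8       # m/s (protons are ultra-relativistic)
F_RF = 400.8e6          # LHC RF frequency, Hz (~400 MHz as stated above)
CIRCUMFERENCE = 26659.0 # LHC ring circumference, m (published value)

revolution_time = CIRCUMFERENCE / C_LIGHT   # ~89 microseconds per orbit
n_buckets = F_RF * revolution_time          # RF cycles per revolution
print(f"{n_buckets:.0f} RF buckets")        # ~35,640, i.e. "about 36,000"
```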

I was assuming just one interaction at a time as well, so Krinthis, thanks for the additional detail. From the exclamation mark in Pasta’s link, it sounds like 13 is about the upper limit they can resolve.

This event pile-up will increase further, by another factor of two or so perhaps.

Note that the Poisson random nature of these interactions means you either have to deal with pile-up, or you have to be annoyingly inefficient. That is, if you want to get two events at the same time only very rarely (say, 1% of the time), then you inescapably only get one event pretty rarely, too (about 13% of the time in this example). Looking at it the other way: say you don’t want to waste more than 15% of bunch crossings. That condition implies a rate of 57% for getting two or more interactions in a crossing.
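Those two trade-off figures can be reproduced directly from the Poisson distribution; here is the arithmetic spelled out:

```python
import math

def poisson_pmf(k, mu):
    # P(exactly k interactions) for a Poisson-distributed count with mean mu.
    return math.exp(-mu) * mu**k / math.factorial(k)

# Case 1: demand P(>=2 interactions per crossing) = 1%. The mean must then
# be ~0.15, so P(exactly 1) is only ~13% and ~86% of crossings are empty.
mu1 = 0.148
p2_plus_a = 1 - poisson_pmf(0, mu1) - poisson_pmf(1, mu1)  # ~0.01
p_one = poisson_pmf(1, mu1)                                # ~0.13

# Case 2: allow at most 15% empty crossings, i.e. P(0) = 0.15, which fixes
# the mean at -ln(0.15) ~ 1.90. Then P(>=2) comes out to ~57%.
mu2 = -math.log(0.15)
p2_plus_b = 1 - poisson_pmf(0, mu2) - poisson_pmf(1, mu2)  # ~0.57

print(f"P(>=2)={p2_plus_a:.3f}, P(1)={p_one:.3f}, P(>=2|15% waste)={p2_plus_b:.2f}")
```

The second case is exactly the 57% figure quoted above; the only assumption is that interactions per crossing really are Poisson-distributed, which is what independence between collisions buys you.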