Possible for building materials in a room where X-rays were taken to become permanently irradiated?

Just curious.

In a doctor’s office where lots of CT scans and X-rays were taken, is it possible for the concrete floor and surrounding building materials to become permanently irradiated and potentially pose an environmental radiation hazard a year or so after the office shut down and all the equipment was removed?

I’ve never heard of this as a possible hazard, and I’ve been leasing commercial and medical space for 25 years. A potential leasing client has expressed some concerns about this possibility.

It’s hard to make materials radioactive by exposure to radiation. In your case the concern would be more toward spilled radioactive waste than building materials. If your client is concerned, a Geiger counter sweep should convince them the space is safe.

No. It is not possible.

X-rays are a completely different kind of ‘radiation’ from the kind that (can, sometimes) cause other material to become radioactive.

Well, it IS possible for induced radioactivity to occur via photodisintegration, but this requires high-energy gamma rays. Gamma rays are the same “type” of radiation as X-rays (and are in fact the same as ordinary light), just higher energy. But the minimum energy required for photodisintegration seems to be around 10 MeV (10,000 keV). Ordinary diagnostic X-rays are typically 20-150 keV, and conventional (orthovoltage) therapeutic X-rays used to treat cancer are less than 600 keV. Higher-energy X-rays are occasionally used medically, but producing them requires a linear accelerator, not something you’d find in a typical doctor’s office, or even a typical hospital.
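Just to put those numbers side by side, here’s a back-of-the-envelope comparison in Python, using the ballpark figures quoted in this thread (these are rough round numbers, not precise nuclear data):

```python
# Ballpark photon energies from the post, all in keV.
THRESHOLD_KEV = 10_000  # approximate minimum for photodisintegration (~10 MeV)

sources_kev = {
    "diagnostic X-ray (typical max)": 150,
    "orthovoltage therapy (approx max)": 600,
}

# How far below the photodisintegration threshold each source falls.
for name, energy in sources_kev.items():
    ratio = THRESHOLD_KEV / energy
    print(f"{name}: {energy} keV, about {ratio:.0f}x below the threshold")
```

Even the highest-energy beams in an ordinary office fall short of the quoted threshold by more than an order of magnitude, which is the whole point: there’s simply no mechanism there to activate the concrete.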

–Mark

I would argue that this depends on how you define types; you could certainly divide all radiation into broad types where this is true: EM radiation, particle radiation, gravitational radiation, and so on. But it’s at least as valid to divide the spectrum of EM radiation into different types based on its energy level and effect, as we do with radio, microwave, infrared, visible, ultraviolet, X-ray, gamma, and so on.

It’s certainly possible for the building to be rendered radioactive. It could happen, for instance, if a neutron bomb is detonated nearby. It just won’t happen from the X-rays.

I think it would be simple to rent or borrow a radiation detector and demonstrate that there is no more than normal background radiation in the room. It would be a lot more convincing than saying “these scientists on the internet assured me it’s safe.”

Though I do agree that there’s no way an X-ray generator could leave residual radiation in a facility.

Well, I also called up the chairman of my state environmental radiation hazard commission, and after he finished sighing loudly several times as I relayed the question, he simply said, “No, that’s not the way it works. That’s impossible. It’s not going to happen; a medical X-ray is not going to irradiate concrete.”

Now that would bring down the rental rate.

But reduce surcharges on X-rays for the visit to the dentist.

Doesn’t seem like the kind of client who would trust a Geiger counter, anyway… people who distrust science don’t tend to believe the devices based on science, either (except when the results match their preconceived fear).

“Normal background radiation” could very well be interpreted to mean “NO I DON’T WANT ANY RADIATION IN MY BUILDING!”.

This is just the first issue… is that client going to blame your voodoo for every random problem in the building?

A note, by the way: “natural background radiation” will depend on what the building is made of. Cinder blocks are more radioactive than most natural mineral building materials, which are in turn more radioactive than wood (though even cinder blocks present only a minimal danger).

X-rays and gamma rays are not necessarily different on the basis of energy or wavelength. The classical definition is that X-rays come from an X-ray tube and gammas come from nuclear decay, even if the two are of the same energy. And though most X-ray tube emissions are at lower photon energies than most nuclear decay emissions, the two ranges overlap. If you actually met an individual photon of around 15 keV to 150 keV, there’s no way you could tell whether it was an X-ray or a gamma photon, as there are plenty of examples of both in that range. You could take it apart and stain its innards, you could use a stethoscope, you could dip it in liquid nitrogen and see if it glowed. Still, there’s no way to tell based on the photon itself. Unless the photon could remember its birth, it wouldn’t know itself. You’d have to learn about its history to find out which it was.

But this definition isn’t used in practice, either. Choice of term correlates with field of study, photon source, photon energy, age of speaker, and several other bits of context. But the long and short of it is that there is no strict definition to distinguish the two, particularly in the overlapping energy range.

Anyone who actually uses either term regularly has some definition to distinguish between them, usually based on an energy cutoff. It’s just that different people (especially working in different fields) use different cutoffs.

The definition based on origin is certainly in use by some, but it fails not only because you don’t always know the source, but also because sometimes the source is neither an X-ray tube nor nuclear decay.

That classical definition breaks down when the source is something else, such as a supernova, a quasar, or just extremely hot interstellar gas. Because of that, the usual astronomical practice is to label the shorter-wavelength stuff gamma rays and the longer-wavelength stuff X-rays. But unlike the rest of the EM spectrum, there isn’t an official or even widely accepted boundary between the two.

I’d argue (although we’re moving out of GQ here) that even that’s too strong a statement. When does a fiddle become a violin? There are contexts when I would choose one word and not the other, but I’ll be damned if I could actually write down my internal linguistic rules.

Same with x-ray vs. gamma ray. It’s certainly not a hard energy boundary. There isn’t any risk with sloshing the terms around over an order-of-magnitude or two, so the language as used in practice allows for context-dependent flexibility. It’s just easier that way.

In my experience, all of the astronomers, at least, had an energy cutoff they used (a definition based on origin is mostly useless for astronomers, who often don’t know), but I think the solar folks used a different cutoff than the deep-space astronomers.

One will sometimes see an instrument that detects a range that spans that cutoff, but in that case they’ll usually just describe it as a vague “high-energy”, or refer specifically to “the ____ band”, where the blank is the name of that instrument.

I’d just be worried about the neighborhood.