Does running electricity through a radioactive metal like uranium or plutonium cause anything interesting to happen? These metals do conduct electricity, though they're far from the best conductors, it seems. Would anything neat/strange/unique/interesting happen from running electrical current through a wire made from a radioactive metal? I guess the most logical thing to expect would be that it might change the decay rate. Would it?
How much electricity?
Would it matter? Would any amount do anything interesting vis-à-vis the radioactive conductor? Let's start with the same current that comes out of a normal household plug and go up from there.
No, electricity has no effect at all.
Temperature doesn’t affect spontaneous fission, whether in a lump of yellowcake or in magma.
Because of Doppler broadening, a nuclear power station has to account for the variation in power generated as the fuel temperature changes, but operators are very much more worried about the temperature of the moderator…
Pressure doesn’t directly affect fission, all other things being equal.
If you apply enough energy (enough to totally ionise a radioactive atom), the half-life of some isotopes may decrease. This happens because dropping into the vacant lowest orbital can be energetically favoured over emitting the beta electron as a free particle: a neutron converts into a proton and the emitted electron goes straight into that first empty slot (bound-state beta decay).
But this is a really special and totally artificially created situation.
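For scale, here's a minimal back-of-the-envelope sketch of how dramatic this can be, assuming the often-quoted figures for ¹⁸⁷Re (roughly a 42-billion-year half-life when neutral versus about 33 years when fully stripped, from storage-ring experiments; both numbers are assumptions for illustration):

```python
import math

# Illustrative figures for Re-187, the classic bound-state beta decay
# example (assumed, not measured here):
#   neutral Re-187 half-life   ~ 4.2e10 years
#   bare Re-187(75+) half-life ~ 33 years
t_half_neutral = 4.2e10   # years (assumed)
t_half_bare = 33.0        # years (assumed)

# Decay constant: lambda = ln(2) / t_half
lam_neutral = math.log(2) / t_half_neutral
lam_bare = math.log(2) / t_half_bare

# Ratio of decay rates: how much faster the bare ion decays
speedup = lam_bare / lam_neutral
print(f"Full ionisation speeds up the decay by a factor of ~{speedup:.1e}")
```

So the effect on that one isotope is enormous, but only under those completely artificial conditions.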
si_blakely are you saying enough electrical current would affect the amount of radiation given off from the radioactive material?
We are talking about total ionisation - vaporisation to a gas and then stripping off all the electrons. This takes a massive amount of energy and a strong vacuum. Not going to happen in any reasonable situation.
Because the decay products (beta particles) get captured into the first orbital, the radiation output will actually decrease, but the rate of transmutation will increase.
si_blakely is (I think) talking about something quite different from running current through the metal. Just running a current through, no matter how large, is not going to have any effect on the radioactive decay of the material. Electric current flows in the outer electron shells of an atom; radioactivity happens in the nucleus. The two are well isolated from one another, especially since most radioactive metals have high atomic numbers, so several inner, non-conducting electron shells separate the outer valence shells where current flows from the nucleus itself. (Even in low-atomic-number elements, where the valence shells are the only shells, the current would still not affect radioactive decay.)
I guess the current will produce an electromagnetic field around the metal. If the radioactivity is of a type that produces charged particles (alpha particles, which carry a 2+ charge, or beta particles, which are actually electrons and carry a negative charge), that field should affect the direction in which the particles travel, but it will not affect how fast they are produced. Gamma radiation, which consists of uncharged photons, should not be affected at all. (I believe all natural radioactivity produces alpha, beta, or gamma rays, but some artificially created radioactive materials emit positrons, which are light like electrons but carry a 1+ charge.)
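To put rough numbers on "affect the direction": a quick sketch under assumed household-scale conditions (10 A in the wire, a 5 MeV alpha particle passing 1 cm away; all values are illustrative) shows the bending is tiny:

```python
import math

MU0 = 4 * math.pi * 1e-7      # vacuum permeability, T*m/A
E_CHARGE = 1.602e-19          # elementary charge, C
M_ALPHA = 6.644e-27           # alpha particle mass, kg

# Assumed scenario: 10 A in the wire, alpha observed 1 cm away
current = 10.0                # A
distance = 0.01               # m

# Field of a long straight wire: B = mu0 * I / (2 * pi * r)
b_field = MU0 * current / (2 * math.pi * distance)

# Speed of a 5 MeV alpha (non-relativistic is fine at this energy)
ke_joules = 5e6 * E_CHARGE
speed = math.sqrt(2 * ke_joules / M_ALPHA)

# Gyroradius r = m*v / (q*B); a huge radius means almost no bending
gyroradius = M_ALPHA * speed / (2 * E_CHARGE * b_field)
print(f"B near the wire: {b_field:.1e} T, gyroradius: {gyroradius:.0f} m")
```

The alpha's turning radius comes out on the order of a kilometre, so over the few centimetres it travels in air the deflection is essentially unmeasurable.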
In principle, the field will make a difference, but in practice the nuclear forces involved are so much stronger than the electromagnetic forces that the difference will be utterly negligible.