What is the original definition of a Coulomb?

I found this: physics - Origin of the coulomb and ampere - History of Science and Mathematics Stack Exchange
"
my question: why is the coulomb (and ampere) the size it is today, which is an outrageous size that makes it impractical for direct use?

The coulomb has that value because in the mid 19th century electrical engineers needed practical units for submarine cables and telegraphy.

In 1861 a committee of the British Association for the Advancement of Science was appointed to propose a system of units that included electrical and mechanical units. The committee defined two coherent systems for scientists, called electromagnetic and electrostatic units, and added several “practical” units that were decimally compatible with the electromagnetic units, for the electrical engineers involved in telegraphy and submarine cables.

In 1861 the electrical engineer Latimer Clark suggested to the committee that practical voltages were in the range of 1 to 10^6 volt, resistances of conductors and insulators were in the range of 1 to 10^8 ohm, and the smallest current was about 10^-3 ampere (translated to modern units). He proposed names like volt and ohm for these practical units, and the prefix mega for 10^6. The unit of charge was unimportant for submarine cables and telegraphy, so nobody cared about its practical value, it merely had to be coherent with the unit of current.

In 1881 the International Metre Convention adopted the practical units and their names: volt, ohm, and ampere. These practical units are the ones that scientists and engineers are using today."

This person seems to be saying that the definitions of a volt and an ohm were decided first, and the coulomb was made to fit. So a coulomb is the number of electrons such that 1 J applied to them fits the kind of numbers these people were looking for?

However, we also see this: “The ampere was originally defined as one tenth of the unit of electric current in the centimetre–gram–second system of units. That unit, now known as the abampere, was defined as the amount of current that generates a force of two dynes per centimetre of length between two wires one centimetre apart. The size of the unit was chosen so that the units derived from it in the MKSA system would be conveniently sized.”

This paragraph is too much for me, and assumes quite a lot on the part of the reader. Can someone tease this apart? Which one is the original basis for C? Thanks.

OK, when two wires both have a current flowing through them, there will be a magnetic force between the wires. That force depends on things we already had units for, like the length of the wires and the distance between them, and of course force is also something we already had units for. But it also depends on the amount of current in the wires. So the force between two wires can be used to establish a standard for current. You take two wires of known length, put them parallel to each other a known distance apart, set things up to measure the force between them, and then adjust the current until you get the right amount of force, and when you do that, the current you have through the wires is what you use as your current unit.
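
In modern notation (a standard result, not something the committee wrote down in this form), the force per unit length between two long parallel wires carrying currents $I_1$ and $I_2$ a distance $d$ apart is

$$\frac{F}{L} = \frac{\mu_0 I_1 I_2}{2\pi d}$$

Length, separation, and force were all already measurable in mechanical units, so this one relation is enough to pin down a unit of current.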

And yes, some of the electrical units end up being inconveniently sized. That was inevitable: Once you have the meter, kilogram, and second (and hence the newton and joule and watt and so on), if you want to keep the electrical units coherent, then you only have one degree of freedom for the electrical units. As it is, the coulomb is impractically large, but the volt is a reasonable size. If the units were redefined so that the unit of charge were of a reasonable size, then the unit of voltage would be impractically large instead. And voltages are used a lot more than charges are.
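
To make the trade-off concrete, coherence chains the electrical units to the mechanical ones:

$$1~\text{C} = 1~\text{A}\cdot\text{s}, \qquad 1~\text{V} = 1~\text{J}/\text{C}$$

so a coulomb redefined to be, say, a million times smaller would automatically make the volt a million times larger.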

I’m not sure a coulomb is all that unreasonable a size. 1 amp x 1 second is not a huge amount of electric charge at macro scales.

It’s only a silly number of electrons because electrons are silly-tiny things compared to human macro scales. In other news, the meter is a really large number of hydrogen atom diameters. So?

I spent more time than I’m used to needing in Google searching, and I can’t find the parameters of the original definition. I assume it was something like, “2 wires 1 meter apart, carrying identical currents, such that they attract each other with 5 N of force.” Do you know what it was?

And, relatedly, a tesla must have represented an unimaginably large magnet at the time it was defined.

Then you get Faraday’s constant, which converts between moles and coulombs. If you are an electrochemist, that is a useful thing to do.

The Ampere was previously defined as the current required to deposit 0.00111800 grams of silver from a solution of silver nitrate in one second. Which also therefore defined the Coulomb.
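
Those two facts hang together nicely, and it’s a quick check (my own arithmetic, using standard constant values): silver deposits one atom per electron, so 0.00111800 g of silver per second should correspond to almost exactly one coulomb per second:

```python
# Sanity check: does the old silver-deposition ampere match Faraday's constant?
# Electrolysis of silver nitrate deposits one Ag atom per electron (Ag+ + e- -> Ag).
M_AG = 107.8682      # molar mass of silver, g/mol
FARADAY = 96485.3    # Faraday's constant, C per mole of electrons

mass_per_second = 0.00111800                      # g/s, the old definition
moles_per_second = mass_per_second / M_AG         # mol of Ag (= mol of electrons) per second
charge_per_second = moles_per_second * FARADAY    # C/s, i.e. amperes

print(f"{charge_per_second:.5f} A")  # ~1.00002 A
```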

A blog post from Derek Lowe last year on photochemistry made the lovely point that the quantum yield for a reaction comes down to knowing how many moles of photons you need per mole of product.

From time to time I think about a new system of measures, and consider what stable natural units might be best to base it on. Of course it is a well and often-run thought experiment. I usually tend toward basing it on a mass, the proton or neutron, then various extrapolations of gravity and velocity to get the other units.

Is there an alternate set of units that has been derived that looks more intuitive and is more easily interrelated? Many of the units and scales were derived from observations at a time of more limited knowledge, or created without concern for how they integrate and interact overall.

There are a lot of informed folks here. Would a universal units of measurement thread be fun? Is there one?

Well, you get the Planck units. Not all of these are useful, but that just seems to be the nature of the game. In some ways this probably relates to the Hierarchy Problem.
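
For anyone who wants to see them, here’s a quick sketch (standard textbook definitions, approximate CODATA constant values) of the three base mechanical Planck units. Note how the mass is almost human-scale while the length and time are absurdly small:

```python
import math

# Fundamental constants in SI units (approximate CODATA values)
c    = 2.99792458e8      # speed of light, m/s
G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.054571817e-34   # reduced Planck constant, J*s

planck_length = math.sqrt(hbar * G / c**3)  # ~1.6e-35 m
planck_mass   = math.sqrt(hbar * c / G)     # ~2.2e-8 kg (about 22 micrograms)
planck_time   = math.sqrt(hbar * G / c**5)  # ~5.4e-44 s

print(planck_length, planck_mass, planck_time)
```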

Yes, it used to be that 1 A = the current such that running it through two infinitely long, infinitely thin straight parallel conductors 1 metre apart produces a force of 2 x 10^-7 N per metre of length
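
Plugging that back into the parallel-wire formula quoted upthread shows the definition and the pre-2019 value mu0 = 4 pi x 10^-7 N/A^2 are two sides of the same coin (a check of my own, not from the thread):

```python
import math

mu0 = 4 * math.pi * 1e-7   # magnetic constant, N/A^2 (exact by definition before 2019)
I1 = I2 = 1.0              # one ampere in each wire
d = 1.0                    # wires 1 metre apart

force_per_metre = mu0 * I1 * I2 / (2 * math.pi * d)
print(force_per_metre)     # ~2e-07 N per metre of length, by construction
```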

THANK YOU.
Do you know why that amount of force was chosen? Was it just so that the coulomb would make the volt a nice number for normal use?

Was this before or after the definition mentioned by DPRK?

And the Farad.
I remember hearing about how a Farad-sized capacitor would be as big as a milk bottle when I was in high school. Now you can get 10F capacitors that are the size of a AAA cell.

That is true, but think about it this way: a 10 F, 2.5 V capacitor can store 31 J. A 100 uF capacitor charged to 5000 V holds 1250 J of electrical energy.
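
Both numbers fall out of the standard stored-energy formula E = (1/2) C V^2:

```python
def capacitor_energy(capacitance_farads, voltage_volts):
    """Energy stored in a capacitor, E = (1/2) * C * V^2, in joules."""
    return 0.5 * capacitance_farads * voltage_volts**2

print(capacitor_energy(10, 2.5))       # 31.25 J  (the 10 F, 2.5 V supercap)
print(capacitor_energy(100e-6, 5000))  # 1250.0 J (the 100 uF, 5000 V cap)
```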

By that point there had long been standard realizations of the ampere, ohm, and volt, so I do not imagine anyone was going to make any radical changes to those units. Note that these units had already been defined in order to have practical magnitudes (vs what might be called “abvolts”, “abohms”, etc.). Furthermore, 1 coulomb will then be the charge delivered by 1 ampere in one second. In short, the volts and ohms and amperes came before, and the coulomb was derived from that.

When I was a freshman (1985-1986), our lab equipment was set up for 5 volts. Our capacitors were coin-sized disks, rated in microfarads and picofarads.

One day we came in, and found an object about the size of a beverage can, lying on a workbench. It was a capacitor. I think it was either 0.9 or 1.1 farads.

Naturally, we put 5 volts across it.

No one was willing to lick the terminals. (We may have been freshmen, but we weren’t that dumb.) We finally used a piece of steel plate to short the terminals. The spark melted a notch in the steel. We went home with silly grins on our faces.

Of course, you now know that licking the terminals would only cause a slight tingle, right?
Less than a 9v battery.

It ain’t the volts. It’s the amps. As I said, the spark cut a notch into a steel plate.

No, sorry, you are very wrong.

Remember: V=IR

a 1 farad cap charged to 5 volts, “instantly” discharged, would produce about 50 amps of current assuming very small resistance (say 0.1 ohms). That’s not a whole lot. You aren’t going to get an arc, let alone physically damage a steel plate.
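
Putting numbers on that (same assumed 0.1 ohm short):

```python
C = 1.0   # farads
V = 5.0   # initial volts on the cap
R = 0.1   # ohms, assumed resistance of the short

peak_current  = V / R            # Ohm's law: 50 A at the instant of contact
stored_energy = 0.5 * C * V**2   # 12.5 J total in the cap
time_constant = R * C            # 0.1 s: current decays on this RC time scale

print(peak_current, stored_energy, time_constant)
```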

I did something similar in college. Whenever some major piece of equipment was decommissioned, they would throw all of the circuit boards into boxes on one side of the engineering lab. We were free to do whatever we wanted with the parts.

After they disassembled an old Amdahl mainframe, one of the parts that ended up in the bins was a capacitor that was about the size of a soda can. Out of curiosity, I charged it up with a bench power supply and then shorted the terminals with an old screwdriver. I expected some sparks. What I didn’t expect was that it blew the corner of the tip off of the screwdriver and arc welded the screwdriver to the terminals.

I don’t remember what voltage I charged it up to, but it was definitely more than 5 volts.

It’s the volts that jolts but the mills that kills. :slight_smile:

While there is some truth to that old saying, voltage and current aren’t completely separate things. You cut a notch in your steel (and I blew off the tip of my screwdriver) because steel has a very low impedance. Your tongue, on the other hand, is going to have a much higher impedance, a lot lower than dry skin, but a lot higher than steel.

The RC time constant for a capacitor discharging through a 0.1 ohm piece of steel is going to be 10,000 times shorter than the time constant for the same capacitor discharging through a 1k ohm resistor (a very rough value for your tongue). Much less current will flow through the 1k resistor, but it will flow for a much longer time until the capacitor completely discharges.
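
Using the thread's rough numbers (0.1 ohm for the steel, 1k ohm for a tongue), the contrast is stark:

```python
C = 1.0   # farads
V = 5.0   # initial volts

for label, R in [("steel plate", 0.1), ("tongue", 1000.0)]:
    peak_current = V / R   # initial current from Ohm's law, amps
    tau = R * C            # RC time constant, seconds
    print(f"{label}: peak {peak_current} A, time constant {tau} s")

# steel plate: peak 50.0 A, time constant 0.1 s
# tongue:      peak 0.005 A, time constant 1000.0 s
```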

As well, the near-instantaneous local heating is what leads to welding and arcing, rather than just the metal warming up a bit. The same total energy is deposited into a small area faster than the heat can conduct away, so the local temp skyrockets instead of the bulk hunk of metal warming slightly.