The way I’ve always heard the saying is “it’s the volts that jolts but the mills that kills” (mills in this case referring to milliamps).
There is a point to be made here, but it gets a little confusing because voltage and current are related to each other. It gets even more confusing because the human body is not a simple resistor and does in fact have a very complex response to electricity.
As Chronos said earlier, electricity kills you in basically one of two ways. The first is that the current produces heat as it passes through your body and you basically get cooked to death. Here it is the total wattage (voltage times current) that matters. This is how lightning bolts and accidental contact with power lines kill you. It is also how the electric chair kills you.
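To get a feel for the scale involved, here's a quick back-of-the-envelope sketch in Python. The contact voltage, current, and duration below are made-up numbers purely for illustration, not measurements from any real incident:

```python
# Joule heating sketch: power = volts * amps, energy = power * seconds.
# All of the contact numbers here are illustrative assumptions only.

def heat_joules(volts, amps, seconds):
    """Energy dumped into the body as heat, in joules."""
    return volts * amps * seconds

# A hypothetical brush with a 7,200 V distribution line at 1 A for 2 seconds:
print(heat_joules(7_200, 1, 2))   # 14400 J of heat dumped into tissue

# Versus a 120 V household shock at 0.1 A for the same 2 seconds:
print(heat_joules(120, 0.1, 2))   # 24 J -- here the danger is the heart, not the heat
```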
Generally speaking, the higher the voltage, the more current you will get to flow, so the higher the voltage, the more dangerous it is. There are other things besides the voltage that affect how much current flows, so it is entirely possible for someone to get struck by a lightning bolt (several million volts) and live, while someone else gets killed by 120 volts from a defective lamp. To make it even more complicated, your body's effective "resistance" to electricity varies quite a bit depending on the voltage being applied. At a low voltage, your body may seem to have millions of ohms of resistance, but at higher voltages it drops to maybe a thousand ohms. Generally speaking, you can safely touch voltages under about 50 volts, and anything over 50 volts can much more easily kill you. This is why you can grab both terminals of a 12 volt car battery and not get shocked, while touching both conductors in a power outlet is a much less pleasant experience.
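To put some rough numbers on that, here's a quick Ohm's law sketch. The body resistance figures are just illustrative assumptions in the spirit of the paragraph above; real skin resistance varies wildly with moisture, contact area, and voltage:

```python
# Ohm's law sketch: I = V / R, reported in milliamps.
# Body resistance values are illustrative assumptions only.

def current_ma(volts, body_ohms):
    """Current through the body in milliamps for a given voltage and resistance."""
    return volts / body_ohms * 1000

# Car battery: 12 V across dry skin (assume ~100,000 ohms)
print(current_ma(12, 100_000))   # ~0.12 mA -- you don't even feel it

# Wall outlet: 120 V once the skin barrier breaks down (assume ~1,000 ohms)
print(current_ma(120, 1_000))    # ~120 mA -- well into the danger zone
```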
The second way that electricity can kill you is that it can screw up your heartbeat. This is related to the amount of current that you can get to flow through your chest (it's the mills that kills). You can be killed by a surprisingly small amount of current. Most safety standards these days are built around anything under 5 mA being "safe", since a current of only 5 mA is not at all likely to interfere with your heart's rhythm. Once you get up around 50 mA, though, that's when you are almost guaranteed to run into trouble. To put the numbers in perspective, 5 mA is 0.005 amps, and your house breaker won't trip until you exceed 15 amps. So you can very easily be killed without ever tripping your breaker (which is why GFCIs, which trip at around 5 mA of ground leakage, were invented; AFCIs are aimed more at arc faults and fires).
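If you want to see just how mismatched those numbers are, here's a tiny calculation using the rough rules of thumb above (these are ballpark figures from the discussion, not an actual safety standard):

```python
# Rough rules of thumb, for illustration only -- not a formal safety standard.
lethal_ma = 50               # roughly where fibrillation becomes very likely
gfci_trip_ma = 5             # roughly where a GFCI is designed to trip
breaker_trip_ma = 15 * 1000  # a standard 15 A household breaker

print(breaker_trip_ma / lethal_ma)  # 300.0 -- a lethal current is ~1/300 of what trips the breaker
print(lethal_ma / gfci_trip_ma)     # 10.0  -- the GFCI trips with roughly a 10x margin below that
```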
Your heart is kind of a weird thing. It is much more immune to getting its rhythm thrown out of whack at certain points in its heartbeat cycle than at others, so getting your heartbeat screwed up is very much hit and miss (unlike being cooked to death by high voltage/current, which is much more predictably fatal). Hit your heart with a shock at some times, and nothing happens. Hit it at just the right time, though, and your heart goes into fibrillation. Here is where your heart gets really weird: fibrillation is a stable state. Instead of beating, the heart just sits there and quivers, and unless someone happens to be standing next to you with a portable defibrillator it's going to stay that way and won't be pumping blood, which is generally a very bad thing for you.
Another weird thing is that as you increase the current, you generally increase the risk of your heart going into fibrillation, but past a certain point, as you continue to increase the current, the risk of fibrillation actually starts to drop. What happens is that instead of screwing up your heartbeat, the current just causes all of your heart muscles to contract. Your heart isn't pumping blood at that point, so this is still a very bad thing, but usually when you remove the current the heart will go back into a normal rhythm. The important word in that sentence is "usually" though. It isn't "always".
Coincidentally, if you use alternating current, the frequencies that are most likely to throw your heart into fibrillation are right around 50 or 60 Hz. Basically, from a safety standpoint, we couldn't have picked worse frequencies if we tried. It is fairly easy to see how we ended up with those frequencies though. The lower the frequency, the bigger and heavier you need to make your transformers. Higher frequency transformers can be smaller and lighter (which is why aircraft tend to use a lot of 400 Hz instead of 60 Hz), but then the transformer hum from loose laminations and such gets really freaking annoying. Early power systems ran on a wide variety of frequencies, such as 25, 30, 40, 50, and 60 Hz. 50 Hz beat all of its competition in Europe, and 60 Hz won in the U.S. Exactly how we ended up at 60 is a bit of a mystery though. The story that I heard (which is likely an electrical urban legend) is that one of the early test systems was designed to run at 50 Hz, but they couldn't get enough power so they cranked up the generators to 60 Hz. Everything after that had to be compatible with it, so the standard stuck.
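For the curious, the frequency-versus-size tradeoff falls straight out of the standard transformer EMF equation, V ≈ 4.44 × f × N × B × A. Here's a rough sketch of what that means for core size; the voltage, turns, and flux density numbers are made-up but plausible values, purely for illustration:

```python
# Transformer EMF equation (rms): V = 4.44 * f * N * B_max * A_core.
# Solve for the core cross-section A needed at a given frequency.
# The input numbers below are assumed values for illustration only.

def core_area_m2(v_rms, freq_hz, turns, b_max_tesla):
    """Core cross-sectional area (m^2) needed to support v_rms at freq_hz."""
    return v_rms / (4.44 * freq_hz * turns * b_max_tesla)

v, n, b = 120.0, 200, 1.2   # assumed rms volts, primary turns, peak flux density (T)

a_60 = core_area_m2(v, 60, n, b)
a_400 = core_area_m2(v, 400, n, b)

print(a_60 / a_400)   # ~6.7 -- the 60 Hz core needs several times the cross-section
```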
Most electrical things that a person will encounter tend to be voltage sources, meaning that they will keep a constant voltage and the current will vary depending on conditions. So it may be the mills that kills, but keep in mind that the mills depend at least partially on the volts. It's not one or the other.