Lots of internet sites out there, here’s one…
http://www.eei.org/industry_issues/industry_overview_and_statistics/history/
When electrical distribution systems were first developed, it was found very quickly that the longer the cable, the greater the losses: you could churn out 600V DC at one end, and if a customer needed a large current, you could lose well over 10% of that voltage along the way.
That was for what would be called relatively short cable runs of a few hundred yards, and certainly not suitable for city-wide systems.
The problem comes from the following, where

V (the voltage dropped along the cable) = I (the current going down the cable) X R (the electrical resistance of the cable per yard X the number of yards of cable)

or

V = IR

and the power lost as heat in the cable is

P = I[sup]2[/sup]R
This loss of voltage along a cable meant that the cable itself was wasting energy in the form of heat, so the cable got warm, and this in turn limited the amount of power you could supply to your users.
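To put some rough numbers on that (the current and cable resistance here are just assumed figures for illustration, not data from any real early system), a quick sketch:

[code]
# Illustration only: assumed values, not figures from any real installation.
current = 100.0       # amps drawn by the customers
resistance = 0.5      # ohms of total cable resistance for the run

voltage_drop = current * resistance       # V = I x R
heat_lost = current ** 2 * resistance     # P = I^2 x R, wasted warming the cable

print(f"Voltage drop along the cable: {voltage_drop:.0f} V")   # 50 V
print(f"Power wasted heating the cable: {heat_lost:.0f} W")    # 5000 W
[/code]

On a 600V supply, that 50V drop is already over 8% gone before the customer sees any of it.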
If you can increase your supply voltage, then you can deliver the same amount of power using a lower current, and the lower the current, the lower the losses in your cable. So if the early power companies could have sent, say, 6000 volts DC instead of 600 volts DC down the lines, they could have reduced the current needed by a factor of 10, and reduced the power losses in the cable by a factor of 100.
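Purely as an illustration of where that factor of 100 comes from (the 60kW load and the cable resistance are assumed numbers; the 600V and 6000V are the figures from above):

[code]
# Assumed numbers: deliver 60 kW over a cable with 0.5 ohms of resistance.
power_delivered = 60_000.0   # watts the customers want
resistance = 0.5             # ohms of cable resistance

for supply_voltage in (600.0, 6000.0):
    current = power_delivered / supply_voltage   # I = P / V
    loss = current ** 2 * resistance             # lost as heat in the cable
    print(f"{supply_voltage:>6.0f} V: {current:>5.0f} A, {loss:>6.0f} W lost")

# 600 V needs 100 A and wastes 5000 W; 6000 V needs 10 A and wastes 50 W:
# a tenth of the current, a hundredth of the loss.
[/code]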
Unfortunately there was a problem with this: 6000 volt generation was not realistically possible in those early days, as the insulation required in a rotating generating machine was nothing like as good as it is today.
The other problem, of course, is that even if this were possible, 6000 volts is not a very safe voltage to supply customers with directly, so you need to find a way to reduce it to more manageable levels.
There is a way to do this with DC, using the high-voltage supply to drive another generator producing a safer, lower voltage, but that would have been utterly impractical and uneconomic. Back then the only other way to do it with DC was to use huge resistors to drop the voltage, which would have been extremely wasteful of power and uneconomic, as well as not being an intrinsically safe way to do it, plus a few other reasons I won't go into in this post.
It so happens that the answer to the problem was to use AC, because with a continuously varying voltage you can use transformers.
A transformer is a device that transfers power through it: it can increase the voltage, which in turn reduces the current, so the power stays the same, since
P = V x I
or Power = Voltage X Current.
For the power to remain constant, if the voltage rises then the current must fall.
This was an ideal solution at the time: you could generate power at, say, 600 volts AC, put it through a transformer to step it up to 6000 volts for the long cable run, and at the other end of the cable step it back down to 120 volts using another transformer.
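Using the same illustrative 60kW load as above (and pretending the transformers are lossless, which real ones very nearly are), the current at each stage looks like this:

[code]
# Idealised sketch: assume lossless transformers, so V x I is the same at
# every stage. The 60 kW figure is just the assumed load from above.
power = 60_000.0   # watts being delivered

for stage, voltage in (("generator", 600.0),
                       ("transmission cable", 6000.0),
                       ("customer supply", 120.0)):
    current = power / voltage   # I = P / V at each stage
    print(f"{stage:>18}: {voltage:>5.0f} V at {current:>4.0f} A")

# The long transmission cable carries only 10 A, so it has by far the least
# I^2 x R heating, while the customer still gets the full power at a safe 120 V.
[/code]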
At the time, and today as well actually, it was possible to make cables that could handle far higher voltages than could be generated directly by a large, fast-rotating machine like a generator.
As a result of the use of transformers, the current transmitted along long cables was reduced, and this reduced the losses in those cables, so they could be made far thinner and cheaper.
The entire electrical distribution system of virtually every nation used AC power because of this.
There are some disadvantages to AC compared with DC, and nowadays we have the high-power, high-current, high-voltage electronics to distribute power using DC; however, it will not happen, as it would be immensely costly to change.
AC power generation is also more easily carried out using alternators, which can churn out more power than can the DC generator - the dynamo.
I have greatly simplified this. I hope other posters will not come in and make it too technical; I have observed that this happens often in electrical/electronics threads, and it does not take long for the OP to get overwhelmed by a huge amount of information.