Charging Today's Car Batteries

Did a little research. Looks like the magnet wire used in a lot of motors, alternators, etc. is NEMA MW-35-C, which has a polyester basecoat and a polyamide topcoat. It is spec’d for continuous use at temperatures up to 200 °C. I don’t know at what temperature the insulation would start to melt or carbonize, but I’d guess it’s above 300 °C.

The regulator doesn’t actively limit the alternator’s output current, but the current is limited as a consequence of the design.

Using a proportional controller with negative feedback, the regulator continuously monitors the output voltage and adjusts the field current to hold it constant, somewhere between 13.5 VDC and 14.5 VDC depending on the temperature. As the load impedance decreases, the output current increases (Ohm’s Law), and the controller sources more and more current into the field winding to maintain the fixed output voltage.

If you keep decreasing the load impedance, a point is reached where the controller is sourcing the maximum current it can into the field winding. (This also sets the maximum output current the alternator can source into the load at the given RPM.) Decrease the load impedance further and the output current stays more-or-less constant while the voltage starts to drop because of the alternator’s non-zero source impedance. Keep going until the alternator’s output voltage falls below 12.6 V (roughly the resting voltage of a fully charged battery), and the battery starts supplying current to the load.
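Here’s a minimal toy simulation of that feedback behavior, just to make the shape of it concrete. Every constant (volts per amp of field current, source resistance, field-current limit, loop gain) is a made-up illustrative value, not a figure for any real alternator; the point is only to show the regulator holding the setpoint until the field current clamps, after which the output voltage droops.

```python
# Toy model of an alternator regulator as a proportional controller with
# negative feedback.  All constants are made-up illustrative values.
SETPOINT    = 14.0   # regulator target voltage (V), mid-range of 13.5-14.5
K_ALT       = 30.0   # assumed open-circuit volts per amp of field current at a fixed RPM
R_SOURCE    = 0.1    # assumed alternator source resistance (ohms)
I_FIELD_MAX = 1.0    # assumed maximum field current the controller can source (A)
GAIN        = 2.0    # proportional gain (amps of field current per volt of error)

def regulate(r_load, steps=500):
    """Run the feedback loop against a fixed load resistance and return the
    settled output voltage and field current."""
    i_field = 0.0
    v_out = 0.0
    for _ in range(steps):
        emf = K_ALT * i_field                       # EMF produced by the field
        v_out = emf * r_load / (r_load + R_SOURCE)  # divider: source impedance vs. load
        error = SETPOINT - v_out
        # Proportional control, clamped to the current the controller can source.
        i_field = min(max(i_field + GAIN * error * 0.01, 0.0), I_FIELD_MAX)
    return v_out, i_field

for r_load in (1.0, 0.5, 0.2, 0.1, 0.07, 0.05):
    v, i_f = regulate(r_load)
    note = "  <- field maxed, voltage drooping" if i_f >= I_FIELD_MAX else ""
    print(f"R_load = {r_load:4.2f} ohm: V_out = {v:5.2f} V, I_field = {i_f:4.2f} A{note}")
```

The simple resistive source in this sketch only crudely stands in for a real alternator, where the current limiting comes largely from the stator’s inductive impedance, but the basic story is the same: regulated voltage while the field has headroom, sagging voltage once it doesn’t.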

The type of battery you’re using also makes a difference. Supposedly different charging profiles are needed for optimum performance of conventional flooded batteries vs. AGM-type batteries, and most automotive charging systems are designed around flooded batteries. Using an AGM type in a car might not yield the nameplate capacity of the battery because the charging system never fully charges it.
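To put that difference in rough numbers, here’s a tiny sketch of per-chemistry charge setpoints. The voltages are generic illustrative placeholders, not taken from any battery datasheet, so the actual figures should come from the manufacturer.

```python
# Sketch of charge-stage setpoints per battery chemistry.  The voltages are
# generic illustrative figures only -- use the battery manufacturer's numbers.
CHARGE_PROFILES = {
    # chemistry: (absorption voltage, float voltage)
    "flooded": (14.4, 13.5),
    "agm":     (14.6, 13.6),
}

def charge_setpoint(chemistry: str, stage: str) -> float:
    """Return the target charging voltage for the given chemistry and stage."""
    absorption_v, float_v = CHARGE_PROFILES[chemistry]
    return absorption_v if stage == "absorption" else float_v

# A typical automotive regulator just holds a fixed ~13.5-14.5 V and never
# runs a proper absorption/float sequence, which is one way an AGM battery
# can end up chronically undercharged.
for chem in CHARGE_PROFILES:
    print(chem, charge_setpoint(chem, "absorption"), charge_setpoint(chem, "float"))
```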