The idea is pretty simple, really. Should the government provide health and auto insurance as a non-profit program? Or, at the very least, should the insurance industry be regulated along the same lines as water, natural gas, waste removal, and electricity companies?
My view is that since car insurance is required by law just to drive, and driving is a practical necessity for most people, companies shouldn’t be allowed to profit from it, or at the very least, the profits they make should be strictly governed and regulated. I can’t think of another industry that enjoys such a legal mandate, and it pisses me off that the government is telling me I have to buy a particular service, and that the service provider is allowed to charge Whatever the Hell He Wants for it.
Home insurance doesn’t carry a similar legal mandate, as far as I know, but I think a heavily regulated or nationalized home insurance provider would have a major positive effect on housing and home ownership. Home insurers also enjoy just as much of a windfall from banks requiring coverage of their loan recipients as car insurers enjoy from the law. It’s only slightly less annoying when it’s the banks and lenders doing the demanding instead of my government.
I understand that banks should be able to require home insurance before lending to you. I understand that the government should be able to require drivers to carry liability insurance. What I’m not happy about is that these essential insurance services are only available through companies making enormous profit margins with basically no consequences. They shouldn’t be able to collect X dollars on the strength of a legal mandate and pay out only X/2 in actual coverage. It would be like the trash company deciding it’s not making enough money and charging you five bucks a bag to haul your trash away, just to keep its investors happy and hand the top executives a forty-million-dollar bonus.
Am I the only person annoyed by this?