Last year, my English teacher would periodically take time to rag on insurance and its (in his mind) questionable necessity. The way he put it, insurance is “just a way of making sure everyone is indebted to the rich,” and when a policeman pulls you over and asks for your proof of insurance, he just wants “proof that you’re indebted to the rich.”
How much of this is based in reality? Is insurance – the way it currently works in the United States – a bad system that doesn’t serve the consumer’s best interest? Is there a better way? What did my teacher mean?