(With apologies for the wording in my first attempt — I was not trying to be in any way provocative or political. I was struggling for a succinct descriptive title beyond ‘seeking history lesson’ and used a phrase I read here all the time. Oopsy, totally my bad, sincerest apologies all around!)
However my post stands unchanged:
(Forgive me for not being more informed on American history; as a Canadian, my knowledge of some periods is spotty.)
So Americans have always had the right to bear arms, and I think we can all agree there was a time, when the country was being settled, when many citizens probably walked around armed, had access to a gun, and maybe even kept a gun in most homes?
So my question is: what did the transition to NOT walking around with a gun look like? Was there a campaign after several accidents? Did cities and towns pass ordinances? Did bar owners restrict access? Or did it just happen somewhat organically, when the actual ‘need’ for it fell away — perhaps when organized, trustworthy policing became the standard?
Was there pushback against even the slightest restriction, no matter how sensible it might seem? Was it partisan — a political divide? Was there frothy rhetoric?
I got to wondering about this the other day, and find I am increasingly interested in learning more.
I am sincerely interested in how that transition happened, what initiated it, and whether it was met with resistance. What was happening in society that moved it toward being less armed?