I am so sick of hearing people say “Well, the Constitution gives us the right to…” or “You can’t do that because there’s no right to [whatever] granted by the Constitution.”
The idea that the Constitution, and hence the State, grants rights to its subjects is completely contrary to its language and intended purpose. Someone has seemingly convinced everyone that they're subjects of the US Federal Government and must beg for rights to be granted to them.
People have inalienable, innate rights. The Founders recognized this, and the Bill of Rights is written STRICTLY AS A RESTRICTION OF WHAT THE FEDERAL GOVERNMENT CAN DO. It does NOT grant any rights. It recognizes that rights exist, and specifically forbids the Federal (and State, through the 14th Amendment) government from doing anything to violate or abridge those rights.
I’ve seen people argue in support of the war on drugs by saying “there’s no right to smoke drugs in the Constitution”, and that sort of logic. Is that what we’ve become? A society that will try to screw over everyone any way they want, so long as it doesn’t violate the rights “granted” in the Constitution?
It’s sad that people seem to think rights come from the State, and hence can be restricted or taken away by the State, and that anything not specifically enumerated for protection in the Bill of Rights obviously isn’t a “right” at all.
We don’t stand much of a chance at freedom if this is the common belief among even the more educated people of the SDMB, let alone society at large.
Note to mods: This is sort of GDish, but it was intended as a rant. I read, yet again, someone talking about how the Constitution grants this and didn’t grant that, and decided to bitch about it.
If you decide it’s GD material, lock this thread and I’ll rewrite this stuff in a more GD-friendly way. As it is, I’m just ranting at how widespread the idea that we’re granted rights has become. It really pisses me off.