According to the Brady website:

"One in three people in the U.S. know someone who has been shot."

And:

"Nearly one in four American teens have witnessed a shooting."
I don’t see any cites to their claims on the website.
I suppose the first one could come about. If someone who was shot was known by dozens or hundreds of people, each of them could say they knew someone who was shot. That doesn’t prove the total number of people shot is significant, though. A large number of people in the sample could say they knew someone who has been shot, and it could all trace back to one guy known by everyone in the sample.
I call bullshit on the second one, though!
Especially since I can’t seem to find their cites for this claim. Can anyone here point me in that direction?
The point isn’t an attempt to quantify how many people have been shot; it’s an attempt to illustrate just how common it is for someone’s life to have been affected, in some way, by guns and gun violence.
In my case, I can think of three people I know / knew, off the top of my head, who had been shot. If I sat and thought longer, I could probably think of a few more.
A grade school / high school classmate. He was accidentally shot when his father’s hunting rifle, which had been improperly stored with a round in the chamber, went off after he knocked it over in his parents’ closet. He was about 12 when this happened; he wound up a quadriplegic and died in his early 20s.
A high school friend, who tried to kill himself by shooting himself in the head with a hunting rifle. He survived, but it took years of rehab before he could effectively function in school.
A college friend, who was working as a cashier at Taco Bell. Just as the store was closing, a guy tried to rob the place with a handgun…during the robbery, the gunman panicked, and shot my friend in the shoulder.
Says who? They didn’t say either way what their point is on that claim.
But the fact remains that if one guy has been shot, and he was known by 100 people, all 100 of them could say they knew someone who has been shot. If the survey sampled 300 people, and only those 100 said they knew someone who has been shot, the survey would show 1 out of 3 people knowing someone who’s been shot. While statistically true, it’s a form of informal fallacy, since it makes it appear that more people have been shot than really were.
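To make the arithmetic in that hypothetical concrete, here’s a trivial sketch. The figures 100 and 300 come straight from the scenario described above; nothing else is assumed:

```python
# Hypothetical scenario: a survey of 300 people in which one
# shooting victim happens to be known by 100 of the respondents.
respondents = 300
know_a_victim = 100  # all 100 know the same single victim

share = know_a_victim / respondents
print(f"{share:.0%} of respondents know someone who has been shot")
print("Actual number of people shot: 1")
```

The survey question counts respondents who know a victim, not victims, so a single well-known victim can drive the percentage as high as the size of his social circle allows.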
Here’s a 1999 poll from Gallup that asked respondents, “Not including military combat, has anyone close to you—such as a friend or relative—ever been shot by a gun?” 31% of respondents answered yes, which is pretty close to one in three. I wasn’t able to find whether Gallup had conducted the same poll more recently; but it doesn’t seem likely to me that the patterns of gun violence in the U.S. have changed significantly in the past decade. (I welcome correction in this regard, though.)
Well, yes. Is there any other way to read that statistic? I don’t think it’s misleading or cleverly worded or anything. It’s quite plain English. Like I said in the other thread, I’m surprised the number isn’t higher.
Your OP was asking about the claims, not the interpretation of those claims. Which are you interested in?
And, for the record, I know a few people who have been shot (one fatally), and at least one person who shot someone. I have never witnessed a shooting although I have heard gunshots in the distance that were aimed at someone.
As far as I know, the only person affected by a shooting was in that urban legend where the bullet went through a testicle and into a woman’s uterus, thereby causing a pregnancy.
Like I said in the OP, the 1 in 3 claim could be statistically true, though it still is a skewed presentation. If 1 guy gets shot and they survey only the 100 people who knew him, their survey would show that 100% of everyone asked knew somebody who’s been shot. That seems more dramatic than it really is.
But let’s move on to the second claim. Nearly one in four American teens have witnessed a shooting. I don’t buy that.
Are you saying that the sample was not sufficiently random?
To me, this claim is actually worse than it would appear at first blush. Everyone knows a couple hundred people, so a naive extrapolation would suggest that only about one out of a thousand or so people has been shot. But real life isn’t distributed that evenly: some people know a great many people who have been shot, yet each of them still counts only once in the survey.
Even if the stats mentioned in the OP are true, there’s still the issue of whether the shootings occurred under circumstances that would lend weight to the Brady Campaign’s agenda. Based on the question as presented by MikeS in Post #6, I know two people who have been shot - one was a senior law enforcement officer who accidentally shot himself with his service pistol while cleaning it, another was a childhood friend’s adult brother who committed suicide with his father’s hunting rifle that was stored in a locked cabinet. Both tragic, of course, but neither shooting would have been prevented by any of the various gun control laws that the Brady Campaign has advocated for.
Also, while they claim it was a nationally cross-sectional survey, they don’t give specifics as to where. Interviews with kids from inner-city Chicago and LA are going to give different answers than those in River Hills, thus skewing the results.
Lastly, their sampling doesn’t even scratch 1/10th of 1 percent of the population of teenagers in this country.
While that might be an issue, it is an issue for GD or most likely the Pit. The OP questioned if the statistics were accurate, and cites have been found for both of them.
It says they used a random-digit dialing methodology, and they have three cites for the methodology.
The sample size yields a sampling error of 1.5% at 95% confidence.
But you are correct that 22% is closer to 1 in 5 than 1 in 4. I’m going to guess they liked the sound of “nearly one in four” better than “more than one in five.”
Good point. My post wasn’t meant as an argument against gun control (I’m generally in favor of it), but rather just an observation on my part, but still probably too subjective for GD.:smack: My bad.
If it’s a truly random sample (and this is the big caveat and an art and science unto itself), you only need about 4000 samples with a population that large to reach a sampling error of 1.5% at 95% confidence. You can do the calculations here. Once the population gets large enough, the number of random samples you need for a given error and confidence interval doesn’t change much. For example, with the error and confidence above, for a population of 1 million you need 4250 samples. For a population of 1 billion, you need 4268 samples.
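Those sample-size figures can be reproduced with the standard formula for estimating a proportion, plus a finite-population correction. This is a sketch, assuming the usual worst-case proportion p = 0.5 and z = 1.96 for 95% confidence (the same conventions the online calculators use):

```python
def sample_size(margin_of_error, population, z=1.96, p=0.5):
    """Random-sample size needed for a given margin of error.

    Standard formula for estimating a proportion, evaluated at the
    worst case p = 0.5, with a finite-population correction applied.
    z = 1.96 corresponds to 95% confidence.
    """
    # Infinite-population sample size
    n0 = z**2 * p * (1 - p) / margin_of_error**2
    # Finite-population correction
    return round(n0 / (1 + (n0 - 1) / population))

print(sample_size(0.015, 1_000_000))      # 4250
print(sample_size(0.015, 1_000_000_000))  # 4268
```

Note how little the answer moves between a population of 1 million and 1 billion: once the population dwarfs the sample, the correction term barely matters, which is exactly the point being made above.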
Speaking as a market researcher, who does this sort of thing for a living…if you are taking steps to make sure that your sample is proportionately representative of the overall population you’re trying to emulate, then you don’t need to sample even 1/10th of 1% of that population…as my friend pulykamell has illustrated in what he posted while I was writing this.
As he notes, the “Law of Large Numbers” says that, once you get to a certain number of responses in a sample*, adding more responses doesn’t really add any additional statistical / predictive power.
Anyway, I get it…you don’t like the Brady folks, you disagree with them, and you’re trying to poke holes in their arguments. “Not having a big enough sample” doesn’t seem to be an area where you have a legitimate beef, from a statistical standpoint.
* What drives that number of responses is the statistical confidence level that you’re seeking. The absolute number of members of the population from which you’re sampling is a much smaller factor.