Why are we more likely to be victimized by people we know?

We keep on reading that the vast majority of cases of murder, rape, and many other violent crimes are committed by people we know. But why is this? Is your child really safer among strangers than among his family and family friends? Statistics would seem to indicate that a child is VASTLY safer with a random group of strangers, and yet somehow this seems to defy common sense so brutally that it’s still tough to accept. So what are the causes of this great statistical discrepancy? I have some hypotheses:

  1. We are ultra cautious of strangers and thus much tougher prey. For example, a woman walking down the street alone after work is probably carrying self-defense devices, whereas a woman who has known a guy for a short time may let her guard down. Heck, some women will let their guard down after spending an hour with a guy. It seems like a much more fertile feeding ground, if you’re a rapist, to invest just a little time in gaining trust. Another example is child molesters. If little Johnny runs to mommy and says that the weird dude on the street corner touched him, the whole neighborhood will chase the guy with torches and pitchforks. But if he says his uncle, or his teacher, or his priest, or his father did it, that just makes everyone uncomfortable.

  2. Murder is a very personal crime. The odds of someone just deciding they want to kill you on a given day while you walk down the street are extremely low, whereas the odds that you’ll piss a friend or relative off enough for them to wring your neck are somewhat higher.

  3. Kidnappings are almost always committed by a relative. There is no motivation for a non-crazy person to snatch a random kid. Ransoms are hard to collect in this country, serial killers of children are extremely rare (they tend to prefer women 16-30), and keeping a child for any other purpose is a lot more trouble than it’s worth. I dare say that stranger kidnapping, though it is the thing parents fear most, is an almost non-existent risk no matter how much your child wanders the streets alone. Unless there’s a custody dispute, the odds of someone wanting to grab your kid are something like 1 in 10 million.

So are strangers actually safer than people you know? I guess a lot also depends on the kind of people you know. I’ve continually been shocked by how many people I know bring dangerous people into their lives and then act surprised when bad things happen. I recently lost a friend of a friend to a guy with a rap sheet a mile long who decided to joyride with his passengers at 100 mph while high on coke and ended up in a canal. He survived; his passengers all died. And some people seem fine, and you can know them forever, but as soon as they have an opportunity…


No. A simplistic and flawed interpretation of the available statistics would indicate that. For something sensible you have to at least start by dividing the given risks by the time a child spends with family and friends versus the time they spend with random strangers.
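To make that concrete, here’s a toy sketch in Python - every number in it is invented, purely for illustration - showing how raw incident counts and per-hour rates can point in opposite directions:

```python
# Toy illustration (all numbers are made up): "most incidents involve
# known people" can coexist with known people being no riskier per hour,
# because children spend far more hours with people they know.

# Hypothetical annual figures for one population of children
incidents = {"known people": 90, "strangers": 10}          # raw counts
exposure_hours = {"known people": 5000, "strangers": 200}  # hours per child per year

for group in incidents:
    rate = incidents[group] / exposure_hours[group]
    print(f"{group}: {incidents[group]} incidents, "
          f"{rate:.3f} incidents per exposure-hour")

# known people: 90 / 5000 = 0.018 per hour
# strangers:   10 / 200  = 0.050 per hour  -> the HIGHER per-hour risk
```

With these made-up figures, most incidents involve known people simply because that’s where most of the hours are, even though the per-hour rate among strangers comes out higher.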

This question reminds me of the old joke about the guy who read the statistic that 70% of car accidents happen within 10 miles of a person’s home. So he moved.

Also the ones about how most people die in hospitals or bed, so you should avoid those. :slight_smile:

Here’s an incredibly stupid idea showing ignorance of probability:

  • since the chances of two bombs on an aircraft is practically nil, you should always carry a bomb on an aircraft. :smack:

It seems really obvious to me, as other people have said: you are likelier to be hurt by an acquaintance because you spend more time with them. Consider:

People bitten by dogs are much, much likelier to be bitten by dogs who live with them than they are by other people’s dogs.
Far more people are killed in cars than on motorcycles.
Far more people die in commercial aviation accidents than die climbing Mount Everest.

But having said that, I think it very obvious that familiar dogs are no more dangerous than unfamiliar dogs, it’s way more dangerous to ride a motorcycle than to drive a car, and climbing Everest is insanely dangerous while taking a flight really isn’t dangerous at all. All those paradoxes are a matter of exposure. Not many people climb Everest; zillions of people fly somewhere.

I don’t know if this is immediately relevant, but I am reminded of Bill James’s book “Popular Crime” in which he notes that until reasonably recently (the late 1960s/1970s, roughly; Ted Bundy was a big game changer) serial killers often got away with their crimes because police often refused to believe serial killers were even a thing. The paradigm that “people are killed by someone they know” was so strongly held that the police could often be blinded to the fact that sometimes people are killed by someone they DON’T know.

I found this curious given how famous Jack the Ripper has always been, but upon doing a little research into the matter, James appears to be essentially right. (The book is very much worth reading.)

For murder specifically, it’s a crime that most people don’t commit opportunistically, but only for a significant reason - that is, most people don’t think ‘oh, I could get away with killing this guy, I’ll do it’ but rather ‘this guy deserves to die, I’ll kill him’. You don’t generally have enough motive to kill complete strangers because you don’t really care about them, but your girl/guy cheating on you, a business partner ripping you off, an associate not paying a debt, a rival insulting you, and similar circumstances can push someone relatively balanced into killing. And if you do have poor enough impulse control to kill someone over a minor matter, then you’re essentially murdering at random, and odds are the murder will land on someone you spend more time with.

It’s really only serial killer types that don’t fit that pattern, since their motive is essentially “I want to kill someone” and not “I want to avenge this wrong”. And that’s a big part of why finding serial killers is harder than finding regular murderers: there’s no normal trail of motive connecting the killer to the victim.

Though there is an element of statistical error involved, it’s not just that.

Consider this scenario: imagine you get up every day early, go on busy public transport to a crowded gym then to a busy person-filled workplace, then go out after work with a bunch of people then come home on public transport. You spend 14 hours out that day, then come home and spend 10 hours with your husband.

It’s still more likely that your husband will kill you than the entirety of all the strangers, workmates, etc. you meet during your 14-hour day out.

This is because of the OP’s point 2).

A lot of murder is a crime of passion.
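The arithmetic behind that commute scenario can be sketched the same way. The per-hour rates below are pure invention, just to show the shape of the comparison (expected risk = per-hour rate × hours):

```python
# Toy arithmetic (the rates are invented): even with MORE hours spent
# among strangers, a far higher per-person rate for an intimate partner
# can dominate the total expected risk.

hours_with_strangers = 14
hours_with_partner = 10

rate_stranger = 1e-9   # hypothetical risk per hour among strangers
rate_partner = 1e-7    # hypothetical risk per hour with partner

risk_strangers = rate_stranger * hours_with_strangers   # ~1.4e-8
risk_partner = rate_partner * hours_with_partner        # ~1.0e-6

print(risk_partner > risk_strangers)  # partner risk dominates despite fewer hours
```

In other words, exposure time matters, but it gets multiplied by a per-hour rate that can differ by orders of magnitude between relationship categories.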

Similarly, child molestation doesn’t usually take the form of grabbing a random kid. It involves an existing relationship and grooming. So if a kid spends ten hours with a hundred random strangers, they are less likely to be molested than if they spend eight hours with one person.

Familiarity breeds contempt.

If you look at a chart of what ages people die at, any fool can see that if you can just make it to 124, you’ll live forever because nobody dies after that.

Absence makes the heart grow fonder.

Good points all, except for one: you don’t actually spend more time with acquaintances than strangers, or at least not if you measure it by time per person. In other words, you might spend more of YOUR time with people you know, as in hours with, say, 10 people you know well, but chances are you’re also being exposed for minutes each to hundreds of other people in a given day.

If you’re looking for a victim for a certain activity, it would seem likely that you’re going to consider everyone you know first.

A lot of it is just opportunity. If you are a scumbag, and grandma has cash and a house, and you can weasel your way into taking care of her, you can steal from her. Otherwise it would take a lot of work to find that opportunity elsewhere.

Likewise with child molesters. They have access to the kids in their extended family and some family members might refuse to believe it happened or help cover it up. They’d have to work to gain access to other children at a higher risk.

If you’re a scumbag kid gone wrong, you can play on your parents’ guilt to rob them blind, over and over. You’re not going to find that easy opportunity elsewhere. Especially if they’re religious and believe that they have to forgive you.

Absence of familiarity makes the heart grow fonder of contempt.

I suspect that largely hinges on how you define “spend time with” / “being exposed to”. For example, would we count all the people I pass on the freeway? After all, I’m within about 15 feet of the people in the lane next to me, but it’s only for a few seconds or minutes at a time, and I have pretty minimal interaction with almost all of them. Same question for pedestrians I might pass on the sidewalk? Does the answer change if I’m driving past them at 35 mph or walking past them at 3 mph?

When I’m in the office working, where I am for many hours each day, I’m spending time “exposed” to my co-workers, but they’re all acquaintances. What about people on the floor above me? Would we say that I’m “exposed to” those people too? We’re in the same building, but not the same room. When I go to a movie theater, am I exposed to all of the people in the theater with me? I suppose I am, and most of them are strangers. What about the theater next door? Or the one down the hall? Or the one across the complex from me?

I’d certainly say that most days I “spend more time with” acquaintances than strangers (although I can imagine lifestyles where that’s not the case), but “exposed to” seems to be a broader category than “spend time with”, and the answer largely hinges on how broadly we define that “exposed to” category.

There’s something that most people have only touched on: certain circumstances can skew the stats.

For instance, a large percentage of homicides are related in some way to illegal drugs. Sometimes drug dealers shoot each other; sometimes they shoot customers when suspected of cheating; sometimes customers shoot dealers; etc.

Gang members mainly fight other gangs, or sometimes each other. Again, it’s a case of criminal impulses being directed at someone that the person already knows about, not some random stranger.

It’s too bad there isn’t a well-written, relatively easy-to-understand checklist for everyone to refer to when they want to use (or check someone else’s use of) statistics. Since the advent of the successful use of statistics to solve problems, there has been an endlessly repeating earnest misuse of them by all sorts of people - some innocently, but too many with intent to deceive.

I am not an expert in such, being educated primarily in History, but I have also spent a lifetime as a troubleshooter, and that has resulted in my developing a fairly strong “field skill” with recognizing mistakes in reasoning.

One of the ones seen here, that is often missed, is where someone uses statistics which were gathered from INSIDE of a set of examples, but the person trying to use them thinks that they apply from OUTSIDE. This seems to be common in discussions of crime, for some reason.

When your study group consists of people who have been convicted of a given crime, you will get different results from a situation where instead, you study who the crimes are committed against.

It also seems to be common to confuse statistical correlation, with CAUSE of crimes.

And the most common really “dumb” mistake I see again and again is where someone starts from GENERAL statistical observations and tries to use them to make a decision about a SPECIFIC person or event. Learning that at least a plurality, if not a majority, of women who are murdered are killed by their primary significant other does NOT mean that, for a specific murder, the husband should be the sole or primary focus of investigation.

The thing I always remind myself and others of is the common coin toss statistics mistake. It goes like this: it is a fact, that a fair coin, when tossed a near infinite number of times, will come up heads exactly 50% of the time. Knowing that, many people think that if a coin has been tossed ten times and come up heads every time, the statistical likelihood that it will come up tails is RISING the more times it is tossed. But of course, people who understand such things know that the chance of heads or tails is ALWAYS 50%, no matter what (again, with a fair coin).
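A quick simulation makes the “no memory” point concrete. This sketch uses runs of five heads rather than ten only so that streaks occur often enough to measure; the principle is identical:

```python
import random

# Simulation sketch of the gambler's fallacy: flip a fair coin many
# times, find every spot where the previous five flips were all heads,
# and check what the NEXT flip does. If streaks "owed" a tails, the
# fraction of heads right after a streak would drop below 0.5. It doesn't.
random.seed(1)

flips = [random.choice("HT") for _ in range(1_000_000)]

# Outcomes immediately following a run of five heads
after_streak = [
    flips[i]
    for i in range(5, len(flips))
    if flips[i - 5:i] == list("HHHHH")
]

frac_heads = after_streak.count("H") / len(after_streak)
print(frac_heads)  # approximately 0.5, streak or no streak
```

Roughly one position in 32 follows a five-heads run, so a million flips gives tens of thousands of samples, and the post-streak heads fraction sits right around one half.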

Bottom line here, the fact that most crimes against children are committed by people who know them (if that’s true) has absolutely NOTHING AT ALL to do with how safe it is to allow any given complete stranger to be alone with them.

Googling something like “List of Statistical Fallacies” brings up some possibilities, but so far I haven’t found a list that I really really like.

I’m not sure, but you may be talking about the Base Rate Fallacy.

Thinking that correlation necessarily implies causation is a fairly well-known fallacy.

And this is what is known as the Gambler’s fallacy.

However, I have to nitpick your statement “it is a fact, that a fair coin, when tossed a near infinite number of times, will come up heads exactly 50% of the time.” What does this mean—what’s a “near infinite” number?

If you mean “as the number of coin tosses approaches infinity, the percentage of the tosses that are heads approaches 50%,” this is correct. But your statement might also be interpreted to mean “For an extremely large (but still finite) number of tosses, exactly 50% will be heads.” This is incorrect, because of your word “exactly.” If that very large number of tosses is an odd number, there is zero chance of getting exactly 50% heads. And even if it’s an even number, the chance of getting exactly 50% heads goes down as the number of tosses increases.
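That shrinking probability is easy to check with the binomial formula: the chance of exactly n heads in 2n fair tosses is C(2n, n) / 2^(2n), which falls off roughly like 1/sqrt(pi*n) even as the *fraction* of heads converges to one half:

```python
from math import comb

def p_exactly_half(n):
    """P(exactly n heads in 2n fair coin tosses) = C(2n, n) / 2^(2n)."""
    return comb(2 * n, n) / 2 ** (2 * n)

for n in [1, 2, 10, 50, 500]:
    print(f"{2 * n:4d} tosses: P(exactly half heads) = {p_exactly_half(n):.4f}")

#    2 tosses: 0.5000
#    4 tosses: 0.3750
#   20 tosses: 0.1762
#  100 tosses: 0.0796
# 1000 tosses: 0.0252
```

So the law of large numbers is about the proportion of heads settling near 50%, not about any particular toss count hitting 50% on the nose.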