What is the name of this informal fallacy - not accounting for second order consequences?

I don’t have a name for the situation either, but a good example, (no doubt falsely) attributed to Abe Lincoln, is the question: if there are 10 birds sitting on a wire and you shoot one, how many are left? Answer: none, because all the others will fly away.

Actually, the more I contemplate this, the more I agree there should be a concise name for this spectrum of things that all feel like variations under one conceptual umbrella. But I’ve got nothing.

There are terms to use: short-sighted, myopia, tunnel vision. There should be a better one. It’s not so much a fallacy in the sense of faulty reasoning as it is a failure to reason at all.

This bugs the hell out of me as well, can we put the great minds of the dope together and come up with a snappy name for it?

“isolated variable fallacy”?
“disregarded consequences fallacy”?
“fixed point fallacy”?

Thing is, I’ve actually heard announcers call it something, because it bugs them, too – but I can’t recall the term. I even tried Googling it to no avail.

Yeah, I know what you mean. It feels like it should already be a thing and I’m sure I’ve heard the concept being discussed but no ready-made term comes to mind.

I propose the ceteris paribus fallacy. A term invented just now by me, but based on a legit academic term. See:

This is a formal statement that we change one thing in our experimental setup and everything else remains unchanged. So we have a simple Δinput → Δoutput relationship to consider or to model.

Sometimes ceteris paribus is a reasonable assumption for small changes in high-inertia systems. Or a reasonable application of a small dose of handwavium to avoid getting bogged down in fine details.

Other times ceteris paribus is hopelessly naïve. Which is when IMO it rises to the level of @Princhester’s informal fallacy. e.g.

If we just double income tax rates across the board we’ll double our annual tax take and that’ll solve the deficit.

Pretty good bet substantially every person and every business in the land will do something different in response to a change like that. Assuming ceteris paribus would be folly here.
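The tax example can be sketched as a toy calculation. All the numbers here are invented for illustration (a fictitious base of taxable income and a made-up behavioral-response factor); the only point is that once the taxable base responds to the rate, doubling the rate yields less than double the revenue:

```python
# Toy sketch (all numbers invented) of why doubling a tax rate need not
# double the take: taxable activity shrinks as the rate rises.

def revenue(rate: float, response: float = 0.5, base: float = 1000.0) -> float:
    """Revenue = rate * taxable base, where the base shrinks with the rate.

    `response` is a made-up behavioral factor: how strongly people
    reduce (or hide) taxable activity as the rate goes up.
    """
    taxable = base * (1.0 - response * rate)  # people earn/report less
    return rate * max(taxable, 0.0)

naive = 2 * revenue(0.25)   # "double the rate, double the take"
actual = revenue(0.50)      # what the toy model actually yields

print(f"revenue at 25%:           {revenue(0.25):.2f}")
print(f"naive ceteris paribus 2x: {naive:.2f}")
print(f"revenue at 50%:           {actual:.2f}")
```

With these invented parameters the doubled rate brings in 375 rather than the naive 437.50, because the base itself shrank. The gap is exactly the second-order effect the thread is trying to name.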

Considering all of the above, how about calling it the “No Reaction” fallacy?

That encompasses both conscious agency & inanimate reaction of any type in response to the proposed simple change. I think it’s maybe a little better than the suggested “fixed background” because it emphasizes more that it’s responsive, dependent change that’s being ignored. And ceteris paribus is already in use to mean something slightly different.

I’m not sure that ceteris paribus implies anything about ignoring complex second order effects that are dependent on the test variable. Isn’t it more about holding other independent variables constant as you change the test variable?

I think you’re spot on about that. Which makes my idea a little less apt than I’d hoped.

I don’t have an answer but there ought to be one, especially in economic terms. Something like the fallacy of believing you can sell all of something you own at full price, because you’re not taking into account your own actions driving the price down.

This is actually an excellent example. I wish I’d thought of it.
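The sell-at-full-price example lends itself to the same kind of toy sketch. The share count, quoted price, and per-share price impact below are all invented; the point is only that each sale nudges the price down, so the total proceeds fall short of the naive shares-times-quote figure:

```python
# Toy sketch (parameters invented) of price impact: unloading a large
# holding moves the price against you, so you can't sell it all at
# the quoted price.

def sale_proceeds(shares: int, quote: float, impact_per_share: float) -> float:
    """Sell `shares` one at a time; each sale nudges the price down."""
    total = 0.0
    price = quote
    for _ in range(shares):
        total += price
        price -= impact_per_share  # the next buyer pays a little less
    return total

naive = 10_000 * 50.0                        # 10,000 shares "at" $50
actual = sale_proceeds(10_000, 50.0, 0.001)  # with price impact

print(f"naive:  ${naive:,.2f}")
print(f"actual: ${actual:,.2f}")
```

The naive figure assumes ceteris paribus, i.e. that the price ignores your own selling; the loop makes the price a dependent variable and the gap appears.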

There’s a Peanuts cartoon where the first two batters get out, then Charlie Brown comes up to bat and gets out. Then the first two batters turn on him and say “You lost us the game!”

The opposite of systemic inertia I guess would be “ertia”. So, “failure to account for systemic ertia”.

(The Hindenburg was filled with hydrogen, an ert gas.)

I recall an article by Robert Heinlein (so long, long ago) talking about writing science fiction. The interesting part was predicting the secondary effects. Anyone could probably predict the horseless carriage would make transportation faster and easier, eventually for everyone. But, he said, who could predict it would turn into a rolling bedroom providing privacy for young people to make out, thereby forcing a change in how morality could be enforced?

Another amusing anecdote: scientists doing medical experiments with lab rats needed to analyze urine samples at precise intervals. How do you get a rat to pee on schedule? Some clever fellow suggested electrifying the pan under the mesh cage with a timer, so rats got a shock if they peed off schedule. This stopped working after a while, until some lab assistant came in one night and found the rats lying on their backs, peeing out the side of the cage.

And to me this is the most interesting part of what I call SF.

Not sci-fi - not the dumb space adventures that are just action movies with spaceships in place of fighter planes and laser-y things in place of standard guns.

To me true SF is precisely the exploration of the social and technological secondary effects of technological (or other) change.

You’ll find that a lot more in written s-f than in media, because it’s a lot easier for the writer to get that hidden background in front of the reader. One of the most powerful stories I’ve read was “The Tunnel Ahead” by Alice Glaser, published more than 60 years ago and anthologized many times since.

It opens with a family returning from an outing to the beach, and little by little you learn that the world has become terribly overcrowded. New housing has five-foot ceilings, for example, and the World Series is a chess match because four acres for a ballpark is no longer sustainable.

Then you learn that the only way into and out of the city is through a tunnel that is always filled with bumper-to-bumper traffic, and then that at random intervals both ends of the tunnel are closed, everyone within is gassed, and their vehicles are disposed of. The family escapes by the skin of their teeth: the doors roll down, trapping the car immediately behind them.

Final revelation

The tunnel was voted into place democratically as the only viable way to get the population down. Furthermore, after turning away from the closed door behind them the father draws a shaky breath and proposes another beach outing next weekend – life has become so constrained, cheating death at the tunnel is the only thrill left.

Indeed, I have read much SF. Though I don’t recognise this particular story. I remember another one with a similar theme where there are carnival rides which have a low but real chance of killing you, for the same reason - population control.

What they didn’t predict was falling birth rates in the USA.

They almost had it: they were worried that smarter (wealthier, better educated) people had falling birth rates, but they didn’t foresee that the trend might extend to ordinary people.

I think it’s just something like “not thinking ahead”, not really a fallacy.

Because every proposal must necessarily stop at some level of analysis, we have some idea of a “reasonable” amount to look ahead and/or speculate. Doing this to an insufficient degree is definitely a problem, but what do we call “insufficient”? Yes, we could draw the line at “second order,” but it’s pretty arbitrary how we break a situation down, and therefore what counts as secondary vs. tertiary vs. whatever.
In reality it depends on many factors: the nature of the phenomenon being analyzed, the costs involved, the resources available, and so on, all of which go into deciding what an acceptable depth of analysis is.

So while it’s definitely a common mistake, I don’t think it fits so well within a “fallacy” framing.

These are sometimes called “cognitive fallacies” when someone wishes to be clear that they aren’t talking about “logical fallacies”. Sometimes called “informal fallacies” when someone wishes to include fallacies that aren’t errors of thinking. The rest of the time, just called “fallacy”, indicating that the premise, method, or result is false.

No, I think fallacy is more specific than that.

For example “being overzealous” might be considered an error in methodology, but we wouldn’t typically call it a “fallacy”. Because, for one thing, it’s subjective. We can agree that being “overzealous” is a bad thing, but may often disagree about whether a specific action could be considered such.

And likewise here I would say.
Although at first blush it sounds like “not considering second order consequences” is very specific and objective, we can disagree about what the second order consequences are, and whether analyzing that far forward is sufficient (we always need to stop analyzing at some point).
So while we can agree that having too shallow an analysis is a bad thing, we may disagree about what specific analysis could be considered as such.

None of this is to criticize the OP. It’s a good question, I’m just giving my opinion that I don’t think we should think of it as a fallacy.