This is tough for me to get at, but the idea is this:
Sometimes within an intellectual discipline, one sees an almost excessive internal consistency, which may be at odds with reality.
An example is astronomy. Ancient astronomy was full of circles and cycles. Circles and spheres are mathematically simple yet elegant structures. Planetary orbits were presumed to be circular (which is not a bad first approximation). Where observations did not match, circular epicycles were added to better fit the data. There was a lot of internal consistency going on here that was perhaps unhelpful.
Medicine is similar. Traditional Chinese medicine has elaborate theories of energy flow through meridians and its interactions with various organs; Medieval Western medicine had similarly elaborate theories about the balancing of the four humours. More recently we have seen solid, internally consistent theories about lipid metabolism overthrown by observation (I’m looking at you, Margarine!).
Many conspiracy theories have fairly elaborate internal consistency, as well.
So: is there a word or phrase to describe this kind of phenomenon, where the self-consistent theory gets in the way of the facts? More to my point, is it possible that “too much” internal consistency may be a warning sign, and is there a word or expression for that?
I’m thinking engineers may have something to offer here.
In your examples, a too-complicated theory becomes even more complicated as it’s twisted to fit contrary facts. The solution may be to make things simpler (more “beautiful”!) rather than more complicated. Paul Dirac commented on this, famously remarking that it is more important to have beauty in one’s equations than to have them fit experiment.
To be clear, the epistemic foundations of medical science differ here and there from those of astronomy and the other sciences; Heracles’s note above is grounded in Thomas Kuhn’s 1962 The Structure of Scientific Revolutions. [cite to wiki]
OP mentions “intellectual discipline,” which casts a much larger net: ask any college teacher of English Lit, or of even older fields (Archaeology, Biblical Studies, whatever), where whatever word OP is looking for is out there also. I just can’t think of it at the moment.
In machine learning, there is the concept of overfitting, which essentially means that your model twists itself in knots trying to account for Every. Single. Data. Point. (some of which may have been erroneous in the first place) and ends up with no useful predictive ability. A toy sketch of this is below.
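As a quick illustration, here is a minimal sketch with NumPy, not anyone’s specific model: all the data, noise levels, and polynomial degrees are invented for the example. We fit the same noisy points with a straight line and with a high-degree polynomial, then check both against points the fit never saw.

```python
# Minimal overfitting sketch (all numbers invented for illustration).
import numpy as np

rng = np.random.default_rng(0)

# Underlying truth: a simple straight line, plus measurement noise.
x_train = np.linspace(0, 1, 12)
y_train = 2.0 * x_train + 1.0 + rng.normal(0, 0.1, x_train.size)

x_test = np.linspace(0.05, 0.95, 50)   # held-out points
y_test = 2.0 * x_test + 1.0            # noise-free truth

for degree in (1, 10):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {train_err:.4f}, test MSE {test_err:.4f}")

# Typical outcome: the degree-10 fit threads through every noisy training
# point (tiny training error) but oscillates wildly between them, so its
# error on new points is far worse than the humble straight line's.
```

The high-degree fit is more “internally consistent” with the observations it was built from, and worse at everything else, which is the warning sign OP is asking about.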
What you describe above is faulty generalization; i.e., because some features of a phenomenon appear to fit a certain mechanism, that mechanism is assumed to apply to all aspects of the phenomenon.
In the case of planetary orbits, the Copernican system was sort of true, insofar as planets follow regular orbits tracing a fixed path about the Sun. Copernicus, knowing nothing of gravity, still needed circles and epicycles (and Kepler himself first tried to link the planetary orbits to nested Platonic solids) rather than geodesic paths dictated by gravitational potentials and energy balance, but he was actually on the right path in assuming that the motion of celestial objects followed some universally applicable regular principle. It took Tycho Brahe’s precise measurements and Johannes Kepler’s curve fitting to determine the trajectory shapes followed by all objects in orbit, and it was not until Newton that a fundamental force was identified as acting to control the shape of the ellipse. (And if we take it further, it was Einstein who determined that the effect of gravity is to shape the underlying plenum of space-time. Doubtless, someone will eventually come along and give us an even more fundamental revelation, one that will hopefully marry quantum mechanics and gravitation.)
The Ptolemaic system of deferents and epicycles, on the other hand, was completely wrong. Not only did it not accurately describe the motion of the planets, it provided a totally wrong and misleading mechanism for why Mercury, Venus, and Mars trace retrograde arcs in their trajectories. Nor was it really based upon any universal principle; epicycles were piled upon epicycles every time more precise measurements broke the previous arrangement. The Ptolemaic system of planetary motion was completely arbitrary and without consistency or logic. It is an example of overfitting, or what we might call excessive curve fitting: as the sketch below suggests, a sum of enough circles can be made to fit any orbit whatsoever.
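Mathematically, a deferent carrying epicycles is equivalent to a truncated Fourier series of rotating circles, which is why adding circles can always improve the fit no matter what the true orbit is; the model can’t be falsified by fit alone. A toy sketch of that idea, where the “observed” orbit is an invented lumpy closed curve, not any real planet’s:

```python
# Epicycles as Fourier terms: enough rotating circles fit any closed path.
import numpy as np

t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
# An invented non-circular orbit, encoded as complex positions x + iy.
orbit = (1 + 0.3 * np.abs(np.cos(t))) * np.exp(1j * t)

spectrum = np.fft.fft(orbit)   # each Fourier coefficient = one rotating circle

for n_epicycles in (1, 2, 4, 8, 16):
    # Keep only the n strongest circles, zero out the rest.
    idx = np.argsort(np.abs(spectrum))[::-1][:n_epicycles]
    kept = np.zeros_like(spectrum)
    kept[idx] = spectrum[idx]
    approx = np.fft.ifft(kept)
    err = np.max(np.abs(approx - orbit))
    print(f"{n_epicycles:2d} epicycle(s): max position error {err:.3f}")

# The fit improves with every added circle, whatever the true orbit is,
# which is exactly why piling on epicycles never falsified the model.
```

That unlimited flexibility is the point: a scheme that can accommodate any observation by adding another circle explains nothing.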
As for conspiracy theories, they rarely actually have any internal logic. Most are pieced together from supposed factual discrepancies and hyperbole that don’t really show the supposed conspiracy to be true so much as attempt to cast doubt on the official explanation. Actual conspiracies do exist on occasion, and can generally be identified by their frequent incompetence or lack of the necessary secrecy, which leads them to fall apart and be discovered fairly quickly, e.g. the Katyn Massacre or the claims of weapons of mass destruction in Iraq used to justify the unilateral invasion by the United States in 2003.
That expression has a number of meanings, “fallacious” (i.e., loosely used) or not; however, against the long list of logical fallacies [Stanford] raised in OP (of which overfitting is a card-carrying member), it wouldn’t work.
Scientific sense
A case may appear at first sight to be an exception to the rule. However, when the situation is examined more closely, it is observed that the rule does not apply to this case, and thus the rule is shown to be valid after all.
[An] example is of a critic, Jones, who never writes a favourable review. So it is surprising when he writes a favourable review of a novel by an unknown author. Then it is discovered that the novel is his own, written under a pseudonym. Obviously the rule doesn’t apply to this case (although the rule may need to be more precisely stated in future) and the previous evaluation of Jones’s ill-nature toward others is re-affirmed.