I’m still waiting for a definition of the term on which we can all agree so we can debate that. Otherwise we’re talking at cross purposes.
That said, I will make reference to Thomas Gilovich’s book How We Know What Isn’t So, and assert that there are aspects of human cognition that give rise to what might be called “reasoning artifacts.” In other words, the way we interpret and selectively process the available information about the world can cause us to arrive at perfectly reasonable conclusions that are just plain wrong.
The most common example, I think, is the old coin-flipping exercise. Flip a quarter ten times. The first nine times, it happens to come up heads. How does this affect the odds of the tenth flip?
If you know anything about probability, you know the preceding flips don’t affect subsequent flips in any way. The odds on any given flip will always be fifty-fifty (discounting hugely unlikely events like landing on edge, being snatched out of the air by a bluejay, quantum tunneling to the other side of Rigel, etc.).
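If intuition balks at that, here’s a minimal sketch of a check you can run yourself (Python; the trial count and variable names are just illustrative): simulate a pile of ten-flip runs, keep only the ones that start with nine heads, and count how often the tenth flip comes up heads in those runs.

```python
import random

# Simulate many runs of ten fair coin flips, keep only the runs whose
# first nine flips are all heads, and see how often the tenth flip is
# also heads in that subset.
TRIALS = 2_000_000  # arbitrary; large enough that nine-heads runs show up

nine_heads_runs = 0
tenth_flip_heads = 0

for _ in range(TRIALS):
    flips = [random.random() < 0.5 for _ in range(10)]  # True = heads
    if all(flips[:9]):          # first nine flips all came up heads...
        nine_heads_runs += 1
        if flips[9]:            # ...and so did the tenth
            tenth_flip_heads += 1

if nine_heads_runs:
    print(f"runs starting with nine heads: {nine_heads_runs}")
    print(f"fraction where flip #10 was also heads: "
          f"{tenth_flip_heads / nine_heads_runs:.3f}")  # hovers around 0.5
```

Only about one run in 512 starts with nine straight heads, so out of two million simulated runs you get a few thousand survivors, and the fraction of those whose tenth flip is heads lands right around 0.5, exactly as the math says.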
And yet common sense will nag at you, trying to convince you that the last flip is somehow more likely to come up tails (the classic gambler’s fallacy). This is because the human brain, by default, groups events and seeks patterns. Most of the time, this is a very useful skill. (“Hmmm, everybody who walked into that cave was thrown back out after a minute or two covered in bloody claw marks. Guess I’m not going in there.”) Sometimes, though, this habitual approach is inappropriate and leads us to draw faulty conclusions, as in the coin-flipping exercise.
In learning critical analysis skills, one must learn the limitations of those cognitive defaults and develop the ability to recognize where they aren’t sufficient to untangle the nature of reality. I refer again to Gilovich’s book as a good starting point.
That said, I think the OP overstates the issue considerably. The human mind is a powerful device, and one of the most amazing things about it is that it can be trained to recognize its own weaknesses. “Common sense,” if you want to call it that, can be very useful, up to a point. The trick is to neither put too much faith in it nor dismiss it entirely.