As I learned about the conflicts and struggles the Christian religion caused in the Western world, I began to lose faith in it myself. I had never really known about the atrocities carried out by fanatical leaders, from the Crusades to the religious wars that followed the Reformation. Throughout history, a “holy” religion that advocated a “good” life repeatedly produced conflict and the suppression of other people. Religion caused people to be imprisoned, silenced, or killed. It also suppressed discoveries that contradicted aspects of Christian belief, discoveries we take for granted today without realizing how sharply they once clashed with Christian doctrine. This was evident in the Scientific Revolution, when natural philosophers such as Galileo were silenced to prevent the spread of “heresy.”
I would feel more comfortable with the deist thought of the Enlightenment, but I remain quite skeptical even of that.
So my question is: if everyone, even devout Christians, knew what religion has caused throughout history, would they still stick to their oh-so-spiritual faith?