Well, I’ve seen a lot of posts about America’s foreign policy; however, I think our foreign policy has improved, or has at least tried to move in a better direction, since the end of the Cold War. I’ve been thinking about this for a while, and an article I read brought it to the forefront of my mind.
http://atimes.com/ind-pak/CJ20Df01.html
Now I know there will be the usual cries of bias, but I want to point out that CNN picks and chooses what to report; Time Magazine posted one of the most hateful, vitriolic rants I’ve ever seen a couple of days after the attacks; and in September of 2000 the New York Times ran, on its front page, a picture of an Israeli soldier holding a gun over a bloodied man in an article about the beginnings of the conflict there. It just so happened that the bloodied man was an Israeli whom the soldier was actually protecting, and the retraction ran somewhere in the middle of the paper. (No, I’m not going to provide cites, as this isn’t what my thread is about.) The point is, I just want to keep this thread from being hijacked over the issue of bias, unless it fits the context, since you cannot discuss anything without context.
So what I would like to discuss here is Cold War foreign policy and how it may be a major factor in what’s going on, as opposed to blaming ALL US foreign policy. While it may not matter to an Arab terrorist, the beginnings of the Cold War were not a shining time for Americans either. We seem to paint the 50s and 60s with nostalgia, but they brought out McCarthyism, Kent State, hippies spitting on returning soldiers, and many other ugly incidents here at home. So could all this Cold War policy, which we were kept mostly in the dark about (well, I wasn’t born yet), be more to blame than American foreign policy as a whole?
Erek