Hi All,
Let me start by saying that I’m putting this up not to have a shot at the USA, but to get some opinions from the locals on whether this perception is right or not.
A woman who works for me just got home from a 5-week USA holiday with her husband and 3 young daughters. She did LA, Las Vegas, New York, Washington and Hawaii.
While she really enjoyed the holiday, she had a few comments about the USA in general that made her very happy to be home.
- She said the food was horrible (her words). She likes to eat healthy but found it very difficult to get what she considered healthy food. There were hamburger joints on every 2nd corner and a different fast food joint on the others. Getting something simple like fish or a salad meant going to an expensive restaurant.
Her opinion was that almost everything was greasy and the orange cheese was just damn weird.
- There were homeless people everywhere in public parks and on beaches, having built temporary shelters out of trolleys and cardboard, begging for money.
- The roads all seemed to be full of holes and were as bumpy as fuck.
Overall, her opinion, as someone who had never been to the US before but had grown up watching US TV programs and movies, was that the USA seemed to be a place that had peaked several years ago and was now in decline. She was happy to be home.
Granted, most people in the USA have no real knowledge of Australia so can’t compare, and her only prior knowledge of the USA was TV and movies.
So my questions to the good citizens of the US are:
Are her observations fair?
Would you agree or disagree?
Did she just cop the narrow view from doing the tourist beat or is what she saw consistent with the rest of the country?
I’m intending to visit sometime in the not-so-distant future, so I’m interested to see how the inside view compares with the one from outside.