Hehehe… don’t get testy.
I live in Los Angeles, and I have my entire life. I actually spent the first 20 years of my life dead in the heart of Hollywood. I LOVED it. I loved living in a city that I knew thousands and even millions of other people would love to live in, or at the very least, visit a whole lot. L.A. has always been famous, influential, and beloved. (Yes, I know there are dissenters to this view, but you can’t deny that it is widely held.)
I know that lots of people may not appreciate L.A. proper, but widen the lens to California in general and it's fantastic. The 6th largest economy in the world, a huge variety of strong, stable industries and businesses, tons of money, spectacular weather, extraordinary geographical variety (beach, mountain, desert, forest, subtropical, dry, wet, cool, hot, freezing, mild) and beauty, cultural diversity, amazing food - there's almost nothing you can't have here. (I assume there must be stuff, but none of it is occurring to me.)
I look at the rest of the country, and while I see some places that obviously have tremendous charm and are very appealing, I see vast swaths of the country that seem, to my LA sensibility, like places you only want to get the hell out of. Ugly towns in ugly locations with bad weather and horrible economies. Why the hell does anyone live there to begin with?
So, if you live somewhere that you think might seem to others like a yucky place to live…why do you choose to? And if your family is there so you have roots you don’t want to leave, why did THEY move there?
(I’m not going to offer any examples of places I’m thinking of cuz I don’t want to specifically offend. And trust me, there are places that suck hard in California, too. And I don’t understand why people pick them, either.)
So, enlighten me. Please.