My basic premise is this: you constantly hear about how young people are influenced by the media’s “standards” of beauty or masculinity, how the media scares people into thinking there’s far more crime than there actually is, how they distort news stories, and how they’ve slanted toward “infotainment.” Yet with all of this going on, we still haven’t dismissed the media outright. I mean, hell, they practically fed the Donald Trump train directly and gave him free coverage at all hours of the day, and they’re still given credence. The prevailing attitude isn’t “if they get it right and do good work, that’s nice, but don’t expect it.” I find it really disheartening that people still listen to garbage journalism and People magazine when they haven’t had credibility for a long, long time.
Is there an argument to be made for the mainstream media? Do they do more good than harm?
(Sorry if this didn’t come out clearly - it was written very much stream of consciousness.)