I have never been in a union and I am ambivalent about them for the most part. However, I feel that many Americans view unions very differently than they did even a generation ago and I wonder why.
One can make a legitimate case that in these times unions are no longer necessary, or at least not in the many industries and places where workers already have leverage over their employers. One can point out that stronger federal and state laws now adequately protect most workers in most industries. But I don’t see people merely voicing the opinion that unions are outdated and unnecessary; they say that unions are actually bad.
In the past, even when unions were viewed more favorably, there were concerns about corruption, but those complaints were different: the charges were leveled at the top of the union hierarchy, whereas today it is the productivity of rank-and-file members that comes under fire.
Unions were once seen as places where workers had someone to look out for them. Nowadays the narrative has changed, and far more Americans seem to view unionized workers themselves as the problem.
They are considered pampered, lazy, and shielded from discipline for poor job performance. They will strike at the drop of a hat. They should just be happy to have a job in our lousy economy.
And it’s not just the high-profile union-busting techniques employed by the Scott Walkers of the world: go to any online story about a workplace voting in a union, and the comments will be full of complaints about how bad the employees will become now that they are unionized.
Which isn’t to take away from the union busting that elected officials have been involved in: not long ago, teachers were viewed as near saints who spent far more years in school than the average American while making considerably less money than most. Bust a couple of teachers’ unions and suddenly teachers are overpaid and we are stuck with bad teachers protected by tenure. How did we reach the point where teachers are the problem in this country?
In the past, organized labor was a populist movement that inspired solidarity across gender and race. Folk songs canonized fictional and semi-fictional accounts of unions forming and being busted. Today, giant corporations such as Wal-Mart notoriously quash any attempt by employees to organize, and there is hardly a whimper from the public.
When did the unionized employee become such a pariah in this country? Are unions really a place for lazy workers who don’t want to be accountable for performance rather than a place where workers have a fair voice? Can unions get back to being the good guys again? If so, how?