So many employers today require that one have a degree. I can understand why in some cases, but for many others it makes no sense to me at all.
In times gone by, one could start out as, say, an electrician in a factory and work one's way up to management. But today, the same management position requires a college degree, regardless of one's skills. Said electrician may show great managerial potential, but company policy says that in order to hold position X, one must have a degree. What is the point of this?
Even sillier, IMHO, is that many companies don't even care what your degree is in. I used to work in social services. In order to advance to my boss's job, I needed a Master's degree. It was not necessary that the Master's be in counseling, psychology, social work, or anything even related to the job. My boss, in fact, earned her MA in Education.
I think the current system keeps a lot of people in lower-paying jobs who could, if given the chance, do well in more advanced positions. I have beaten out people for jobs who I honestly felt would be better at the job than I, but I won because I have a degree. A stupid criterion, if you ask me.
Any opinions?
"I think it would be a great idea" -- Mohandas Gandhi's answer when asked what he thought of Western civilization