For me, not for a while, but I have kids who have done well. Plus I have hired a bunch of people. My wife is a biologist (now medical writer) so I’m close to that field.
What you call “personality” is a sequence of actions in response to stimuli. It’s well within the same problem domain AI techniques apply to. So hypothetically in this world, at this near end state, there would be people employed in these ‘human touch’ jobs.
There would also still be this army of AI engineers, working 12 hour days to make various human seeming robots. That’s because the market demand for robots that can replace some of these expensive humans doing soft skills would be high.
All the other jobs would be gone. Eventually, some of these AI engineers might make the final insights needed to roll it all up, and then you have a system that can self-design all the Westworld-grade humanoid robots to take all the soft jobs.
Yes, it’s called the Singularity, but it might take 500 years to reach the endpoint. It probably won’t, but this process is happening right now, so…
Middling programmers, sure, but I’ve seen the work of bad programmers and I would rather keep looking. But it is better if universities convince people who seem to be incompetent programmers to switch fields, reducing their supply.
There is a classic study showing a 10X difference in programmer ability - Barry Boehm, IIRC - so hiring good programmers is cheaper than hiring mediocre ones at somewhat lower salaries.
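To make that concrete, here’s a minimal back-of-the-envelope sketch. The salary figures are hypothetical; the 10X output ratio is the Boehm number from above:

```python
# Back-of-the-envelope cost per unit of output, assuming hypothetical
# salaries and the ~10X productivity spread cited above.
good_salary, good_output = 150_000, 10.0        # output in arbitrary units/year
mediocre_salary, mediocre_output = 100_000, 1.0

print(f"good:     ${good_salary / good_output:,.0f} per unit of work")
print(f"mediocre: ${mediocre_salary / mediocre_output:,.0f} per unit of work")
```

Even paying the good programmer 50% more, each unit of work costs a fraction of what it costs from the mediocre one.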
I agree. One daughter has a BS in Psychology, and my research showed almost nothing available. But she got a Masters in International Business and leveraged that into project management.
Just about everyone should be aware of job prospects, but then do what they love. I know someone forced by her father to take business instead of art history. She was gainfully employed her entire career - but hated every minute of it.
My comment comes from maybe outdated experience. My wife was in a Biology PhD program, and a friend got a PhD in Chemistry, and both their departments seemed to think that graduates who worked in industry were sellouts or failures in some way. Computer Science and EE departments, which I was experienced with, had no such bias.
Now I know plenty of PhDs in CS or EE who are not doing anything resembling research, but most people don’t seem to really like doing research anyway - they’re the kind who got assigned projects by their advisers.
I graduated from college with a CS degree almost 45 years ago, and none of the facts and programming languages I learned were at all useful during my last 10 or 15 years of employment. What was useful was the principles behind them that I learned in college and in grad school. Designing a programming language for my dissertation was far more useful than learning a language in college, since I could more easily see the similarities among new languages.
The problem is the CIO who doesn’t want his new employees wasting time learning principles instead of the latest platform. That guarantees the new employee will be obsolete in 10 years - a feature, not a bug for management. That also explains the lack of training.
The problem with self-learning is not money or skill but time. If your employer supports learning, you can take time to get trained at their education center. If not, you need to find the time and energy to do it while working a full day and maybe even having a life. Not to mention that if you didn’t anticipate the need, or anticipated incorrectly, you wind up taking the class at the same time you need to apply the skill. Even if you manage this, your first code is not likely to be very good.
Bell Labs paid tuition for getting the next degree, but people who worked for me and did this had a much harder time than I did in full time grad school. They had to sacrifice not money but a lot of their lives for it, and I really respected that.
Maybe we are seeing the gig economy applied to full time work. Not hire for the job, but hire for the current skill, and then toss.
I second the bolded part. I got an electrical engineering degree over thirty years ago. Lots of what I learned is obsolete. I haven’t coded or soldered in ages. But I’ve done well in jobs where I explain the underlying issues and, assisted by people with expertise at more current technology, analyze and report things to decision makers.
I see a lot of complaining about rampant age discrimination in your field. It worries me quite a bit myself. Basically, it doesn’t matter that you learned the latest languages over the years; employers will just look at your birthdate or appearance and refuse to even see whether you can perform on their employment screening tests.
Of course, my other fear is that maybe they have a point. How has your coding speed been over the last 45 years? Does it go down, and by how much?
I’m not going to bother picking apart this weak analysis, but social science degrees are valuable for forming the theories and observations that help us understand culture and society. They help us understand not just what to do and how to do it, but why we do it and for whom. Machines don’t surface these insights for us, not without people telling them what to look for. Corporations don’t pursue these insights either… you can see that most technical effort is devoted to getting people to click on ads. That’s what happens when you have a culture obsessed with how to do things and make money, without asking why we’re doing this and whether the money’s going to help.
The purpose of college has changed over time. Today, the problem is that if you want to obtain gainful employment that pays enough to survive, your odds are better (but not guaranteed at all…) if you have one of the few really marketable degrees.
Not that social studies degree. Yeah, if you know somebody, you might be able to talk your way into a position at an ad firm like you say…but aren’t you going to have better odds if you majored in marketing directly? Aren’t your odds going to be better still if you did an internship and are a young female with perfect legs?
It’s about probability. Major in Engineering or computer science and your probabilities are more favorable. Unfortunately not as favorable as we’d like, but better than your odds if you majored in “the classics” like a relative of mine…
I have been busy, but I want to go back to these ideas. While “teachers, or sales people for technical firms, or technical writers or business consultants or go to grad school or law school or medical or vet school or business school or they pursue whatever career path they started on as an intern” may not be pursuing a career as a biologist, it doesn’t mean they aren’t using the things they learned getting their biology degree: they may be teaching biology, or selling a biomedical product, or writing about biotech, or going into environmental law, or whatever. Those are all jobs that need/benefit from a biology degree, so I think it’s worth the opportunity cost right there.
I don’t think the lie is that “STEM degrees are a valuable thing and society needs people–a great many people–with strong STEM skills”. I think the lie is “Go to school and get a degree, and after that adulting is straightforward.” Schools don’t help this: teaching is one of the few jobs where you literally get a degree in your job, not in your field, and the job you get just flows very naturally from that. So teachers don’t know anything about how one goes from the set of majors out there to the thousands of actual jobs people do. And for most of those jobs, there are so many paths that you can’t just follow a dependable sequence and get where you want to be. You have to have a set of fairly useful skills, a willingness to work to learn new skills, and the ability to have flexible career goals and visions–to work toward things, but to be able to notice and adapt when the world changes, or opportunities come your way that you didn’t anticipate. You have to be deliberate about your career.
I think a great many otherwise bright and hard-working students are utterly unprepared for this. But it seems inevitable: if labor is going to be on a free-market model (and there are many extraordinarily good reasons to keep it as such), then it’s going to be a dynamic market. We can’t churn out graduates who match exactly what is needed because we don’t know what will be needed. And we can’t map out career paths for people, we can’t say “do this, then that, then that, and you’ll be fine”, because there aren’t a couple dozen job-classes one can train for: it’s more like there’s a million different jobs that all blur into the jobs around them, and a million different paths to and through them. Individual circumstances and soft skills also play a huge role: what people want out of jobs, and what they are willing and able to do, is another chaotic mess, not a clear progression of priorities.
Colleges have these Freshman seminar classes these days. I wish they used those more to focus on what it means to have a job. I especially wish they pushed more kids toward internships, research opportunities, recruitment fairs, etc. These things–and your first job–are where you can learn to be deliberate with your career, and help getting there would improve things. But more than that is the simple understanding that “What do you want to be when you grow up?” is a dumb question, because for most people, their career is a dynamic, fluid thing that they need to deliberately, strategically control. You’ll have to keep up your skills. You’ll have to pursue jobs you aren’t quite qualified for and figure it out when you get there. I think a lot of people in my generation missed that memo and ended up feeling cheated when things didn’t work out for them.
People are not fungible. To be successful in a career, you really need to be willing and able to stay current in it, to be innovative and engaged. That’s not going to happen without some element of interest. Someone who majors in CS but is bored stiff by it may be able to make passable grades, but they aren’t going to get the professor’s attention and be recommended for an internship, or identified as a person to watch in the department, and so get extra attention in their later classes. They won’t get internships, or, if they do, they won’t get much out of them because they will be fulfilling requirements, doing a job–they won’t learn anything they weren’t taught and they won’t make an impression on anyone. In their first entry-level job, they will spin their wheels and watch while the people around them get promoted.
Put that same person in a field they are interested in, and they can be the rock star, the darling of the department, the first recommended for internships and grad-school connections, the recipient of good advice and mentorship. This is true even if the passion is social studies. The darlings of the history department where I studied–at a regional state school–all did fine. They went to law school or grad school and they were directed into good programs. Some are professors and teachers; others are lawyers. I don’t think any of them would have been as successful as a really bored, mediocre computer scientist.
Now, the person that just hates school, that picks history or psychology or something because it’s harder to outright fail, who does the bare minimum to get through their required classes and otherwise channels their energy into an unrelated job or their social life–that person has a useless degree. But they would have been useless with a STEM degree, too, if they’d even finished one. The problem there isn’t the degree, it’s the failure to deliberately and meaningfully pursue a career, and the delusion that “getting a degree” was a sufficient substitute for actually working toward a future.
But if you want to talk about odds, your best odds are always to be a rock star. Do the thing where you can be a rock star and do it as hard as you can, and you’ll be okay.
I think you are looking at the situation from a vocational perspective and not a career perspective. I don’t think that there is anything wrong with that, but if we’re tracking back to the OP, I think this type of thinking may feed into the perception of STEM as being oversold.
What is the difference between a vocation and a career, and why does it matter? First and foremost it matters in terms of the cost-benefit analysis. A vocation is a lifestyle, a mission, a calling, a passion. A vocation brings non-monetary benefits that cannot be easily quantified. A vocation is something someone wants to do even if the pay is minimal or even if the compensation will never come in monetary form. Artists, priests, and activists are often people pursuing a vocation. For a few, science is a vocation, and historically, many scientific breakthroughs were achieved by people who had absolutely no dependence on income produced from their work.
What this all comes down to is how the value of the education is assessed. Whether someone enjoys it or not, time, in significant amounts, must be sacrificed on top of the initial education in order to maintain employment in your above scenario. What this also means is that a welder’s salary, in order to be compared to a programmer’s salary, should take these additional hours into consideration. If you want an example: if a non-exempt employee makes $20/hr. but works 65 hours a week, then the yearly salary is $80,600. So, let’s say a programmer working approximately 50 hours a week is making a salary of $90k per year, but is also spending $5k on training, averaging 10 hours a week extra on keeping abreast of new technology, and 10 hours a week networking - they are, in a sense, not really being compensated better than a $20/hr. clockpuncher. And the latter may not have any student loan debt and also may not have had to spend years out of the workforce for the training needed to obtain their position.
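If you want to check that claim, here’s a minimal sketch of the arithmetic, using the figures above and assuming time-and-a-half overtime after 40 hours for the non-exempt worker:

```python
# Effective hourly compensation, using the figures in the paragraph above.
# Assumption: the non-exempt worker gets time-and-a-half after 40 hours.
base_pay = 40 * 20                       # $800/week straight time
overtime_pay = 25 * 20 * 1.5             # $750/week for the extra 25 hours
clockpuncher_yearly = (base_pay + overtime_pay) * 52   # $80,600
clockpuncher_rate = clockpuncher_yearly / (65 * 52)    # ~$23.85/hr

# Salaried programmer: $90k minus $5k training; 50 + 10 + 10 = 70 hrs/week.
programmer_rate = (90_000 - 5_000) / (70 * 52)         # ~$23.35/hr

print(f"clockpuncher: ${clockpuncher_rate:.2f}/hr")
print(f"programmer:   ${programmer_rate:.2f}/hr")
```

On those numbers, the programmer actually comes out slightly behind per hour worked, which is the point.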
Now, there are intangibles such as social status etc., but I’m just looking at raw financial compensation in relation to assets invested. The meat of the balance sheet, not any of the frilly stuff investors probably roll their eyes at.
I honestly don’t think there are many jobs at all where you can make $20/hr and have all the 65-hour weeks you want without needing a specialized education or keeping up with new skills. The closest you’d find are in some of the trades, and those have their own serious problems–not least, the fact that they are prone to extended periods of significant unemployment/underemployment and, most critically, a really high risk of having your working years cut short by disability. It’s hard to do physical labor 65 hours a week for 45 years even if nothing bad happens to you. And something bad–a car wreck, a workplace injury, an unanticipated condition like MS or arthritis–often does happen.
There’s a middle ground between vocation and just-a-job: a place where you are interested enough to make the keeping-current and researching opportunities and honing skills seem like a better alternative than spending that same amount of time doing more generic “work”. And I don’t think there’s a magic path that avoids the risky mess of having to create a career path.
A year ago, I contacted the biology department chair at one of the nearby HBCUs and asked if there would be any interest in me coming to talk to his students about careers in environmental science. Since then, I’ve given talks in front of three different audiences. The students are always full of good questions about what options are out there. I suspect most still believe they are going to be physicians one day. But I think if I can convince at least one of them to think out of the MD box, my efforts won’t be in vain.
I didn’t even know my career existed when I was in college. I don’t know if I would have done anything differently if I had known about it, but it would have been nice having a clear waypoint.
Re “rock stars”. I feel weird telling students that that’s what they have to be to score opportunities. For one thing, I don’t think I was a “rock star”, and yet things worked out for me. For another thing (and this probably reflects my previous point), rock stars often don’t see the “rock star” in themselves, so if you tell them that’s what they have to be, they can sometimes panic about not being good enough. So I always try to give concrete examples of “rock star-ish” behavior, like how I spent a semester teaching myself tree identification and then got a job working for a forestry firm, despite my formal training in marine biology. Or the time I cursed myself for not knowing GIS, but then taught myself enough to put together a portfolio of my own maps and list that skill on my resume. Even though I’m under no delusion that I seem cool to a bunch of 20-year-olds, I really try to downplay the notion that you have to be a nerdy genius to do well in STEM. I think the notion of “rock star” connotes “extraordinary in an objective sense”, when really you just have to attract more positive attention than the people around you. For good or bad, this doesn’t always hinge on how much passion you have or how smart you are. It often comes down to how you present yourself.
Also, I don’t think it’s really true that the average or slightly above-average STEM major might as well throw in the towel when it comes to finding work in their field. I work with a lot of scientists and engineers who make comfortable middle- and upper-middle-class salaries, but who I wouldn’t say are “rock stars”. And like I said, I don’t think I’m one either. We got our jobs like most people get jobs: through a mixture of credentials, qualifications, demonstrated competence and dependability, and an invaluable network. Being a rock star gives you an advantage in forming a premium network, but networks can help even the mediocre get a foot in the door. I have reached out to students with information about job opportunities not because they are the best students, but because they took the time to email me and tell me about themselves. I hired an intern to work with me one summer not because she was a wunderkind, but because as a volunteer she had demonstrated dependability, and I was way too lazy to screen through a bunch of applications. Students need to know that being socially adept can go a long way even if they don’t think they are a rock star. This goes for both STEM and non-STEM.
That used to be the case, but no more. Take a job at McDonald’s. A couple decades ago it was seen as a sort of training ground for the adult world. High schoolers would take a part-time job at minimum wage where they learned the importance of showing up on time ready to work, doing things that aren’t as much fun as partying or playing with your Xbox, and being compensated for that by making a little money you could spend on what you liked. At one point, something like 25% of the American work force had (literally) worked at McDonald’s at some point in their school years.
Next time you’re at MickeyD’s take a look – take a real look – at the people behind the counter and in the kitchen; they aren’t teens any more. Also notice that, far from being lazy, they’re working their asses off. They’re not there because they want to be; they’re there because it is the only job they can find. Some of this might be due to poor life choices, but with 20% of them having attended college, certainly not all.
(I added some bolding)
See the conflict here? My point is that if you want the best chances to survive and not get stuck at McDonald’s, your best bet is to pick, from the short list of degrees that actually pay off, one you can tolerate and do reasonably well in. Frankly, the way the market actually works is that if you can be an average computer programmer or engineer, or a good-but-not-rock-star history major, you’re going to make more money and have a more secure living doing the first thing.
Not everyone can be a rock star. Think about it. Because recordings of the best performers are everywhere, only those blessed with the right look, right sound, right vocals, right band mates, at the right time, who are the right age…and who also work their asses off…get to be rock stars. It’s still 90% luck, though.
And if you go to grad school after college, that hardly counts as ‘succeeding’ *because* you majored in history, in my opinion. It’s the grad school degree that made the success (and the state bar license or medical license, etc.) possible. And if you’re really planning on grad school, there are probably majors in which it’s easier to maintain a high GPA than history.
Like which ones?
Because a $120k or more MBA salary is better?
I graduated with a degree in engineering, and like most of my classmates, didn’t really practice in my field very long. Most of the engineers I know went into fields like investment banking or consulting firms like Accenture and Deloitte.
Which is not to say having an engineering degree is useless. Sure, most of the stuff about building bridges and skyscrapers is not particularly useful. But I also learned programming, project management, advanced mathematics, and some basic econ and accounting, not to mention all the various electives and other courses I completed to make me a well-rounded and educated person.
“Rock Star” is hyperbolic, monstro, but I think we agree on the fundamentals: it’s the engaged, informed students in any department that get the opportunities to go on and build a career in a field. I’m pointing out that it’s difficult if not impossible to be the engaged, informed student if you have no inherent interest in what you are studying.
Professional “rock stars” are a lot more common than actual rock stars. Your own example is misleading: you talk about being an “average” programmer–well, “average” is a high bar when you go into a field you have no interest in. If I’d gone into programming, I’d be in the bottom 25%, not because I’m not perfectly bright, but because I’d have been bored stiff and resentful every day. Instead, I got an English degree, which I am interested in, and became a teacher, which I am very interested in, and now I’m one of the most highly paid teachers in my state in one of the best schools in the nation. Luck played a role, but all the luck in the world wouldn’t have made me a successful programmer.
The choice is rarely “average programmer” vs “good but not rock star history major”. If that is a person’s choice, then maybe CS makes sense. But for lots of people it’s “rock star visual arts” vs “terrible STEM major”, or “rock star aerospace engineer” vs “average EE”, or “average psych major” vs “flunk out with debt and no degree CS major”. The best choice for each person is highly dependent on context, and your blanket assessment makes no sense.
Those 20% at McDonald’s with “some college”? Why are you assuming they are frustrated history majors instead of frustrated STEM majors who dropped out because they hated every moment of school?
It depends on what you want to go to grad school in. The easiest degree to maintain a high GPA in is going to be the one you feel is worth studying. And, again, if you are an engaged and informed student, you get the internships and the rec letters and the mentorship that help you segue into a good grad program. My university lacked a grad program in history. When they were selecting speakers for lecture series, they would literally invite professors to come talk based on what the rock star undergraduates wanted to study in grad school. An important function of lecture series was to matchmake between the potential grad students and the potential grad student advisors. And it worked.
I’m not sure what your point is with this. A $120k or more salary is better, yes; I think getting an MBA from the right school has much higher odds of a good financial payoff than anything purely STEM, and better than pretty much anything blue collar. But then we are talking about something other than just a STEM degree, unless you are considering business to be part of STEM. One of the points of the OP is that a lot of actual STEM work is not being done by American citizens who have STEM degrees, and that people with those degrees very often do something else - which is what you are also saying, so it’s a little weird that you seem to be arguing with me.