Is it just me or are our engineering college curricula still stuck in the '50s? Like so much of the rest of our education system.
I mean, they teach you a LITTLE bit of certain hard skills, but never in much depth, particularly coding and SolidWorks/CAD.
Other than that, it’s math up to calculus 3, even though, from what I’ve been told, very few engineers actually do calculus on a regular basis.
The rest is filled in with humanities electives (bullshit) and crappy attempts to teach students the "soft" skills (the ones that are hard to even articulate or explain).
I've always been amazing at STEM, but I kinda suck at school - specifically at doing homework on my own time. And please don't tell me that's an important skill: I, like most other sane people, have no plans to do even more unpaid work at home once I'm employed, violating labor laws in the process.
I have a bajillion ideas but I have no freaking clue how basic elements of stuff are actually put together. Such practical knowledge seems to be absent in most engineering curricula.
From what I can glean, learning machining (those curricula include CAD and technical drawing) would be a better start toward acquiring real maker/inventor skills.
Colleges are not suited to teaching fast-moving skills and employment training. Even in this era of degree-as-job-ticket, a general education and training for specific jobs using specific, current-generation software and standards are not all that congruent.
A college might teach the generalities of engineering CAD using a standard tool like AutoCAD, but that won’t help much if Pratt & Whitney needs someone with a specific area of expertise using Solidworks.
I’m not sure what the solution is. I’d really hate to see degrees turned further into trade-school certificates, but the underlying education is essential to make professional use of the tools. Possibly there should be a separate track wherein students on the verge of a degree can take a far more trade-oriented path through the very newest tools most specific to their intended career.
ETA: If I didn’t say it clearly enough, I don’t believe teaching mastery of tools makes one a master - machinist, programmer, physician, anything.
Again, Carter did not study nuclear physics at Annapolis. He studied nuclear engineering. Nuclear physics doesn’t even exist as a major at Annapolis (or at most undergraduate schools). It’s possible that someone could major in physics, serve the five years they have to in order to fulfill the required time in the Navy, and then go to grad school to get a Ph.D. in nuclear physics, but no one could become a nuclear physicist with a bachelor’s degree from Annapolis. The Navy doesn’t even have that much need for nuclear physicists. They want nuclear engineers for their nuclear submarines. That’s what Carter mostly did in the Navy - work in a submarine:
The problem is that most employers don’t train anymore. A few decades ago, you could go get a BA in Medieval French Literature and then take the exam to qualify for admission into an IBM Financial Analyst training program. IBM would train you as needed and you had a good career. Now, you need a degree in Finance to get anywhere near the door. And if you do get a degree in Finance but find that there are more jobs in Logistics, you have to go back to school for a degree in Logistics.
I know a guy who got a BA in Elementary Education in the 1970's. He never taught in a public school. He just needed a bachelor's-degree-level education to qualify to train for the job he really wanted. He learned how to think; then he learned how to do a job.
There’s an initiative called Skills-Based Hiring that supposedly would help reform the system by providing a robust way to let people self-train for various careers and then get their skills independently assessed. Then you can go to an employer and say that you have a BS in Marine Biology, meaning that you know how to think, and passing exam scores in Object Oriented Development, Java, and Organizational Teamwork, so you have what it takes to survive in a software development office. But that idea seems to be dead in the water, except for a small number of low-skill jobs for which the skill assessments are little more than basic literacy tests.
You're saying that degrees and job skills are related but not the same thing, yet that a degree is necessary anyway to really master the skills. How divergent can the degree be from the skill? E.g., a degree in Mechanical Engineering plus a job training certificate in AutoCAD: good? How about a degree in Physics plus a job training certificate in AutoCAD? Close enough? How about an MD plus a job training certificate in AutoCAD? An MS in Psychology? A BS in Dietetics? Clearly the line isn't sharp, so there's really no point in asking you to pin it down exactly. But how do you find that line?
What is the math there? Dr. Einstein looks a little … can't think of the word … looking at what he just wrote. Sort of like "there you go again. Not sure if I should be mad at you or not. Now I gotta clean this up."
While most physics was not done on computers back then, most of the stuff that was - on the IAS machine at least - was physics.
MIT had lots of IBM 1130s scattered about for non-VI-3 (CS) majors to use. I'm not sure about physics, but a lot of engineering majors took Fortran. We got to use real computers.
Think of the computer as providing, to some extent, confirmation. With a slide rule and some hand-waving, you could get 2 or 3 digits of accuracy. With computers you could be more precise, provided you were careful not to let GIGO take over.
(In my Numerical Analysis course, the prof mentioned a result: "Here's the solution from the computer, Ax^2 + Bx + C, where A = 0.2305 +/- 0.3155. What does this tell you? Basically, that you don't even know whether the parabola opens upward or downward. Don't let overwhelming precision make you ignore the meaninglessness of the answer.")
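For what it's worth, the prof's point is easy to reproduce with modern tools. Here's a minimal sketch (Python/numpy, made-up data - not anything from that course) that fits a quadratic and compares the leading coefficient to its own uncertainty:

```python
# Minimal sketch of the "precision without meaning" point (hypothetical data).
# Fit y = A*x^2 + B*x + C and check whether the uncertainty in A swamps A itself.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 12)
y = 3.0 + 0.5 * x + rng.normal(scale=4.0, size=x.size)  # truly linear data plus noise

# Least-squares fit, with the covariance matrix of the coefficients
coeffs, cov = np.polyfit(x, y, deg=2, cov=True)
A, B, C = coeffs
sigma_A = np.sqrt(cov[0, 0])

print(f"A = {A:.4f} +/- {sigma_A:.4f}")
# If |A| < sigma_A, the data can't even tell you the sign of A, i.e. whether
# the fitted parabola opens upward or downward - the printed digits are
# precision without meaning (GIGO).
```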
As technology became more sophisticated, I think more in-depth calculations became necessary. Particle accelerators kludged together in a lab could be built with seat-of-the-pants and slide-rule calculations. Ditto for reactors assembled from bricks and operated manually. Giant projects like modern accelerators, or critical work like the Manhattan Project, called for more predictability and precision. They couldn't spend hundreds of thousands fabricating parts for a device only to find that the first-approximation guess was way off.
As for engineering - somewhere around the early 90’s, CAD started to dominate engineering practice. There were no more draftsmen where I worked. The non-engineers had to have the expertise to be technical designers, doing their work directly on CAD to be approved by the engineer; any engineer who wasn’t an old fossil found it easier to do their own CAD for advanced design rather than doing iterative markup sessions with the designer. (Just as by 2000, most business types found it easier to type their own documents instead of dictating to a typist).
Also around the early 90's came the discussion about the need for engineers to understand their automated tools, and to do back-of-the-envelope calculations to make sure they hadn't accidentally under-designed something by taking a piece of software's word for it.
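To make that concrete, here's a hedged illustration of the kind of back-of-the-envelope check meant here (my own sketch, with assumed values, not anything specific from that era): sanity-checking a software deflection result against the textbook formula for a cantilever with an end load.

```python
# Hand check of cantilever tip deflection: delta = F * L**3 / (3 * E * I),
# with I = b * h**3 / 12 for a rectangular cross-section. All values assumed.

F = 1_000.0          # end load, N
L = 1.0              # beam length, m
E = 200e9            # Young's modulus of steel, Pa
b, h = 0.05, 0.05    # rectangular cross-section dimensions, m
I = b * h**3 / 12    # second moment of area, m^4

delta = F * L**3 / (3 * E * I)
print(f"hand-calculated tip deflection ~ {delta * 1000:.2f} mm")
# If the CAD/FEA package reports something wildly different from this,
# the model (units, constraints, material) deserves a second look.
```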
But is this kind of stuff done at the undergrad level? I know that for me, once we got past some of the freshman/sophomore-level stuff, my physics classes did very little in the way of lab/experiment-type work, and that persisted even through two semesters of grad school. It was basically all math manipulation, not data analysis - I recall spending most of my time just getting the complex integrals into a form I could look up in the Rubber Bible. I just don't recall dealing with much in the way of real numbers that I would think to use coding on.
Now, I could see using undergrad time to teach coding with the understanding that when you branch back out to heavier experimental stuff you’ll need it, but I’m not seeing the huge use while you’re still learning the concepts.
Then again, I recall that when I started my grad school stint, I was a little…directionless…and inquired about Optics, and was told that the Physics department only handled theoretical optics; experimental optics was in the Engineering department. So it may just be a matter of how the physics curriculum is defined as well.
CAD is a tool to visualize and refine physical engineering projects. As a tool - however intelligent and interactive things like AutoCAD, with finite-element stress analysis and the like built in, might have gotten - it can only do what it's told. The user has to have the intelligence, training, expertise and experience to use the tool to effective ends.
A degree is about that training, expertise and (one would hope) experience and intelligence. It is not, and should not be, about how to use the tool. For one thing, different fields, companies and projects might use different tools, so a graduate program in AutoCAD might be completely useless at a company or division that uses other tools to apply “intelligence” to its projects.
Learning to use tools, generically, as part of a degree process is one thing, but extensive training in, say, AutoCAD R14 with a specific engineering module may well be worse than wasted time.
I’m not sure what the solution is, since as others have pointed out, employers tend less and less to hire “raw material” and train them than expect to hire someone who exactly meets 20-30 bullet points of experience and demonstrated ability. Jobs once general or at least open to a general field of applicants have become staggeringly narrow fits.
This is true even in my field, which is software-tool intensive but not STEM. I have a truly vast well of experience and abilities (as I should, after more than 30 years); I am automatically screened from about 9 out of 10 job applications because I don’t have EXACTLY the experience they’re looking for.
Tell me about it. It's a problem in the software development world. Everyone wants to hire people with 5 years of experience in Java, 3 in Oracle, and 1 in Rational Rose. Degrees are becoming training programs for specific languages. So instead of going to learn how to write software, you do a degree program in Java, and that means you have supposedly learned how to write software in Java. Want to switch to C#? Good luck! Nobody will hire you, they don't train, and why are you applying for C# jobs anyway? You're a Java developer! Go apply for Java jobs!
You might think that hey, what we need are job training certificates, so you could e.g. get a certificate in X language or Y technology. So John has a bachelor's degree in Computer Science and certificates showing that he knows C, C++, C#, Java, and Scheme, and can use the Jira bug tracking system. Well, those kinds of credentials exist, and are colloquially called "certs". Many of the most hardcore and theoretical folks in the field consider them trash, basically, because you can get them just by passing a multiple-choice exam that doesn't actually demonstrate any real skill beyond memorizing a study guide. Many people in the field regularly bump into people who are heavy on certs and low on sense. Oh, you're an MCSE? But you're as dumb as a sack of rocks and can't administer crap! How the hell did they ever think to give this to you? Oh, you just memorized the study guide and barfed up the right answers on cue.
So maybe there ought to be a middle ground. But there largely isn’t.
As another data point, I did a physics degree at a major British university in the latter half of the Eighties.
Exposure to coding at secondary school was a specialist affair. I think I was perhaps one of three out of a hundred who did anything formal in my year (all in BASIC). Okay, that was surely skewed towards those of us likely to do a physics degree, but I think I was the only one of the three who actually did do one. There was a sort of broader exposure to the likes of the BBC Micro and the Sinclair machines.
Without being entirely sure, I think a coding course was mandatory for all undergraduate physics students, but it was definitely taught within the physics department rather than by a computer scientist. I did it in 1988 (see below). At this point I'm fairly convinced, seemingly bizarrely, that it was done in Pascal rather than FORTRAN. And I think it involved running programmes on a centralised machine (though punched cards were by then long since the sort of thing old-timers bored on about).
By the time I started my doctorate in theoretical physics in the late Eighties, it was however utterly taken for granted that you could code, if normally in FORTRAN and leaning on NAG routines. But hardly exclusively and with the expectation that you could just pick up any of that sort of stuff. Whether this was based on the expectation that you had some formal exposure as an undergraduate, or that there was a general threshold that had been crossed by this point, I don’t know. All done on a PC.
The most memorable aspect of that 1988 course was the lecturer - a significant particle experimentalist - having some reason to mention Feynman and him adding “I’ve just heard that he’s apparently died.” He’d actually died about a week earlier. By contrast, it’s very difficult to believe that when, say, Hawking or Witten die no physicist on the planet will remain ignorant of the fact for more than 24 hours. It was a slightly different world. (Many of us had actually already heard, but still with a few days lag.)
Good point. I would imagine nowadays there are programs that do these things automatically. (Heck, my scientific calculator in 1977 did standard deviation.) Even back in the 70's, there were libraries of code (usually FORTRAN) to do line fitting, standard deviation, and other data analysis. Nowadays there are even things like Mathcad to do equation analysis. The need to code might only arise if you were designing your own experiment - something I'd typically associate with graduate level, or a special project around 4th year.
However, even back in the 80's students were encouraged to take a scientific computation class or something similar (just as statistics was more or less a required class), to understand the process and limitations of computer analysis of numbers. They might not actually write their own analysis routines, but they needed to know what those programs actually did and how/why they could sometimes give unreliable answers.
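For a sense of what those canned routines spare you from writing, here's a rough modern analogue (Python/numpy, made-up numbers - the old libraries would have been FORTRAN) of a library line fit and a standard deviation:

```python
# Library-provided line fit and standard deviation on hypothetical measurements.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

slope, intercept = np.polyfit(x, y, deg=1)   # least-squares line fit
sigma = np.std(y, ddof=1)                    # sample standard deviation

print(f"fit: y = {slope:.3f} x + {intercept:.3f}, std dev of y = {sigma:.3f}")
# The library hides the normal equations entirely; the point of a scientific
# computation course was knowing what is hidden (round-off, the leverage of
# outliers, ill-conditioned fits) so you don't trust the output blindly.
```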
I find this quite bizarre. I taught (and advised) math majors right up to 1999 (when I retired), and I can assure you that at no time were math majors required to program.
The thought of programming in the 40s is absurd. There were maybe 3 or 4 of what we now call computers during the war. One, the ENIAC at Penn, arguably the very first general-purpose computer, came on line in 1945 and was programmed (by women, incidentally, since it was thought to be clerical work) by running wires from one point to another. There was the Colossus at Bletchley Park in England. The Mark I was an electro-mechanical computer at Harvard and, AFAIK, that was it. There was not a lot of development in the late 40s either. IBM estimated that the total world demand for computers would be five and declined to get involved.

The Univac I in the early 50s might have been the world's first commercial computer, but these cost millions of 1950 dollars - think tens of millions today. These computers used vacuum tubes (remember vacuum tubes?). The late 50s brought transistor computers, but these were discrete transistors and the machines still cost a fortune; the average college student would have had no access to them. Finally in the 60s integrated circuits came along and, with them, minicomputers. They were usually the size of a washing machine, cost hundreds of thousands of dollars and had early hard drives. Before that, long-term storage was on tape.

My department bought a mini in 1975, just as the first micros were being developed. That was when I wrote my first program (in BASIC), but it was purely recreational. I have never had occasion to use a computer in my research, except for writing papers in TeX, which is a language basically for typesetting mathematics. Now all math journals accept papers only in TeX, but that is not programming.
I meant to add to my previous post that Feynman's "computers" were a room full of women with electromechanical calculators carrying out his computations. I suppose organizing their work could now be called programming, but I doubt it was then.
I'm not sure, and don't feel like looking it up now, but I believe the first corporate users of typewriters (women, of course) were themselves known as "typewriters."
I can verify that it’s perfectly possible and unremarkable for one to even obtain a Ph.D. in mathematics in the current era without any significant programming experience. Of course, many math PhDs do learn to program, the career market being what it is (and also just because of natural interests, or because one happens to work in a particular area of math for which it is particularly useful, or whatever), but at the same time, many math PhDs don’t, and no one bats an eye.