Around what year did it become abnormal for someone studying physics NOT to do any coding?

I have an undergrad physics student friend who has the impression that coding is just part and parcel of being a physicist. He has just expressed surprise at the idea that Obama is the first president ever to write a computer program (it was a token example for a publicity event, as Google will show), since Carter studied physics.

I am almost certain that programming computers (or even just using them) was by no means obligatory back in the 40’s for physics students. My impression is that programming computers was very much its own separate specialty back then, which of course it still is to some extent, but these days it’s a lot more common for someone to know how to program along with whatever else they do.

Even submitting projects for computers to take care of… I doubt that was obligatory for physicists (or students of the field) back then. We’re talking the years of ENIAC or just a little after, right?

I am a little less certain, but still somewhat sure, that this was true even well into the 80s and maybe the 90s.

I would not be at all surprised, in fact, if it were still true today (computer use is a given, of course, but programming? Not sure).

What say ye who know what ye are talking about (unlike me)?

He’s now asserting that the use of computers was indeed obligatory for nuclear physicists at the time (which is what Carter apparently studied). So maybe he knows something I don’t. Does he?

Use of computers does not equal coding. Coding during Carter’s time was sufficiently arcane/difficult that it had to be done by professionals, though it could be directed by the physicists. Today the coding tools have gotten simple enough that amateurs can write perfectly working programs. Lord save me from debugging them, though.

Yeah, I’m not a physicist, but I’d think the transition might have started in the early 80s, around the time that home microcomputers started to take off. I was able to write BASIC programs on a TRS-80 as a little tyke, so I’d think a physicist would probably be able to hack together some code to help him solve a problem a little too complicated to work out on a calculator.

Yeah, I know, I was trying to keep that distinction clear in my posts but I failed. (Partly because the person on FB I’m talking about isn’t necessarily being clear about the distinction in his own posts, not due to a lack of understanding on his part but due to the casualness of the conversation.)

This is about what I thought.

I do think my friend thinks of the “directing” you mentioned, when it took place in the 40’s, as being more akin to what we think of as “coding” today than actual 40’s “programming” would be (because it’s more high-level and task-oriented rather than nuts-and-bolts oriented).

Was it obligatory for nuclear physicists or their students to direct, or somehow be involved in the direction of, programmers of computers back in the 40s?

I have an undergraduate degree in physics, granted in 1986. I believe that it would have been possible to get a BA in physics at that time at my university without writing any code. In fact, I would guess that the core curriculum for a physics student did not include any programming courses. So, my guess is that in 1986 it would have been in no way abnormal for a physics undergraduate to have written no code.

And, of course, they would eventually have to pick up enough FORTRAN to crank out something or other.

I was an engineering student in the 1970’s and I was required to take a course in BASIC and one in FORTRAN (actually WATFIV). Now, an engineering curriculum is considerably more demanding than one in liberal arts science, but I suspect that, if not required for a BS Physics, it was at least highly recommended.

I know someone who completed a BS in Physics in the late 2000s. He had to take a course in Fortran that mostly targeted the ability to model numerical analysis calculations and whatnot. It required knowledge of variables, basic data types, procedures, and basic iteration. There was no expectation that the students could implement Quicksort or write an enterprise N-tier business application, so the issue of debugging and whatnot wasn’t as important. It was more or less “Here’s an equation. Write a program to calculate the value of such and such over this interval, maintaining at least three significant figures beyond the decimal point at all times.”
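For a rough sense of what that sort of assignment amounts to, here’s a minimal sketch in Python rather than Fortran. The equation, interval, and step count are invented for illustration; the actual course would have used whatever the instructor handed out.

```python
# Sketch of the kind of "tabulate this equation over an interval" exercise
# described above. The function sin(x)/x, the interval, and the step count
# are all made up for illustration.
import math

def f(x):
    # Hypothetical equation to tabulate, with the x = 0 limit handled.
    return 1.0 if x == 0.0 else math.sin(x) / x

def tabulate(a, b, steps):
    """Print f(x) at evenly spaced points on [a, b], three places past the decimal point."""
    h = (b - a) / steps
    for i in range(steps + 1):
        x = a + i * h
        print(f"x = {x:8.3f}   f(x) = {f(x):8.3f}")

if __name__ == "__main__":
    tabulate(0.0, 10.0, 20)
```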

But this doesn’t mean it’s always a good idea. I know someone who bought a house that was owned by a retired electrical engineer who was moving to a retirement home. Everything in that house was custom, labeled with little numbered tags that you could look up in the big binder that he kept in the basement. But the wiring was, well, sort of shoddy. Guy wasn’t the best crimper and stuff would just fall apart suddenly, leaving bare wires.

My father got a doctorate in Physics (granted, not nuclear physics) in the late 1960’s, shortly after I was born and long after Carter would have been studying. I’m quite certain he did no programming as part of his degree. Far as I know he didn’t own a computer until the 1980’s.

I started at a technical college in 1979. At that time, computer programming was generally seen as a separate skill from applied science. You would have been expected to use a computer in scientific research, but not necessarily to have written the program you were using.

I got an undergrad physics degree in 1983, and it required no programming. I did take programming classes in HS, though, so I was familiar with it. The last year that I was there, I was a lab assistant helping a PhD student with his research. As far as I know, he didn’t need to do any programming as part of his dissertation, which he would have finished around 1985 or 1986.

How many programmable computers were there in the 1940’s? 10? 20? I doubt most physicists even had access to a programmable computer. I expect most did their calculations the old-fashioned way.

In 1968 I took a Physics course, with a big problem that involved measuring the tracks from a bubble chamber to determine the particles involved. That was hard and most of us got only a few right. One guy wrote a program that would do the work and got them all. It was pretty much then and there I realized I wasn’t going to make it in the world of big-time physics.

Some of the people in my class went on to get a physics degree without coding. None of them became professional physicists, though. I don’t know enough about the others to give a definitive answer.

My guess is that experimental physicists would be expected to code from the early 70s on. Theoretical physicists might have gotten away with handwork for longer. The use of computers to crunch numbers was standard even in the social sciences by the 70s, with anything to do with statistics done by a computer or by Wang calculators with a memory.

I studied Physics in the mid 70s. We had a simple print terminal in our lab that some students used to write programs on the main university computer. I never used it. Most of the usage was for some kind of Star Trek game I never figured out. One or two students had programmable calculators(!), so they did a little assembler-type programming - I still used a slide rule.

By my senior year, calculators were cheap enough that we all had them, but there was still no coding involved except for the most dedicated. I once coded a full app into my calculator for an exam (this was allowed, or not banned anyway), but then the battery slipped somehow - the momentary loss of power meant my program was gone.

I did take a couple of Fortran classes but that was because I could get Math credit for them. Never used it in my Physics classes.

nm

My dad studied physics - got his PhD, became a professor. He once talked about programming a computer where each word of “memory” was written on a magnetic drum, with a row of read heads reading each cell as the drum rotated. Each instruction on the drum included the address of the next instruction; the various instructions took varying numbers of clock cycles, and the trick was to place the instructions so that the program did not spend a lot of time waiting for the drum rotation to bring the next cell around. This was done using a giant sheet of graph paper.
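A toy sketch of the placement problem he described, in Python, assuming (purely for illustration) a drum of 50 word positions, one word passing the heads per cycle, and an instruction that takes t cycles being best followed by one placed t positions further around the drum. The instruction timings in the example are invented.

```python
# Toy model of placing instructions around a magnetic drum so the read head
# arrives at the next instruction just as the current one finishes executing.
# DRUM_WORDS, the timings, and the collision rule are assumptions for the sketch.
DRUM_WORDS = 50

def place_program(cycle_counts, start=0):
    """Return (address, next_address) pairs for a straight-line program.

    cycle_counts: execution time (in word-times) of each instruction, in order.
    If the ideal slot is already taken, slide forward one word at a time,
    which costs extra waiting - just as it would on the graph paper.
    """
    used = set()
    addr = start
    layout = []
    for cycles in cycle_counts:
        used.add(addr)
        nxt = (addr + cycles) % DRUM_WORDS
        while nxt in used:              # slot taken: wait one more word-time
            nxt = (nxt + 1) % DRUM_WORDS
        layout.append((addr, nxt))
        addr = nxt
    return layout

# Example: a short program whose instructions take 3, 5, 2, 7, 4, 3 word-times.
for here, there in place_program([3, 5, 2, 7, 4, 3]):
    print(f"instruction at {here:2d} -> next at {there:2d}")
```

Based on the tech described, this would have been the late 50’s or early 60’s.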

(Richard Feynman describes doing the calculations for the Los Alamos project with punch cards. They were doing calculations on giant matrices, which involved multiplying the value of each row with each column and punching out the result. Someone discovered an error in one entry; he said that, rather than redoing the entire matrix, they could just redo the affected row and column results. To keep track, they would use differently coloured cards until they got caught up with the ongoing rest of the calculations. After designing the process, he went home to sleep. He returned to the lab the next morning to find his trick worked so well that there were a dozen different arrays of coloured cards all in process, all catching up to correct mistakes.)
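The trick amounts to incremental recomputation: in a matrix product, a wrong entry in the left-hand matrix only affects one row of the result (and a wrong entry in the right-hand matrix only one column), so only that slice needs redoing while the rest of the work carries on. A small sketch with made-up numbers:

```python
# Incremental fix-up of a matrix product, in the spirit of the anecdote above.
# In C = A @ B, a wrong entry A[i, k] only affects row i of the product, so
# only that row needs to be recomputed rather than the whole matrix.
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((4, 4))
B = rng.random((4, 4))
C = A @ B                      # the original result, computed with a bad entry

# Someone discovers A[2, 1] was entered wrong and corrects it.
A[2, 1] = 0.125
C[2, :] = A[2, :] @ B          # redo only the affected row (the "coloured cards")

assert np.allclose(C, A @ B)   # the patched result matches a full recomputation
```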

My dad even in the later 80’s would do calculations involving solving 50x50 matrices with floating point (using FORTRAN on a PC). That operation would take two days on his 286. He bought a 486 and ran the program on it as a test, came back in 5 minutes to find the program had stopped. After half an hour of debugging, he realized it had actually finished in 5 minutes, thanks to the floating point coprocessor.

I would suggest the answer depends on how advanced the physics class was. Before about, oh, 1965, access to computers was very limited. Bigger colleges might have a timesharing system, which was so expensive that only higher-up profs and grad students could access it. By the early 70’s, minicomputers were becoming more common, but were still pretty expensive; the university might dole out time on the mainframe for serious classes. Calculators started to become available around 1973-74, so classroom accuracy beyond slide rule level (2 or 3 digits) became meaningful. 1977 would mark the advent of microcomputers like the Apple II, making computations a regular classroom thing, even for high school and undergrad.

I can add the anecdote that my university does not require physics students to have any programming knowledge(*). This isn’t a problem because the students would be daft not to include programming courses in their electives if they didn’t have programming experience through some other avenue. In practice and to good approximation, everyone graduates with programming experience, and many are reasonably skilled (and a small fraction are at the level of a computer science professional).

(*) Excluding things like working with statistical fitting programs or symbolic math tools.

In the 90s, when jobs for physics majors were sparse, a lot of my colleagues in professional programming were physics PhDs and MSs. Since I have been working for myself for the last 16 years or so, I don’t know if they are as prevalent now as before.

I got a B.S. in physics in 1984 from a well-known, prestigious university. The curriculum did not require any coding. I did some programming on the side just for my own interest, but it had nothing to do with the physics coursework.

ETA: Things changed drastically by the time I got my Ph.D. in the early 90s. By that point I was doing nothing but coding.