History of "computer" in popular usage?

I don’t have a cite, but I recall reading that during the Manhattan Project, the word “computer” was commonly understood to refer to the women who sat in rooms filled with other “computers”. They were not programmers and didn’t necessarily know much mathematics. They were good at arithmetic and were trusted for their accuracy and attention to detail. Other people programmed these “computers” to do the desired numerical calculations. Mechanical calculators were also used and were sometimes called mechanical computers, and the term “electronic computer” was also in use, but only among the few people who were aware that such things existed.

When I was growing up in the 1960’s, to my knowledge, the earlier usage was moribund, if not entirely dead.

The 1942 Life magazine article cited above was interesting, but it is not clear that the word “computer” refers to the slide rule being used, rather than the person using it. If you read the captions of the eight figures, it seems to me they are calling one person Chart, one Navigator, and one Computer. So when the one guy gives the data to the computer, he is giving it to another person. The caption says: “Chart receives answers from the computer in the form of ruled track line and dot showing position of the plane”. Those last steps were certainly done manually as depicted in picture 5 and the opening picture on page 98.

P.S. I worked for a company in the early 1970s that sold computers. One of their sales brochures misspelled the word as “computor.” This became a running joke among the employees when complaining about the company, but it reflects the fact that the word was still rather new to the general public even as late as the 1970s.

What about the possibility that the general public distinguished human computers from electronic ones by referring to the machines by their brand names?

“We ran the results through the Univac”
“Let’s punch those numbers into the IBM and see what it spits out”

Not completely on point, but an interesting recent (12/3/11) article in The Economist about human computers and their resurgence.

If that article is to be believed, the average person, or even the average scientist, in 1937 would be more likely to think of one of these people when presented with the word “computer” than of a rare mechanical device. (There were no journals about machine computing.)

It is by will alone I set my mind in motion. It is by the juice of sapho that thoughts acquire speed, the lips acquire stains, the stains become a warning. It is by will alone I set my mind in motion. :slight_smile:

I recall radio science fiction stories from the ’50s that referred to computing devices as ‘something’ brains.

There were vocational college classes one could take to become trained as a “computer”, in the 1950’s era or so. I used to collect old math textbooks that I found in used book stores (I had a great trig text from 1914). I had a textbook for one of these “computer” classes. It focused heavily on using the then-current computational devices efficiently and accurately.

When entering two numbers to be added, the first number was entered into the machine in one way, and the second was entered in a different way, so there was a need to discuss the first operand separately from the second. Thus, they were called the “addend” and the “augend” (I forget which was which). Likewise, for a subtraction problem, you had the “minuend” and the “subtrahend”. You hardly ever hear those words any more, but in those days it was relevant to have a separate word for every part of a computation.
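Just to spell the terminology out (my own quick sketch with made-up numbers, not an example from that textbook, and I may well have the addend/augend convention backwards):

```python
# My own illustration of the old operand names (not from the textbook):
#   augend  + addend     = sum
#   minuend - subtrahend = difference
augend, addend = 1400, 72
total = augend + addend            # 1472

minuend, subtrahend = 1472, 72
difference = minuend - subtrahend  # 1400

print(total, difference)
```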

And yes, that textbook used the word “computer” to refer to the person operating these computational devices. As Shagnasty wrote several posts above,

In the Manhattan Project, for example, they had a “computation pool”, just like a “typing pool”, of trained “computers” to do the drudge work of arithmetic. The scientists sent their formulas and data down to the computation pool, where the “girls” would do the arithmetic to determine whether, for example, the atom bomb was going to ignite the atmosphere and incinerate the whole world.

But I doubt that those same people were the ones who later became computer programmers. They were more like glorified typists, from what I gathered.

DSeid cites an article in the Economist, several posts above:

(Emphasis added by Someone Who Is Me.)
What do these new human computers do that the other kind of computers cannot do?

I am picturing those vast scriptoria filled with people cracking Captchas for the spammers.

Read the history of the punch card (most often associated with IBM). That sort of eased the average man on the street into the computer age.

When I worked as an operator on an IBM 370, the shop still had card punch and accumulator/sorter machines. Interactive terminals were rare because the RAM to handle a screenful (say, 80 chars x 25 lines = 2K) would have been prohibitively expensive. The first interactive terminals I used were Selectric teletypes, because paper by the box was cheaper than RAM. An older co-worker mentioned he learned Assembler because at one place he worked, the computer had only 40K of RAM and his programs in COBOL were too big to compile. A lot was accomplished by feeding data in on one or two tapes and out on another, so the computer did not have to handle the whole lot at once.
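In modern terms, that tape-in/tape-out style looks roughly like the toy sketch below: read one record at a time, update a running total, write the record straight out, and never hold the whole file in memory. This is just my own illustration in Python (the file names and the “account,amount” record format are made up), not anything that actually ran on that 370.

```python
# Toy sketch of tape-style processing: one record in, one record out,
# so memory use stays constant no matter how big the "tape" is.
# File names and record format are made up for illustration.
running_total = 0
with open("input_tape.txt") as tape_in, open("output_tape.txt", "w") as tape_out:
    for line in tape_in:                      # one record per pass, like a tape drive
        account, amount = line.strip().split(",")
        running_total += int(amount)          # accumulate as the records stream past
        tape_out.write(f"{account},{amount}\n")
print("grand total:", running_total)
```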

By those standards, punch cards were great. A small chunk of cardboard could hold 80 bytes; a tray of cards could store the same amount of data as tens of thousands of dollars of electronic RAM. A “scanner” that could convert even sloppy handwriting to punch cards was cheap, provided she didn’t quit or go on maternity leave. Cards could be sorted, counted, totalled, grouped, or whatever else you wanted. A card was business-envelope sized, and it started to be what you got in the mail for your gas bill, electricity bill, medical registration, or whatever. It might come pre-punched with your account code; you filled in the rest, and when the company received it back, they punched your meter reading or whatever onto it.
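To put a rough number on that capacity, here is my own back-of-the-envelope, assuming the usual 2,000-card box (I don’t have period RAM prices handy, so I’ll leave the dollar comparison alone):

```python
# Back-of-the-envelope card capacity, assuming a standard 2,000-card box.
bytes_per_card = 80                # one character per column, 80 columns per card
cards_per_box = 2000               # typical box/tray size; adjust as needed
box_bytes = bytes_per_card * cards_per_box
print(box_bytes, "bytes per box")  # 160,000 bytes, i.e. roughly 156 KB
```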

Originally, all this was processed by electromechanical means - cards were read as they were fed through the reader, with little metal contacts completing a circuit only where a hole was punched. Eventually, optical readers replaced that - a light shining through a hole was much faster to read, so cards could be processed faster. Numbers could be tallied with counter wheels, like the old odometers; electronic counters sped that up too. Gradually after WWII the card-processing business machines became more electronic and less electromechanical. This was evolution too: the businesses already had processing that relied on cards, and all the switch to computers did was speed things up and give more information faster. The list of accounts payable, the amount, due date, and payee - it’s the same on cards, tape, or a flash drive. Going more electronic reduced mechanical delays, changing calculation times from seconds to microseconds.

(Similarly, electro-mechanical calculators were standard in banks and accounting departments until electronic ones started replacing them around the late 60’s. I remember the Ontario Science Center around 1968 had a display of nixie-tube desk calculators that could even do square roots, although that could take several seconds.)

When I was a kid in the early 60’s, we lived near a university and I remember kids playing with decks of discarded punch cards. So sometime between 1945 when computers were built for the wartime artillery table calculations, and the late 1950’s when IBM and others sold commercial units, “Computer” came to mean what we think of today.

Tommy Flowers, the chief designer of the Colossus, wrote several years later about being summoned to Bletchley Park in 1942:

Senegoid writes:

> But I doubt that those same people were the ones who later became computer
> programmers. They were more like glorified typists, from what I gathered.

For what it’s worth, these were exactly the sort of people who later became the first computer programmers. During the early years of computers (the 1940’s through part of the 1950’s), the assumption was that computers would be programmed in a two-stage process (at least for problems with some significant mathematical content). First, there would be scientists and engineers who figured out the equations that would have to be solved and the algorithms to be used. Second, these equations and algorithms would be handed to the programmers.

These programmers were the same women who had been computers in the old sense. They were often very smart women who had been told that no woman should go to college or should get a science degree or should go to grad school. These women were expected to figure out how to write those equations and algorithms in computer language. For business applications, the executives would tell these women what sort of business statistics they needed and let them work out the programming details. The assumption was that this was essentially just a secretarial job.

It was only slowly in the 1950’s that it was discovered that programming was actually a very hard job with a lot of necessary theoretical background. At that point, men started to move into the field. The proportion of women who were programmers dropped until the late 1960’s, at which point there were more male programmers than female programmers. In the late 1960’s the proportion of women going into computer science went up again, just like it did in all other fields of science and engineering.

This version of events does however contradict that Economist article I linked to. As already quoted:

That play was the senior play while I was in high school. IIRC, the “computer” was the monster behind the desk that replaced the stacks of books the research department once used, but the personnel that operated it – and queried it for answers – were “operators,” not computers.

The whole point of the play (and movie) was that a computer (the hardware) was not going to replace humans, but augment them. Nevertheless, the human part of the equation was important, as machines are stupid. It was sort of a pre-Watson conclusion, with all the foibles you might expect from a stupid, but smart, machine.

How things have changed in 60 years, and how they have stayed the same.

Turing’s paper “On computable numbers”, which more or less heralds modern computer science, used “computer” to refer to human calculators:

Where he talks about machines doing some computational task, he refers to these machines as “computing machines”. That was in 1936.