How familiar with the hardware aspects of a computer system are programmers?

I see the OP and many (not all) responses mixing two different skills and areas of knowledge interchangeably. But they are very different.

  1. Understanding how to build a PC
  2. Understanding low level details of hardware with respect to controlling that hardware in programs

Option 1 requires the following type of knowledge (in general terms):

  - Which hardware components are required to create a working PC
  - Which brands/versions/specs of those components, etc. can be combined together

Option 2 requires the following type of knowledge (in general terms):

  - What are the detailed functional capabilities of the specific component
  - How are the various functions initiated
  - What are the expected results for every given set of inputs

I would agree with what many posters have already stated.
A large portion of programmers (myself included) probably do not know how to put together a PC without doing some googling, but generally understand which components are required.

As for Option 2, again most programmers are working at a higher level than that, but most probably have a pretty reasonable understanding of how the hardware operates.

You don’t really have to know hardware to be a programmer, but some familiarity helps.

I’ve worked with many programmers who only had the vaguest notion of the inner workings of hardware, but the better ones have usually had a pretty good idea.

I run a small software company. Not surprisingly, software is our main business. But because most of our customers are small enough that they don’t have their own IT staff, we get asked to do everything having to do with the computer systems.

We install network routers, switches, hubs, printers, run wiring, build PCs, upgrade PCs, troubleshoot AS400s, replace FRUs (field replaceable units, otherwise known as parts), swap components, troubleshoot operating system errors, upgrade commercial software, etc.

We also write process control software, and I’m the guy in the company who troubleshoots the RS232 com lines and other kinds of serial communication hardware.
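For anyone curious what that kind of serial work involves, here is a rough sketch (assuming a POSIX box and the standard termios API; the device path and the 8N1/9600 settings are just illustrative) of the low-level port setup that RS232 troubleshooting usually starts from:

```c
/* Rough sketch only: open an RS-232 line and configure it for
   9600 baud, 8 data bits, no parity, 1 stop bit, no flow control. */
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

int open_serial(const char *dev)               /* e.g. "/dev/ttyS0" */
{
    int fd = open(dev, O_RDWR | O_NOCTTY);
    if (fd < 0)
        return -1;

    struct termios tio;
    if (tcgetattr(fd, &tio) != 0) {
        close(fd);
        return -1;
    }

    cfsetispeed(&tio, B9600);                  /* 9600 baud in/out        */
    cfsetospeed(&tio, B9600);
    tio.c_cflag &= ~(PARENB | CSTOPB | CSIZE); /* no parity, 1 stop bit   */
    tio.c_cflag |= CS8 | CLOCAL | CREAD;       /* 8 data bits, enable rx  */
    tio.c_lflag &= ~(ICANON | ECHO | ISIG);    /* raw-ish input           */
    tio.c_iflag &= ~(IXON | IXOFF);            /* no software flow ctrl   */
    tio.c_oflag &= ~OPOST;                     /* raw output              */

    if (tcsetattr(fd, TCSANOW, &tio) != 0) {
        close(fd);
        return -1;
    }
    return fd;
}
```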

I do all of those things, and have for years. In the last two or three years, I’ve offloaded all the PC and network jobs onto my partner, so I’m not as familiar with current PC hardware as I was a few years ago. But I’m certain I could still build a PC in an hour or so max, not counting software loading and configuration.

I understand basic transistor and chip theory, and I’ve designed simple PC boards (Printed Circuit, not Personal Computer) for very simple specific functions. But it’s been years, and anything more than a simple switching or timer circuit was probably always beyond my capabilities.

So yeah, I know some hardware, probably more than the average programmer. Does it help me in programming? Some, no doubt about it. But for many things a programmer might do, it probably doesn’t make much difference.

And printers are the devil!

As a wild generalization, graduates of computing science tend to learn about computers from a logical standpoint. They may take a class in CPU architecture, but it will generally be at the level of, “This is the ALU. These are the shift registers. This is a pipeline. This is how you use them.”

A Computer or Electrical Engineer will learn it from an electrical function standpoint. “The ALU is built out of X number of discrete transistors. They use CMOS design, and therefore can sink a current of X.”

I took both CS and Computer Engineering. In Computer Engineering we did assembly programming, C programming, and lots of hardware, including radio frequency stuff, microwave equipment, etc. One of my projects was to build a graphics board for a single-chip computer, using nothing more than NAND gates on a breadboard. We had to worry about scan lines, blanking intervals, and all the rest of the TV spec, as well as interfacing to the CPU. It was very rudimentary, but I got it working. I had 5 labs a semester, two of which were programming and three of which were electronics work at a bench.

In CS, I never touched an electronic component again. It was all programming theory: compiler construction, data structures and algorithms, and lots of math. I had lots of math in Computer Engineering as well, but it was applied math. In CS it was more abstract and theoretical. In Computer Engineering it would be, “Here’s a differential equation that you’re going to use to solve electrical problem X. Use this Gaussian function to determine the charge on this sphere.” That sort of stuff.

CJJ*: Oh, I fully agree with how badly things can foul up in a production environment, and how bugs can manage to work their way up from even the lowest levels to the high reaches of userland. (To further expand on your case in point, the Pentium FDIV bug was detectable by evaluating expressions in the Windows 95 Calculator program. Intel ended up replacing a lot of chips over that fiasco.)
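To make that concrete: the widely circulated check used the operands 4195835 and 3145727. A correct FPU gives a remainder of essentially zero for x − (x / y) × y, while the flawed Pentiums famously returned 256. A minimal sketch in C:

```c
#include <stdio.h>

int main(void)
{
    /* The widely circulated Pentium FDIV test values. */
    double x = 4195835.0, y = 3145727.0;

    /* On a correct FPU this comes out to (essentially) zero; the flawed
       chips famously returned 256 because x / y itself was wrong in the
       low bits. */
    double remainder = x - (x / y) * y;

    printf("x / y     = %.15f\n", x / y);
    printf("remainder = %g (should be ~0)\n", remainder);
    return 0;
}
```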

I further agree that people who don’t understand low-level stuff are doomed to get bitten by it. Hard. We’ve spent fifty years trying to make that statement untrue, but I don’t know if we’ll ever get there.

Eric Raymond has some interesting comments on some aspects of this topic.

Programmers are a whole level above mere hardware. In the same way that architects don’t do scaffolding.

It’s beneath us, leave it to the hoary handed manual classes to wear the anti-static wrist bands. :slight_smile:

One of the things that never fails to bug me is that whenever I mention I did Multimedia Engineering at Polytechnic and am going to take on Computer Science next, I always get tagged with the label “Computer Hardware Expertz!”

Unfortunately, I am not. I think most programmers worth their salt know the difference between RAM and ROM, what the graphics card does, how the CPU works, etc., but I would be lost on the exact, specific details, like “Why does the UltraPowerfulGraphic brand graphics card kick the UltimateRenderer graphics card’s ass?” or “How many L2 caches does the new processor from SuchAndSuch company have?”

I can’t even put together a computer to save my life.

Once upon a time, though, programmers needed to know hardware well. It was the dark ages before Windows 95, that glaring bright light which shoved DOS into the pit it rightly deserves, came along. Every device, every piece of hardware, needed vastly different code to interact with it, and programmers (especially game programmers) had to deal with all of it.

Windows 9x and XP changed that by using various abstraction layers. So we just tell the abstraction layer “I want 100,000 textured polygons on the screen NOW” and it tells the hardware to shove those 100,000 polygons out, and tells us whether it succeeded or not. All right, that last part doesn’t usually happen - if the hardware can’t comply, most of the time you’ll just get an ugly crash or the elegant Blue Screen of Death.
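In concrete terms, a draw call through an abstraction layer like OpenGL looks roughly like the sketch below. This is only an illustration: it assumes a GL context and a texture have already been created elsewhere, and the function and parameter names are made up for the example.

```c
/* Rough sketch of "draw a pile of textured triangles" through OpenGL's
   abstraction layer (old-style GL 1.1 client arrays, to match the era).
   Context creation and texture upload are omitted. */
#include <GL/gl.h>

void draw_mesh(GLuint texture,
               const float *vertices,        /* x,y,z per vertex */
               const float *texcoords,       /* u,v per vertex   */
               const unsigned int *indices,
               int index_count)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, texture);

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, vertices);
    glTexCoordPointer(2, GL_FLOAT, 0, texcoords);

    /* "I want these polygons on the screen NOW" -- the driver and the
       card figure out how. */
    glDrawElements(GL_TRIANGLES, index_count, GL_UNSIGNED_INT, indices);

    /* The "tell us if it succeeded" part is weaker than you'd like:
       you poll for an error code after the fact. */
    if (glGetError() != GL_NO_ERROR) {
        /* log it, bail out, or brace for the Blue Screen of Death */
    }
}
```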

However, as this thread has shown, not all programmers are hardware-idiots like me. Those who have to develop programs for electronic devices probably know more than a thing or two about hardware, and server programmers/maintainers tend to know more about their devices than lazy bums like me who just rely on DirectX or OpenGL.

I know a lot of programmers who are completely helpless even dealing with the OS, to say nothing of the hardware. It is a credit to modular computer architecture that this situation is able to exist. People who know hardware and software are expensive to hire and impossible to fire. Niche workers are cheaper and more disposable.

Prolly why I can’t find a job = )

I was a software professional for a few years before I knew how to put together a PC. Hardware knowledge has never been important to my work as a programmer/software engineer. My knowledge of PC hardware comes more from being a PC consumer than a programmer.

You don’t need to know how to replace a starter to drive your car.

How many programmers does it take to change a lightbulb?

That’s a hardware problem.

One thing I wanted to clear up, in case nobody’s pointed it out yet: electrical engineers may have just as little understanding of the workings of a computer as a programmer. At least at my old school, University of Michigan, we had a separate major called Computer Engineering that dealt with computer related design. Electrical engineering concentrated more on things like semiconductor design, signal processing, etc.

You’ll have to refer to helpdesk for that. Their hours are 10am to 3pm Mondays, Tuesdays and Fridays. Please have the make and model number of your lightbulb ready, as well as a detailed description of what you were doing when the lightbulb went out. Please also keep a note of what other lightbulbs were on at the same time. You might also need to know what sort of light socket you have and whether your house is built on a granite foundation or sandstone.

Support charges are $35 for the first hour and then $70 for each subsequent hour.

Have a nice day!

How many hardware guys does it take to change a lightbulb?

We’ll fix it in software.

How many IBM computers does it take to execute a job?
Five. Four to hold it down, one to rip its head off.

If anyone wants some readable yet technical articles on modern PC-class hardware (including modern consoles), Ars Technica is a good place to look. You get articles pitched to the sweet spot between marketing/end-user fluff and obscure, unreadable technical spec sheets. It’s very interesting, especially now that the Cell, 64-bit desktop machines, and multicore CPUs are shaking up the low- and mid-end systems in a real way.