[moderating]
Wendell Wagner and Capt. Ridley’s Shooting Party, please take your little feud out of GQ and stick to the subject. Thank you.
[/moderating]
I strongly agree with those who point out that programming isn’t one single monolithic profession. Today, I make most of my income as a freelance writer, and I know great writers who have never written a word of fiction (and thus never developed a plot), great writers who couldn’t produce a decent index if their lives depended on it, and great writers who couldn’t write a tutorial for a pencil sharpener.
I started programming back in the dark ages (Ones and zeros? I wrote a Fast Fourier Transform program using only rocks and sticks!) as an operating systems programmer, and then moved into CAD (computer-aided design) application code. That job was wall-to-wall math. I had to write a window-intersection subroutine for vectors in assembly language as part of the job interview – really!
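For the curious, that window-intersection test is less exotic than it sounds. Here’s a minimal sketch of the trivial accept/reject half of it in Python (the interview wanted it in assembly, naturally); the outcode trick is the standard Cohen-Sutherland approach, and the function names and window values here are my own invention:

[code]
# Classify a point against a rectangular window using Cohen-Sutherland
# outcodes: one bit per window edge the point lies outside of.
LEFT, RIGHT, BOTTOM, TOP = 1, 2, 4, 8

def outcode(x, y, xmin, ymin, xmax, ymax):
    code = 0
    if x < xmin:
        code |= LEFT
    elif x > xmax:
        code |= RIGHT
    if y < ymin:
        code |= BOTTOM
    elif y > ymax:
        code |= TOP
    return code

def segment_vs_window(p1, p2, xmin, ymin, xmax, ymax):
    """Trivial accept/reject for a line segment against the window."""
    c1 = outcode(p1[0], p1[1], xmin, ymin, xmax, ymax)
    c2 = outcode(p2[0], p2[1], xmin, ymin, xmax, ymax)
    if c1 == 0 and c2 == 0:
        return "inside"    # both endpoints in the window: accept
    if c1 & c2:
        return "outside"   # both endpoints beyond the same edge: reject
    return "clip"          # straddles an edge: needs the real intersection math

print(segment_vs_window((0, 0), (5, 5), 1, 1, 4, 4))  # clip
[/code]

Doing the equivalent in assembly, against the clock, in front of an interviewer, is what I mean by wall-to-wall math.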
When I taught beginning programming courses, college-level algebra was a prerequisite. Students without some background in Boolean logic, sets, and basic statistics had a lot of trouble in my classes.
On the other hand, when I hired programmers for user interface design, I looked for strong artistic, psychology, and language skills.
In today’s world, many of the people with “programming” job titles do little that would have been recognizable as computer programming ten years ago. I see people whose only coding skills are HTML and CSS who call themselves programmers. They do no algorithmic work, no efficiency analysis, and barely understand what QC and debugging even are. These folks need no math beyond adding up cell widths in a table and subtracting the cell padding.
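To show just how shallow that arithmetic is, here is the entire job in a few lines of Python; the widths and padding are numbers I made up:

[code]
# Hypothetical 3-column table: per-cell content widths, in pixels.
cell_widths = [200, 340, 120]
cell_padding = 8      # padding on each side of each cell
border_spacing = 2    # gap between cells and at the table edges

# Table width = content + padding on both sides of every cell
#             + spacing between the cells and around the outside.
table_width = (sum(cell_widths)
               + 2 * cell_padding * len(cell_widths)
               + border_spacing * (len(cell_widths) + 1))
print(table_width)    # 716
[/code]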
Forget math. In my humble opinion, the overwhelming majority of grunt programmers are bad at programming. And I’ve hired a lot of programmers (and fired a few, too).
No offense intended, but I really don’t understand how you can get a “Senior Systems Analyst” title without strong math skills. I had that title, and the job involved a lot of algorithm development, efficiency analysis, and statistical analysis. I had to work out interfaces and modularize the design to hand off to programmers: calculating the right places to split the design and figuring out how to balance the workload. With only high school math, you couldn’t have survived the gig for a week.
And OBO (off-by-one) errors are the most common kind of programming error. (Did I start that loop at one or zero? Should I have used < or <= for the terminal condition?)
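Both flavors in one Python sketch; the list is just for illustration:

[code]
items = ["a", "b", "c", "d"]
n = len(items)

# Flavor 1: started the loop at one instead of zero,
# so items[0] is silently skipped.
skipped_first = [items[i] for i in range(1, n)]   # ['b', 'c', 'd']

# Flavor 2: used <= instead of < for the terminal condition.
# A loop like "while i <= n: use items[i]" reads one slot past
# the end and raises IndexError when i == n.

# Correct: start at zero, stop strictly before n.
all_items = [items[i] for i in range(n)]          # ['a', 'b', 'c', 'd']
[/code]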