Should college professors test or teach?

In this pit thread, Soapbox Monkey pits his teacher for not teaching him something, but rather just telling him what to learn and then testing him on it. Some Dopers then came to the aid of the teacher and claimed that he was doing Soapbox Monkey a favor by preparing him for the real world of computer science.

Most of the Dopers who sided with the teacher claimed that he was doing the right thing for a CS major. What about all the other majors? Should teachers simply test students on how well they learn on their own or should they see to it that the kids actually learn something?

What do you think?

Being a professor myself, I would assume that my principal purpose is to teach, and a secondary purpose is to evaluate the students and give each one the correct grade. If no teaching is necessary, then there’s no need for a physical university. We could simply replace the bricks-and-mortar universities with online institutions. Students sign up, an email tells them which books and papers to study, and three months later they each get a test.

Real life teaching means the professor uses skills to maximize the amount that students learn. When I’m at the board presenting the proof of a theorem, I can watch my students and see how they’re reacting. If they look like they’re accepting the proof, I can move on to the next topic. If they look confused, I can repeat the proof in more detail, or present an alternate approach. I can also meet one-on-one with students who are struggling and try to figure out what their specific problems are. Those experiences can’t happen in an online course.

I fell somewhere in the middle on this one and found myself swayed by both arguments. Since I am a developer myself and a manager of developers, I do agree that in the real world you have to find ways to teach yourself a large amount of what you do.

However, a school should provide assistance in learning, and a professor - a good one - will do that. If not in the classroom, then in his office hours.

I think that a professor should help their students learn - and help them to learn to learn by themselves.

The real problem is (’nother prof here) that most of my students have a very passive vision of themselves and their role. I like to think that I am facilitating their learning–i.e., helping them to teach themselves how to figure stuff out. I’m teaching methods, IOW, not material, but a lot of them see me as some form of entertainment. I get a fair amount of “He didn’t teach us anything all term. He said he knew the books inside out, had written books about the books, and it looked like he did, but the fucker wouldn’t share what he knew with us. What a gyp! He kept asking us questions, but he wouldn’t tell us the answers. What a gigantic asshole! ‘What do YOU think?’ Fuck that. I paid good money to find out what HE thinks, and the dipshit didn’t tell us squat.” Etc. I think I’ve done a pretty good job when I read that sort of thing on my evals, though my chairman might think otherwise.

To me the issue in the pit thread hinges entirely on what the syllabus said. In the specific case of a computer science curriculum, if I took a course which was described in the syllabus as instruction in a particular language or tool, then I would expect to be instructed in the language or tool, not to be told to learn it independently. If the course was described as more conceptual, with the language or tool not the main focus but a means to an end, then I would not expect to be taught the language or tool. I would expect help if I asked for it.

As should be obvious by my responses in the other thread, I think a professor’s job is to teach. But I also have to say that it depends. There is a big difference between major research institutions and other schools. It seems to me that the former have more latitude for assuming a student can (and will) simply do whatever is thrown at them. The purpose of a class in that case is to present a structured progression of material that provides direction to the student, with the assumption that simply saying “You need to know this, now go learn it” is sufficient.

For instance, I went back to school after a hiatus of a couple of years to study AI; prior to enrolling, I spent lots of free time in the library picking out books that seemed to cover what I was interested in. But I ended up spending (wasting?) inordinate amounts of time reading about minutiae that escaped me, as I didn’t yet have the basic conceptual apparatus to understand why the minutiae were important, much less what the issues were in the first place. I hazily remember reading a book by Charniak on search algorithms; at the time, I didn’t even know what iterative deepening, A*, and AND/OR graphs were. It was kind of silly for me to attempt to grasp the points he was making when I didn’t even speak the same language. The whole reason I went (am going) to graduate school is to get direction – this is what I’d be doing anyway, but I needed help in winnowing the endless body of knowledge down to a manageable size. A student who attends a top research school should, to a large degree, expect to be given assignments and then just get them done.

However, most schools do not fit that mold. For them, the professor is indeed responsible for teaching. To put it in the most extreme form, it seems to me that a teaching school should operate under the assumption that the student has no (or very little) prior knowledge of a discipline and guide them through as close to all facets of the topic as possible. The intention is that upon graduation, the student will have the skills and knowledge to do exactly what the other type of school assumes the student can already do.

I think this sums it up well.

When I was teaching (computer science), I worked under the assumption that I was being paid to assist students in gaining knowledge. People learn differently. Some learn well from reading a book. Others learn better from lectures. Others from experimenting. It was my job to explain the material as best I could, provide the students with supplementary reading material, and assign programming tasks that drove the material home. For students who still had trouble grasping the material, my job was to help them one-on-one.

A professor who simply tells the students to write programs–leaving them to find their own sources of reference material–is not teaching, and saying it replicates real-world workplace experiences is a cop-out.

I’d say they’re both essential parts of the job. A professor has to be both coach and referee—has to both facilitate learning and evaluate how much the students know. A truly good teacher must be good at both.

Some other thoughts:

“Teaching,” at least at the college level, consists not only of telling the students directly what they need to know, but also of things like telling them what books or articles are important to read, giving feedback on their work, answering questions, and setting assignments of an appropriate difficulty level and educational value. Exactly what sorts of teaching work best will depend on the professor, the student, the subject matter, and the level.

The further along one goes in school, the more one has to take responsibility for one’s own learning. (The OP uses the word “kids,” which implies a certain immaturity. Sometime between the time one enters and the time one graduates from college, one should have become an adult, capable of learning on one’s own.)

At a college that specifically advertises itself as dedicated to teaching, I would expect the professors to actually teach, and to do so reasonably well, and I would have a legitimate complaint if they did not. At a college or university that made no such claim, good teaching is nice when it happens but it’s not so reasonable to insist on it.

How can one teach without testing? It’s fine to look at the class to observe how they’re reacting, and to deduce from questions how much they are absorbing, but tests give the instructor feedback on what the students are actually getting, and give the students feedback on how much they really understand. Tests before teaching are essential for finding out what the class already knows.

What the person who started the Pit thread objected to seemed to be that he was forced to learn something new on his own. That might be a failure of his computer science program. What my CS professor friends call service classes teach a language to non-majors who might need it for their work. Those classes teach the language, period. CS classes should teach you the structure of languages so that you can recognize those structures in a new language and teach it to yourself. I’m sure those of us who have worked in the field for any length of time have taught ourselves more languages than we learned in school. Some of us have designed our own!

As people have said, it depends on the class. An intro to programming will likely involve a specific language, but I think that language should be used as a framework to introduce fundamental concepts that stretch across a class of languages (functional, structural, object-oriented, and now, I think, event-driven, which is important enough to cover as a paradigm in its own right).
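
To make that concrete, here’s a toy sketch of my own (nothing to do with the OP’s course, and the function names are purely illustrative): the same little task written in two of those styles, to show how a single concrete language can still carry ideas that cut across paradigms.

    /* Toy example: summing an array two ways. */
    #include <stdio.h>

    /* Procedural/structural style: an explicit loop mutating an accumulator. */
    int sum_loop(const int *xs, int n) {
        int total = 0;
        for (int i = 0; i < n; i++)
            total += xs[i];
        return total;
    }

    /* Functional style, approximated: a general fold that takes the combining
       operation as a parameter, so the traversal is written only once. */
    int fold(const int *xs, int n, int init, int (*op)(int, int)) {
        int acc = init;
        for (int i = 0; i < n; i++)
            acc = op(acc, xs[i]);
        return acc;
    }

    int add(int a, int b) { return a + b; }

    int main(void) {
        int xs[] = {1, 2, 3, 4};
        printf("%d %d\n", sum_loop(xs, 4), fold(xs, 4, 0, add));  /* prints "10 10" */
        return 0;
    }

A real intro course would obviously pick a language with native support for whichever paradigm it’s teaching; the point is just that the language is the vehicle, not the destination.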

In a more theoretical, conceptual class, the specific theory should be covered with a minimum of implementation instruction, but there’s no reason that in a compiler class (to refer to the OP) one couldn’t go from explaining the theoretical construction of grammars (context-free, in particular) into a short discussion of yacc/bison and simple usage of it (maybe one class period). When I took compiler construction, one period of slides on yacc helped me a lot.
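
For anyone who hasn’t seen yacc/bison, the sort of thing that single class period might show is roughly the following. This is my own minimal sketch, not the OP’s assignment: a context-free grammar for arithmetic expressions with C actions, plus a bare-bones lexer so the file actually builds on its own (with something like bison calc.y && cc calc.tab.c).

    %{
    #include <ctype.h>
    #include <stdio.h>
    int yylex(void);
    void yyerror(const char *s) { fprintf(stderr, "error: %s\n", s); }
    %}

    %token NUMBER
    %left '+' '-'
    %left '*' '/'

    %%
    input : expr '\n'        { printf("= %d\n", $1); }
          ;
    expr  : expr '+' expr    { $$ = $1 + $3; }
          | expr '-' expr    { $$ = $1 - $3; }
          | expr '*' expr    { $$ = $1 * $3; }
          | expr '/' expr    { $$ = $1 / $3; }
          | '(' expr ')'     { $$ = $2; }
          | NUMBER           { $$ = $1; }
          ;
    %%

    /* Bare-bones hand-written lexer: integers and single-character operators. */
    int yylex(void) {
        int c = getchar();
        while (c == ' ' || c == '\t') c = getchar();
        if (isdigit(c)) {
            yylval = c - '0';
            while (isdigit(c = getchar()))
                yylval = yylval * 10 + (c - '0');
            ungetc(c, stdin);
            return NUMBER;
        }
        if (c == EOF) return 0;
        return c;
    }

    int main(void) { return yyparse(); }

One file like that, plus an explanation of how the %left precedence declarations resolve the grammar’s ambiguity, goes a long way toward connecting the CFG theory to a tool people actually use.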

There’s a balance to be struck, in this as in everything.

It would be lovely if I could tell my students “Here’s a textbook, read the chapter on integration by parts and then do the problems” and then, poof, everyone in the class would understand integration by parts. But clearly that’s a fantasy.

But by the same token, it would be almost as lovely if I could stand in front of a class, do integration by parts seven, eleven, or a hundred and seven times, explaining each step as I go, and, poof, everyone in the class would understand integration by parts. But that’s a fantasy, too…it’s just a fantasy that students sometimes fall prey to (as opposed to the previous fantasy, which is one that teachers may occasionally fall prey to).
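
(For anyone whose calculus is rusty, the technique in question is the rule ∫ u dv = uv - ∫ v du; the canonical first example is ∫ x·e^x dx, which with u = x and dv = e^x dx works out to x·e^x - e^x + C. Easy to state, but it takes a pile of practice problems before students can pick u and dv sensibly on their own.)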

It’s got to be both, if you have any interest at all in seeing the majority of the students actually learn something. That said, the appropriate balance may vary from subject to subject; for example, I would expect the practice-to-instruction ratio to be higher in a CS course than in a math course.