Why aren't college professors taught to teach?

Oh, I forgot this part!

When I was in college the first time, I took a bunch of upper-level psychology classes. Always got As and A+s. They had one or two prerequisites, but they were optional.

When I came back to school, I was surprised to find that every upper-level class I wanted to take had a bunch of prerequisites. Like, 4 each. And they were not optional. They automatically dropped you from the class if you didn’t have them.

So I went through a huge fight with the administration to get admitted to the classes I wanted to take. They kept telling me that it would be just too hard for me, that my Psychology GRE score (750, which is pretty damned competent) didn’t matter as proof of my competence. Finally, though, they relented.

I talked to my best friend, who used to work for one of the big-wig psychology profs. And she told me that the reason they changed the policy on prereqs was that the professors were soooo tired of getting students who didn’t know background material, so they had to spend a few weeks doing review at the beginning. Their goal was to cut out the review. I said, “Great! I know all the background stuff, I don’t want a review.”

I started taking those classes I had to fight my way into, and guess what? The first MONTH was review, in three out of the four. Why? The other students were too lazy to try to remember what they’d already learned, so they whined if they didn’t get a refresher.

And they were trying to keep me out of these classes?

Also, college students are there because they went out of their way to attend - they had an educational goal in mind, looked at the institutions that were available, applied to one or more, and they’re paying to be there. You can expect that they’ll continue to take an active role in their education, because they’re there to get one.

Elementary school students, on the other hand, are generally at school because their parents made them go. They don’t know what they’re going to be studying, or why it should be important to them, so they need help from a teacher just to convince them that they should be learning.

In fairness, with so many more people going to college nowadays, you would expect standards to drop. Before, only the best of the best went to college; now, nearly everyone goes, so of course there are going to be more slackers in the class.

PhD student in Spanish here. I haven’t had much formal training as a teacher, but I’ve had a lot of practice. I’m a TA, and my teaching load isn’t light - the foreign language TAs here are the only instructors the students will see in their first two years.

I got my BA in 2000 and went straight to graduate school; if not for a fellowship, I would have started teaching that fall. In 2001, I took a completely worthless foreign language pedagogy course; worthless because it was nothing but theoretical perspectives, which were not at all helpful when I was facing 24 faces every day, none of whom had any idea what a direct object was in English, let alone Spanish. It was a rough semester; I taught competently, but I felt terrible that I wasn’t a better instructor.

As a TA, I have taught (writing tests, planning classes and exercises, doing all the grading, etc.) 14 classes of varying levels of introductory Spanish, and I still have another 3 years or so of PhD work/teaching to go.
So, while I haven’t had much formal training, I do have lots of ‘on the job’ training, if you will. While I most certainly have issues with the system, I do believe I’ve become a really good instructor, and I hope to continue improving. I would like to find a job at a college/university that does emphasize teaching.

Umm…you do? Why not just keep them the same and have all the kids who can’t or won’t hack it just flunk out?

Oh, right, colleges want the revenue. But the higher numbers of people going to college aren’t a good enough reason on their own to lower standards.

It’s not like colleges in the old days were snootily turning away all but the most promising students at the door. Colleges have always wanted as many students as possible; it’s just that back then, not as many people could afford to go.

Interesting thread. I am a lifetime certified teacher in grades PK-6 in the state of Texas, and currently a doctoral student in education. While pedagogy isn’t my main focus, it does touch on much of what I do.

A previous poster explained quite well why certification is needed in K-12 teaching. Kids tend to follow a pretty regular pattern of intellectual development. As was said, knowing that 2+2 equals four will not necessarily make you a good teacher. You have to understand how children learn at a certain age to get this information - and the method by which you came to said information - across. As a fourth grade teacher, I learned in childhood development classes that my least favorite pedagogical method - rote memorization - actually had significant value for kids in that age group. Remember learning the times tables, and not even knowing what was happening arithmetically?

But there is a fine question here - are we assuming that after age 18, students cease to develop in their learning styles? In higher education, typically, yes. My university has a teaching center that offers tons of resources for profs and teaching fellows, but no one is required to attend any of the sessions. In fact, I’d say it’s the rare prof who has the time, interest, or self-critical ability to voluntarily go to a session to learn more about teaching.

Most universities need research dollars to stay in business. To exist predominantly as a teaching institution, you need tons of endowment money and will have to charge high tuition, because the financial rewards of good teaching don’t compare to what patents bring in. But some schools do this - small liberal arts colleges like Bennington charge a premium for great teaching, and are able to do so successfully because graduates enjoy the experience and donate money after they graduate to perpetuate the process. Of course, most schools can’t do this, and need grants from industry and the government to make money - hence, the number of schools that can say they prioritize teaching over research is very small.

There’s also the feeling that in higher education, the output is new knowledge - research. That’s what academia produces. If students are learning the exact same content as they did a generation before, it can be argued that the institution isn’t really doing much new. So occasionally a scholar will pen a revised account of Shakespeare’s life, or study his works from a critical discourse perspective… this keeps the knowledge fresh and appealing.

I help teach a course on the economics of higher education. One of the big questions we tackle in the course is that of signaling versus productivity. Take, for instance, Stanford and Xavier University in New Orleans. Stanford’s admissions criteria are among the highest in the country - high test scores, stellar grades, and activities are the price of admission. Xavier is a historically Black college with a modest endowment, but it looks for students who demonstrate initiative and drive - even if their preparation is lacking due to weaker academic standards at their schools and the like. However, in four years, Xavier places more African American students in medical school than Stanford - as well as Harvard, Johns Hopkins, and a number of other highly ranked institutions - combined. (This is a fact chronicled in Ellis Cose’s Colorblind, as well as in a number of studies conducted by the top medical schools in the nation.) At Xavier, it seems that there is a strong tendency towards productivity - in other words, students are really learning something and can demonstrate that learning in assessments like grades, MCATs, and interviews. At Stanford, it’s possible that students who come into the institution are not necessarily learning as much - the fact of the matter is that Stanford graduates, with the institution’s prestigious reputation and strong alumni network, will probably end up doing just fine after college. The institution sends a strong signal about the quality and capabilities of its students.

This of course is a sweeping generalization - there are Stanford students who grow in leaps and bounds intellectually, and there are Xavier students who don’t get into medical school. But it makes you think about what the purpose of college is, and what students pay for - in part. I say “in part” because the common perception that college is a service, like a meal in a restaurant, where you pay a price that covers the preparation and serving of your dinner, is inaccurate. In fact, virtually all colleges subsidize the cost of attendance through tuition discounts before students even see the bill. This is especially true in state institutions, where legislation may cap tuition and colleges have to find the cash to pay for students. Part of the cost is often passed along in tuition hikes, but the majority comes out of state coffers (sometimes, of course, paid for in part through the tax dollars of non-college-attending state residents).

All of this is to say that the prioritization of research over teaching is a reality in higher education unlikely to change - Stanford was among the first institutions to realize that research labs were lucrative, prestige-granting, and one of the few ways to attract the best and brightest to academic life (see Rebecca S. Lowen, “Transforming the University: Administrators, Physicists, and Industrial and Federal Patronage at Stanford, 1935-49,” History of Education Quarterly, 31 (Autumn 1991): 365-388). Higher education is indeed a business, and a competitive one at that - so to gain prestige, one has to play along, or take a position so contrary to the majority of institutions that you may in fact attract a handful of students at the cost of alienating most of your potential customers…

I’m sure your statistics are correct.

But, speaking as a Johns Hopkins grad student, might I suggest that the reason Hopkins sends fewer African Americans on to med school has nothing to do with what the students learn. It has to do with the fact that there are virtually no black students attending the university, period.

In a city where two thirds of the population is black, the Johns Hopkins campus is about the whitest place imaginable. In almost five years, I don’t think I’ve ever seen more than one or two African Americans on the campus at any one time, except when we have events that attract people from the surrounding neighbourhoods - and except for the caretaking and janitorial staff, which is predominantly black.

In four semesters of working as a TA, I’ve never had an African American in one of my sections, and I’ve only seen about two or three in total in the lectures for those courses.

Do you have a cite for this? It was always my understanding that, typically, profit from tuition was used to prop up loss-making research and not the other way around.

The professors I remember as having the biggest impact on me were brilliant minds and brilliant lecturers. I’m sure they knew next to nothing about such Education school concepts as “learning differences” or “collaborative teaching,” and it certainly didn’t matter to me.

One thing I definitely agree with the OP about is the lack of communication skills in some professors. I swear one professor of mine broke just about every BASIC rule of human communication, and it still boggles my mind how he is allowed to lecture. To me, the issue of strictness/fairness is not as important as being able to communicate. If the prof can’t communicate well, a lot of students will think very negatively and end up being too critical about their tests or grading. Anyway, here are some things this prof did wrong that I feel are unacceptable for his job:
[ul]
[li]Talking into the blackboard as he was writing[/li]
[li]Writing illegibly (hint: if you can’t read it yourself, you can’t expect your students to)[/li]
[li]Speaking softly and often muffling his voice with his hand, and frequently hesitating with “umm” and “uhhh”[/li]
[li]Pointing at the projector screen with his hand even though he was a good 2m below the bottom of the screen (this was a large lecture theatre), instead of using a long pointer or even a small pointer on the projector acetate[/li]
[li]When he did decide to use a ‘pointer’ on the acetate, using his finger and smudging up all of the stuff he had written out. He also had a tendency to leave his hand on the acetate, blocking out everything and greatly smudging the writing[/li]
[/ul]
It was very clear that this particular prof didn’t want to be there. Even though students very kindly asked him not to smudge, or to repeat something, or to write on the overhead projector instead (since his writing on the board was so bad), he still made the exact same mistakes almost every single lecture. I definitely agree that the last thing I want is for my professors to baby me through the material, and I am seeing a growing trend of very repetitive teaching. In one class the professor taught about 60 minutes of actual material and spent the remaining 90 minutes repeating that material in some way. Being the strong independent learner that I am, I often found myself bored and brought extra material to sift through while the prof was repeating things.

I’d go into a rant about some other trends I see among my fellow students at university (I am entering my 3rd year this Sept), but methinks that departs a bit too much from the OP and would be better suited for The Pit. Point being, there are a lot of things going on in higher-level education, from both the profs and the students, that are quite disturbing to me.

Huh, shalmanese. There are probably exceptions to this, but at my institution (a large Research I) research is funded mainly through external grants. In fact, when I worked on a research project, the university even “taxed” a percentage of those external grant monies. There is institutional grant money available, but I think it would be very difficult to sustain a project that was losing money and not producing much else.

It depends on institutional type, but typically private schools get around 50 percent of their revenue from tuition, and public schools about half that. The real culprit behind tuition increases tends to be declining state and federal appropriations for public institutions (and the former affects private schools as well). Cite That’s a link to an article by McPherson and Schapiro, two leading economists who wrote The Student Aid Game: Meeting Need and Rewarding Talent in American Higher Education.

Other explanations for increasing costs would be competition among institutions, less grant aid from the government, and the rising costs of retaining high-quality faculty and of technology.

mhendo - thanks for sharing your experience at JHU. The JHU website reports that 5 percent, or about 210 of the 4200 students at the university, are African American. Percentage-wise, that’s actually fairly large compared to most predominantly white universities. Numerically, we’re still talking about a very small number of African American students heading to medical school from Xavier - 82 total in 1999. Among the top 8 schools, the last school listed, the University of Maryland - Baltimore County, had 22.

Heh, your post brings back memories. From student evaluations I found that I made each and every one of the above mistakes when I first started lecturing. I worked hard at correcting myself after that, and the complaints went away. But the point is that no one asked me to do so, and there were no incentives to be a better lecturer. So when you see a prof who continues to be a terrible lecturer year after year, you can be pretty sure that he/she knows about it but just has not taken it upon him/herself to change.

Hippy Hollow, I enjoyed reading your post. Your point about academia’s preference for research over teaching is right on. But your point about Xavier is off, since Xavier’s student body is primarily black, and it should not be surprising at all that they place more black students into medical school.

GorillaMan, correct me if I’m wrong on this: starting soon (next year, I think), all college lecturers in the UK (only really high-level lecturers get to be called ‘Professor’) will have to have a teaching qualification. This would be a PGCE (post-graduate certificate in education) or equivalent, which is usually a year’s intensive study after a Bachelor’s degree and is the same qualification that’s required before teaching in primary or secondary school. For current experienced FE/HE lecturers, there are other ways to get the PGCE equivalent which wouldn’t involve a year’s full-time study.

It’s not about spoonfeeding; no good teaching is ever spoonfeeding, at any level. It’s about aiming for different learning styles, encouraging students to metalearn, keeping track of progress, choosing materials appropriately, scaffolding learning, differentiating for different backgrounds and needs (e.g. dyslexia), directing discussions, grouping students in ways that give all of them an opportunity to express themselves, having good timing and back-ups, having clear objectives, and lots more - not just standing at the front of a room reading out from a text.

These aren’t the same as the skills needed by a secondary-level teacher, which are not the same as the skills needed by a primary school teacher. All the same, certain teaching skills are needed.

If learning styles are all the same after 18, if the students don’t actually need any teaching, why have the pretense of teaching? Why charge them for tuition if they don’t need tuition? Why not just give them an essay list and a library card? That would suit some students, it’s true; why not admit that and charge them less?

nivlac:

If you’d taken a teaching course, your fellow students and your teachers would have pointed out those problems before you ever got in front of a group of students. Lots of people in this thread have talked about learning ‘on the job.’ As if it’s OK to be a not-very-good teacher, as long as it’s only for a few years.

(This is not meant to pass judgement on your own teaching skills - it was just a handy quote).

The research part of an undergrad degree is independent research, but the taught component should be taught. No undergrad degrees are ever research-only in the UK; I don’t know about the US. Again, if I’m wrong on this, I’m sure GorillaMan will correct me.

Some untrained lecturers can be great. All trained lecturers will at least be good. That’s something to aim for, IMHO.

It’ll be a fine day when professors come out of their sacred ivory towers and recognize that they are mere mortals, too. In reality, they’re not paid to teach… they’re paid to work in a vacuum and publish a bunch of crap.

Nothing practical ever came out of a college experience…

  • Jinx

In Australia, as far as I’m aware (for CS at least), when a researcher receives a grant from the government, the university then provides x% matching funds. On top of that, the university pays the researcher’s salary and provides an annual stipend of $x for research.

AAAAAARRRRRGGGGGGGHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH!
That is all.

A maxim usually followed by people not intelligent enough to gain anything from college in the first place.

I’m pretty sure this is only for lecturers in further education, not in universities (note for non-Brits - FE covers any post-high-school qualification other than degree courses).

Nope, that’s spot on. Typically the independent research component for an undergraduate will be 1/4-1/3 of the overall degree, and will mainly be in the final year of the course.

That’s confidence, if ever I saw it!

This isn’t the pit, so I’ll restrain myself to asking where you think the medicines that cure your illnesses get made, or where the engineer who designed the roads you drive on learnt his skills, or how the internet got developed, or…

I got laid.

What?