My college used to compile the surveys and make the results and select comments available to students during registration (so that we could supposedly make decisions based on the data). I know that the instructors/professors read them.
The only time I can remember anyone saying anything about it was one professor who said something like “Last semester <another professor in his subject> was called a ‘god.’ Could at least one of you compare me to a deity?”
They’re not all that useful IME, but there’s sometimes stuff you can take from them - we usually use them for lecture courses here in the UK, not small-group teaching in classes.
The aggregate box-ticking responses can be useful (e.g. rate the delivery of the course on a scale of 1-5), but low response rates can make them statistically misleading (much as the 5% of people mad as hell about a product are the ones certain to leave an online review). But if you have decent uptake and 80% of respondents rate your class as not interesting, not delivered well, or not useful, then I’d say you have an accurate picture of how the course is going down.
The actual comments students provide are usually low quality - it’s nice to be praised and painful to hear sharp criticism, but serious, constructive feedback is hard to give in general. It’s hard even when it’s just two people 100% invested in the process, and students and a prof have a different sort of relationship, so mostly it ends up with people getting stuff off their chest or dishing out platitudes.
In my experience, the most useful feedback is actually on the quantity of material rather than opinions on quality. It can sometimes be hard to judge whether you’re giving them too much - I reckon most profs would tell you their lecture courses lose weight, and slow down, as they improve with age. The student assessments can put you on the right track here, particularly when you’re starting out.
I’m sure we have close to a 100% response rate at my university. They make it as annoying as possible with pop-ups and reminders every time we visit a different page on the website (Canvas). They don’t go away until we fill out SEOIs (student evaluations of instruction). It’s effective!
It’s been very interesting reading about these from professors’ points of view. It pretty much confirms what I figured. I’ve never written a mean review because I don’t want to devastate someone. I’ve been lucky, anyway - I’ve had pretty much phenomenal profs. The meanest thing I’ve written was this quarter, and it was intended as constructive criticism. We’ve had two papers in one class: one was due the third week and the other the seventh week. Neither one is graded yet, and the end of the term is Friday. I’m sure I did well, but I did say it’s difficult to know what the expectations are when grading is so slow.
I know one sure-fire way to piss off most of the profs here. ratemyprofessor.com :D:D;) It’s not gone over well when it comes up in class or in conversations!
One “trick” my advisor taught me to help calibrate responses from these is to always return every assignment, graded, the next class period after it was turned in. By doing so, you darned well ought to get a perfect 5 on “Assignments were returned in a timely manner,” and so you know that if you get anything less, your students are just being unfair.
Anecdote: In one class I had in college, “History of Science,” the professor told us that one semester he had a student who never said a word the entire term. Well, toward the end he showed a video of some guy, say a Mr. Davidson, giving a short speech.
Well, the student did an evaluation. Every question was crossed off, and he wrote just one thing under comments: “Liked Davidson’s tie.”
That’s it. The only feedback, or conversation, the student offered the entire semester was those three words.