[Discussing this here because the S/N ratio is higher]
For a while now I’ve noticed that new releases over there often get rated with a ton of 1’s and 10’s; see Next, which has 41% 10’s and 17% 1’s. In case you think such things get washed out as time goes on, Snakes on a Plane still has 50% of its ratings at 1 or 10. I worry that the site will become less and less reliable for hunting down good movies and avoiding bad ones.
Virtually any voluntary poll is going to skew toward the extremes, simply because the people most motivated to post an opinion either love the thing or hate it. If you were “meh” on a film, would you bother talking about it with your friends?
When Gigli came out (or perhaps even before it did), it was ranked as the #1 worst movie on IMDb. Aside from giving Al Pacino the distinction of having appeared in both what was at one point the lowest-rated movie and the highest-rated one (The Godfather), the ranking was perfectly meaningless; as bad as the film might have been, it was nowhere near the worst film ever made, and its position was more a measure of popular backlash than any critical measure of the film.
This effect has been true on Amazon since it began. If people like a book they tend to give it five stars; if they don’t, they tend to give it one star. They don’t make fine discriminations. (A tendency, not an absolute.)
I see the same thing in other rating systems, like the movie grades at Box Office Mojo, which rates movies A, B, C, D, and F. Every movie seems to have more F’s than C’s and D’s combined, which is highly unlikely in any rating system where voters are making fine-grained judgments.
I don’t think it’s necessarily a function of the internet, but it certainly gets exacerbated by the internet’s instantaneous and anonymous nature. People want to feel that their opinion will be influential, and an extreme vote moves the result far more than a carefully made distinction.
I don’t look at the final average but at the bar chart, so to speak. What is the distribution of ratings? Where is the cluster? Where is the peak? That tells me far more than the votes of the idiots who like to rate everything a 1 or a 10.
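A minimal Python sketch of that approach, using made-up vote counts (the real IMDb histogram isn’t reproduced here): instead of the headline weighted average, look at where the distribution peaks and how much of the vote is parked at the extremes.

    # Hypothetical histogram: votes[i] is the number of users who rated the film i+1.
    votes = [4100, 600, 700, 900, 1500, 2400, 3800, 5200, 3100, 9800]

    total = sum(votes)
    # The headline-style average: a weighted mean over the 1-10 scale.
    average = sum(rating * count for rating, count in zip(range(1, 11), votes)) / total
    # The peak of the distribution: the rating with the most votes.
    peak = max(range(1, 11), key=lambda rating: votes[rating - 1])
    # The share of votes sitting at the extremes (1s and 10s).
    extreme_share = (votes[0] + votes[-1]) / total

    print(f"average: {average:.1f}, peak: {peak}, extreme votes: {extreme_share:.0%}")

With these made-up numbers the average comes out around 7, yet the peak is at 10 and over 40% of the votes are 1s or 10s, which is exactly the kind of split the average hides.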
Also, I think their new rating interface affects the results. You are basically clicking on the current rating, and I bet that people rate relative to it. So if you think it’s good, and it’s rated at a 7 already, it gets lots of 8s and 9s and not many 7s.
Yup, there’s definite hardcore ballot-stuffing going on. I suspected it with Memento but was convinced of it when Anus Magillicutty held the #1 spot on the Bottom 100 for months and months a while back.
Oh, a movie that nobody’s heard of, with no advertising at all, that no one’s reviewed, that you can’t buy in any commercial outlet, that was shot on cheap home video but was apparently released in theaters beforehand, and 95% of its user reviews are positive when the reviews for any other Bottom 100 IMDb movie are usually 95% negative? No, that’s not suspicious in the least.
Slight hijack: can anyone tell me what the hell the “Starmeter” refers to? For example, Jeri Perry, whose only credit is as “Fun Hole Girl” in that movie five years ago, has an up arrow with 34%. I tried clicking the “Why?” link, but it only asked me to join IMDb Pro.
You will get this when you let just anybody, regardless of reason, age, movie knowledge, etc., sign up and vote. Most young people can’t deal in anything middle-of-the-road: it’s either “it sucked” or “it was awesome.”
Side question: is there a single movie in the Bottom 100 released before 1990 that was not featured on MST3K? That artificially skews the ratings even more.
I blame the Internet – it’s hard to get reasoned discussions of anything. Everything either sucks or is great. The “I liked some of it, but it was flawed” argument gets lost in the noise.
I guess. One counterexample is the Rate Your Music website, where such over- and under-rating very rarely happens. Perhaps that’s mainly because the site has far fewer members than IMDb.