Bimodal customer product-review distributions

I have noticed on a lot of Amazon products that the review distribution will look something like this:

5 stars: 75%
4 stars: 6%
3 stars: 2%
2 stars: 1%
1 star: 16%
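For what it's worth, the spread of a distribution like that is easy to quantify. A quick Python sketch of the weighted mean and standard deviation, using the percentages above as weights:

```python
# Weighted mean and standard deviation of the star histogram above,
# treating the percentages as weights.
from math import sqrt

dist = {5: 0.75, 4: 0.06, 3: 0.02, 2: 0.01, 1: 0.16}

mean = sum(stars * w for stars, w in dist.items())
sd = sqrt(sum(w * (stars - mean) ** 2 for stars, w in dist.items()))

print(f"mean = {mean:.2f}, sd = {sd:.2f}")  # → mean = 4.23, sd = 1.48
```

So the headline average looks great, but the spread is enormous for a 1–5 scale, which is exactly what the surge of 1-star ratings produces.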

It seems odd to me that a product would get a high percentage of top ratings, then drop off through the lower ratings, but then suddenly there is a surge of lowest-level ratings. What causes these to be so polarized? I have some theories:

  1. The 5-star reviews are loaded with fakes (although even so, the profile would still be bimodal).
  2. The most dissatisfied people are the ones most likely to express their frustration through reviews.
  3. The profile represents poor quality control for the product: it either works perfectly or fails miserably.
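Theory #2 alone can reproduce this shape. Here's a toy Python model (every probability in it is invented, purely for illustration) where the true opinions fall off smoothly toward 1 star, but the most dissatisfied buyers are far more likely to actually write a review:

```python
# Toy model of theory #2: true opinions fall off smoothly from 5 stars
# down to 1, but the likelihood of actually writing a review depends on
# how strongly the buyer feels. Every number here is made up.

true_share = {5: 0.45, 4: 0.30, 3: 0.15, 2: 0.07, 1: 0.03}  # unimodal opinions
p_review   = {5: 0.30, 4: 0.05, 3: 0.05, 2: 0.10, 1: 0.80}  # the angriest review most

# Observed review histogram = opinion share weighted by review propensity.
observed = {s: true_share[s] * p_review[s] for s in true_share}
total = sum(observed.values())

for s in range(5, 0, -1):
    print(f"{s} star: {100 * observed[s] / total:.0f}%")
```

In this toy model only 3% of buyers actually hated the product, yet 1-star ends up as the second-largest bucket, because nearly all of those buyers post a review while hardly any of the middling ones do. Self-selection alone turns a unimodal opinion curve into a bimodal review histogram.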

Seems like this would make for an interesting grad thesis.

I see that a lot in restaurant reviews on Tripadvisor etc. I always assumed that the good reviews were put there by the restaurant owners and their friends, the bad reviews were put there by the restaurants down the street, and so on. The way around it seems to be to only trust reviews once they exceed (say) 50, after which real reviews start to outnumber the (presumed) fakes and the bimodality should therefore disappear.

Not quite the same situation with Amazon I guess (never use it) - but does the bimodality disappear as review numbers increase?


My assumption is that a lot of people give 5* reviews because the product is fine. Many give 4* because it’s against their nature to give 5 for anything. When you read the 3, 2 and 1* reviews you find all kinds of reasons. This is where I look to get the SD on any product.

They tend to split into a few groups:

1- It was damaged or did not work when it arrived.
2- It stopped working soon after delivery.
3- It was different from what I expected.
4- The quality was poor.

Then there are the people who give 1* because it was the wrong colour/wrong size/etc., all of which they would have known if they had read the information before they ordered.

This was always surprising to me. I guess they come at it from the perspective that every product starts with 5 stars, and stars are removed if something is not right.

However, my opinion is that if you give 5 stars for a product that does only what it is supposed to, in a minimally satisfactory way, then what rating do you give a product that blows you away or exceeds expectations? You can’t give it 6 stars.

Truthfully, I think your theories are pretty good. Only one thing to add. I would change #2 to “People that are passionate about a product tend to leave reviews. They may passionately love it or passionately hate it. Ambivalent or moderately pleased people are less likely to leave reviews.”

This is the same issue with rating Uber drivers (1 to 5 stars). Most drivers have an average rating of about 4.6 to 4.9. Uber won’t let them work if the rating falls below a certain (pretty high) number. Most people rate the driver 5 stars unless something was really wrong. So what do you give the really good drivers??

I thought this type of grade inflation was a common issue with star ratings. Uber has already been pointed out as an example of this, where drivers are probably getting five stars as the “default” and almost all of them have > 4.5 ratings. This matches my behavior: if I’m satisfied I’ll give 5 stars. That happens most of the time, with some exceptions that I marked as 1 star (a huge fare resulting from the driver getting lost at what felt like every single intersection). I don’t typically leave product reviews, but for the things I do review, I don’t really know what to do with five stars. It’s too much granularity for me. I’d prefer just a thumbs-up or thumbs-down approach, and I tend to simulate that by picking 5 stars or 1 star but nothing else.

I also think a lot of people are non-confrontational, and picking 4 stars or less is basically saying “you’re flawed in some (perhaps minor) way”. Unless I’m actually annoyed, I’m not going to bother with that. I do agree that it would be nice to have a way to say “you’ve gone above and beyond” (I’ve wanted to give this to an Uber driver that put a phone charging outlet in the back) but I think if you just added a higher star it’d become the new default. It’d have to be a separate checkbox, and even then I’m not sure it’d be immune to this sort of psychology.

So if there are a lot of people like me, that can be option (4) in the OP. :slight_smile:

And I’d be remiss not to link to the excellent Black Mirror episode “Nosedive”, which I think accurately depicted this phenomenon in its portrayal of ratings.


In fact, iirc, if you give a 4-star or lower rating to an Uber driver, you are then prompted with a question about why you were dissatisfied with the ride (and a list of choices). Five stars seems to be the default.

It doesn’t seem at all odd to me. Amazon product ratings are totally voluntary, and you have to actually say something about the product. Hence, the only people who bother doing them tend to be people who have strong feelings (positive or negative) about the product.

I’ve only used Uber a few times, but that’s my recollection, too – give the driver 5 stars, and you’re done with the rating, but if you give anything below 5, you have to provide additional information, which is kind of a hassle.

I think there’s likely a lot of both of these – the reviews are created by a self-selected sample of those who care enough to write a review (legitimate or not). People who are really disappointed / upset / angry over what they got are more likely to take the time to post a review. Yes, that’ll be the same for at least some buyers who love the product, but I expect that, for most products, there won’t be many of those.

And, then, for the rest of the buyers, who aren’t either (a) totally in love with the product, or (b) pissed off, there’s not a lot of incentive to go back and place a 3- or 4-star review.

First off, I believe some people receive products for free in return for the review. I’ve seen the written review disclose this – I assume that such disclosure is required. The reviews often are not 5-stars, and sometimes are quite low.

Also, if you are using the ratings to decide if you want to purchase the product, I highly recommend you read the associated review. I’ve seen 5-star reviews that say something like, “This is exactly what I was looking for! I can’t wait to get it!” I’ve seen 1-star reviews because the buyer called to ask a question and thought the customer service person was rude. (I admit I’d like to know how the service is, but I’m more interested in the quality of the product.)

I think it’s mostly that people only rate things when they have an extreme reaction. You tell people about the best meal you’ve eaten on vacation and the shittiest hotel you had to bed down in because those are good stories, but do you even bother to remember the sort of ok but whatever experiences? Would you be motivated to go explicitly rank them? The great one, you want to tell the world. The terrible one, you want to warn off others (or maybe punish the proprietor).

A while ago Netflix went from a 5-star rating to a simple thumbs-up thumbs-down because it turns out that if you ask non-professionals to rate things on a larger scale, there isn’t general agreement on what the different levels mean.

Stephen Colbert the other night commented that their new book Whose Boat Is This Boat? was getting one-star reviews from people who were disappointed to learn that the book was actually what was described on the show!

And I thought you gave 5 stars when you got exactly what you were told you were going to get.

I’ve served on the program committees for scientific conferences. People send in extended abstracts of their papers, we decide which ones get accepted to become full papers in the conference proceedings.

I discovered that your ratings should either be the max or the min. For a paper you sort of like, giving anything less than the max hurt its chances of being accepted. For a paper you mostly dislike, giving a not completely bottom rating increased its chances of being accepted. So you make up your mind: In or out? And rate accordingly. Subtlety messes up the results.

After talking to a customer rep I was asked to rate my experience 1 to 10. I gave him a 7, which seemed high. (I was actually somewhat disappointed in his service.) I thought 9 or 10 should imply real excellence; and 8 would be a good rating. “7” seemed as neutral as could be: “Nothing to see here; move along folks.”

I got a phone call from his supervisor, insisting I tell her what he’d done wrong. I kept saying, “7 isn’t so low, no?” but she wouldn’t have it, keeping me on the phone and demanding that I find fault with her rep.

In future, I’ll just say “10” unless something is way wrong.

Movie ratings on IMDb are the same way. With ten points instead of five, the peak will often be down from ten, but the lesser ratings get fewer and fewer votes until 1, where there’s always an uptick. I guess when someone dislikes a movie, 2 won’t do. Personally, I’ve never rated a movie less than 3, but I avoid movies with bad reviews/ratings as not worth my time.

After I bought my car I was sent a satisfaction survey. Apparently if they didn’t get all 10s, they would not get full commission. That didn’t seem right, as a 9 is a pretty good score. Anyway, I gave the guy in the back room a low score due to high-pressure sales techniques, and they dinged my salesman, who was great.

I suppose they just want excuses to keep the commissions low, but it’s petty.

I have never used Uber, so just out of curiosity, how is it that you have to provide more information? Is that part of some agreement you sign up for to use Uber, so if you don’t comply they’ll say “No cab for you!”? It would be odd if they turned down a paying customer just because you didn’t want to fill out a stupid survey. And I would likely rate an average cab ride a 3 - less if the driver got lost or cursed me out or something, and more if they went above-and-beyond like carrying luggage inside the house.

When it comes to 5-star product ratings my default is still 3 for “as expected”, but I am only going to bother with a rating if I am unexpectedly disappointed or pleased, thus adding to the skewed 5-star and 1-star ratings.