Does the media directly earn money from higher TV ratings?

I understand that there are many reasons for the media networks to want high ratings and lots of TVs tuning in, but is there a direct relationship whereby a 38% increase in ratings will lead to 38% more revenue on a sort of X-dollars-per-viewership system? And who would be paying them?

I think you first have to tell us what you mean by “the media”. But yes, more viewers, along with a whole host of other things,* mean more money by driving up the price of advertising spots. So a 30-second spot in the middle of a show with 3 million viewers might cost $30,000, while the same spot in a show with 20 million viewers could go for $320,000 (roughly the cost for a commercial during The Big Bang Theory).
As for how the money flows, I couldn’t tell you.
*With DVRs, things get really interesting. Now that everyone has what’s basically a Nielsen box in their house, things have changed. Ratings can be adjusted based on whether you skip commercials, whether you watch live, whether you wait until later in the week to watch, etc.
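As a back-of-the-envelope illustration using the thread’s own example numbers (illustrative figures, not real rate cards), you can express the relationship as cost per thousand viewers (CPM):

```python
def cpm(spot_cost, viewers):
    """Cost per thousand viewers (CPM) for a single ad spot."""
    return spot_cost / (viewers / 1000)

# The example numbers from the post above:
print(cpm(30_000, 3_000_000))    # $10 CPM for the 3M-viewer show
print(cpm(320_000, 20_000_000))  # $16 CPM for the 20M-viewer show
```

Note that even in this example the relationship isn’t linear: the hit show commands a higher price per viewer, not just more viewers, which already answers part of the OP’s 38%-for-38% question.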

Advertisers obviously pay more to place ads when the TV show has higher ratings. This is why advertisers pay around $5 million for a 30-second ad during the Super Bowl, whereas they will only pay around $500,000 for a similar commercial in a regular Monday Night Football broadcast.


The term is “Cost per Point.” And yes, the more points for the show, the higher the cost.
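In rough terms, one rating point is conventionally 1% of the TV households in the market, and the metric is just the spot price divided by the show’s rating. The figures below are invented for illustration:

```python
def cost_per_point(spot_cost, rating_points):
    """Price of a spot divided by the show's rating points."""
    return spot_cost / rating_points

# A show with a 10.0 household rating selling a spot for $150,000:
print(cost_per_point(150_000, 10.0))  # $15,000 per point
```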

I used to work in advertising. To grossly oversimplify the process...

Our clients rarely bought a specific program. What they wanted were “viewers” who fit their specific needs, and they left it to us to spread their money around the most efficient way possible.

Theoretically, we could spend the same amount of money on one commercial that reached 10,000,000 viewers or on 10 commercials that each reached 1,000,000. But time for commercials is limited (even if it doesn’t look like it when you watch commercial TV), and it is very much in the TV station’s or network’s interest to be able to sell our clients 10,000,000 viewers with as few commercials as possible, leaving more advertising time available as “inventory” to sell to other advertisers.

It’s also more efficient for an advertiser. If they buy one commercial that reaches 10,000,000 people, that means they reach 10,000,000 separate individuals with one commercial. If they buy 10 commercials that each reach 1,000,000 people, there’s a chance that some people who see one commercial will also see the others. If the advertiser’s goal is to reach as many different people as possible, that’s an inefficient use of money.
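A toy simulation makes the overlap point concrete. It assumes, purely for illustration, that each airing draws an independent random audience; real audiences overlap far more than random, since the same people tend to watch the same shows, which makes the effect even stronger:

```python
import random

# A toy model: each of 10 airings draws an independent random
# 10,000-viewer audience from a 100,000-person market.
MARKET = 100_000
PER_AIRING = 10_000

random.seed(0)
reached = set()
for _ in range(10):
    reached.update(random.randrange(MARKET) for _ in range(PER_AIRING))

# Gross impressions are 100,000, but unique reach is well under that,
# because some viewers saw more than one of the ten commercials.
print(f"gross impressions: {10 * PER_AIRING:,}")
print(f"unique viewers reached: {len(reached):,}")
```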

It’s not exactly a perfect relationship - one rating point does not exactly equal X dollars in every situation. But in general, shows with higher ratings are preferable to both the seller and the buyer.

A TV show will ask for more money based on higher ratings. They’ll generally get it, but depending on the viewer demographics and the advertisers involved, it’s possible they can’t actually sell ads at a higher rate. I don’t know if higher ratings have ever resulted in no increased revenue, but it’s not a simple linear relationship in all cases.

I used to work in TV, and the highest-paid employee at our channel was the Head of Airtime Sales. He got far more than any of the talent ever did!
I worked at a major channel in the UK, and they were very innovative in their approach to advertising, so he earned his money.

Advertisers could specify all sorts of variables for their ads. They could ask for a certain number of viewers in a demographic per ad, a total for the campaign, a specific ad placement, or just the best they could get for a fixed price. There were even things like exclusive ads, so people would pay more to be the only car advert in a break, for instance. The first ad in a break usually cost more, as people would see it while trying to find the remote to flip channels, or to fast-forward if it was recorded. Some clients would set a specific budget; others would pay variable amounts if we exceeded their minimum requirements, but usually with an upper limit.

At the time there were rules about adverts containing actors who were in the surrounding shows, so the ad schedule was quite complicated. Back in the late ’90s my channel was quite advanced and ran a distributed computer program to calculate the best return. This was run overnight on all of the desktop PCs, SETI@home style. Eventually it was moved to a rack of high-end servers that we had to pamper at all times.
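For flavor, here is a heavily simplified sketch of one kind of constraint that scheduler had to respect, category exclusivity within a break. Everything here (names, prices, the greedy strategy) is invented for illustration; the real system handled demographics, campaign totals, and the actor-adjacency rules mentioned above, and was vastly more sophisticated:

```python
# Place ads into breaks for the best total price while keeping
# categories (e.g. "car") exclusive within each break.
ads = [  # (name, category, price paid for the slot)
    ("CarA", "car", 900), ("CarB", "car", 800),
    ("Beer", "beer", 700), ("Soap", "soap", 400),
]
breaks = {"break1": [], "break2": []}
SLOTS_PER_BREAK = 2

for name, cat, price in sorted(ads, key=lambda a: -a[2]):  # richest first
    for slots in breaks.values():
        if len(slots) < SLOTS_PER_BREAK and all(c != cat for _, c in slots):
            slots.append((name, cat))
            break

print(breaks)  # CarA and CarB end up in different breaks
```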

Also, advertisers pay based on an expectation of a certain number of eyeballs, perhaps a certain number of eyeballs in a given demographic. If those numbers are exceeded, the network doesn’t get more money but can ask for more next time around. If the ratings fall short, advertisers may get “give backs” (make-goods), meaning more commercial airtime for free or at a reduced rate.
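A minimal sketch of that make-good arithmetic, with invented numbers:

```python
guaranteed = 10_000_000  # impressions the network promised the advertiser
delivered = 8_500_000    # impressions the show actually got

shortfall = max(0, guaranteed - delivered)
if shortfall:
    # The network owes the advertiser the missing audience, typically
    # as free spots in later airings rather than a cash refund.
    per_spot = 500_000   # assumed audience of one make-good spot
    spots_owed = -(-shortfall // per_spot)  # ceiling division
    print(f"owed: {shortfall:,} impressions (~{spots_owed} free spots)")
```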

DVR watching can be tracked, and I think the numbers to watch now are the “plus threes”: everyone who watches within three days of airing. The plus sevens capture, as I understand it, the overwhelming majority of those who will watch.

To combat commercial skipping via DVR, many networks put a little bit of content in a commercial break, usually one that’s later in the hour/half hour. These are sometimes called “podbusters.” I’m told that the commercials right after the podbuster are the most expensive, because people will actually watch them after they’ve watched the podbuster.

Now, since the Super Bowl’s advertising slots are sold in advance of the game, doesn’t that mean the TV networks can’t make more money if viewership is higher than anticipated? Say they sell slots at $3 million each anticipating 100 million viewers, but 130 million actually tune in; the price was already fixed at $3 million beforehand regardless.

Any rough metrics on how many times an ad has to be repeated, for the message to get across?

Would seeing a beer ad once get you to buy it? What’s the threshold for a message to get across? And at what point does someone become saturated?

There are some measurements, but they depend on factors like how many ads for competitors you’re being exposed to, how interested you are in beer in the first place, how frequently you see the beer ad (not just how many times, but how often in a given amount of time), and how much other exposure you’re getting to the beer from magazines, radio, signs at your local bar, etc.

Something like Apple’s 1984 ad, a genuinely first-of-its-kind ad for (what was then) a unique product, had a lot more impact than the beer commercial you saw just a few months ago.
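There’s a classic toy model of this, sometimes discussed under the heading of “effective frequency”: treat each exposure as an independent chance that the message registers. The per-exposure probability below is invented, not a real response curve, but it shows the diminishing returns:

```python
p = 0.15  # invented per-exposure chance the ad registers with a viewer

for n in (1, 3, 5, 10, 20):
    landed = 1 - (1 - p) ** n
    print(f"{n:2d} exposures -> {landed:.0%} chance the message got through")
```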

When does someone become saturated? To an advertiser, not until you actually buy the product.