Statistics: Combining Trends

I’m looking at some football stats and trying to extrapolate.
Team A’s offense is averaging 15% fewer yards per game than their opponents’ defenses normally allow.
Team B’s defense allows 5% fewer yards per game than their opponents’ offenses normally gain.
What can one expect when these teams play each other?

There may be some subtleties I’m missing, but I’d expect team A to get 19.25% fewer yards than the average. There’s some average yardage, but when A plays, they get only 85% of the yardage that would otherwise be expected. And when the opposition is B, a team only gets 95% of the yardage that would be expected. Multiplying these, we find that A playing B should get only 95% (since they’re playing B) of the 85% they’d usually get (since their own offense sucks), or 80.75%. Or, if we subtract that from 1, they get 19.25% less than the league average.
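The multiplication above is easy to check; a quick sketch in Python, using the round 85% and 95% figures from the post:

```python
# Combine the two effects multiplicatively.
offense_factor = 0.85  # Team A gains only 85% of what opponents' defenses usually allow
defense_factor = 0.95  # Teams facing B gain only 95% of what they usually gain

combined = offense_factor * defense_factor
print(round(combined, 4))      # 0.8075 -> A should gain 80.75% of the average
print(round(1 - combined, 4))  # 0.1925 -> i.e. 19.25% fewer yards
```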

As long as the percentages are low, we can approximate this just by adding the two percentages together (which would give us an estimate of 20%, in this case).
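The shortcut can be quantified: adding the percentages overshoots the exact answer by exactly the product of the two percentages, which is why it only works when both are small. A sketch:

```python
a = 0.15  # Team A's offensive shortfall
b = 0.05  # Team B's defensive effect

exact = 1 - (1 - a) * (1 - b)  # expands to a + b - a*b, i.e. 0.1925
approx = a + b                 # 0.20

# The approximation overshoots by exactly a*b:
print(round(approx - exact, 4))  # 0.0075
```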

I’d prefer to work in absolutes rather than percentages. If team A is averaging m fewer yards per game than expected and team B is allowing n fewer yards per game than expected, then team A will average m + n fewer yards than expected in the long run.

I considered using absolutes, but that clearly can’t be right in general, since in extreme cases it’d lead to negative yardage, which I don’t think is realistic here. It shouldn’t make much difference in this case, though; both methods give similar results (the 19.25% vs. 20% I mentioned).
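The failure mode of the additive method is easy to demonstrate with made-up extreme numbers (purely hypothetical figures, not from the thread):

```python
expected = 300.0  # hypothetical expected yardage
m = 180.0         # hypothetical: A averages 180 yards below expectation
n = 150.0         # hypothetical: B allows 150 yards below expectation

# Adding the shortfalls can go below zero; multiplying the factors cannot.
additive = expected - (m + n)                                    # -30.0: impossible
multiplicative = expected * (1 - m / expected) * (1 - n / expected)  # 60.0: stays positive
print(additive, multiplicative)
```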

OK, if it helps:

Team A averages 276 yards on offense, while their collective opponents’ defenses give up an average of 325.4 yards.
Team B gives up an average of 319.7 yards on defense, while their collective opponents’ offenses average 336.9 yards.

I would think that dealing with percentages would be more accurate. Using slightly more exact 15.2% and 5.1% numbers:

Team A’s 276.0 yards * 0.949 (Team B’s defensive factor) = 261.9 yards
Team B’s 319.7 yards * 0.848 (Team A’s offensive factor) = 271.1 yards

Would it be wrong to simply take the average of those two results (266.5)?
Or should the combination be adjusted further?
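A sketch of the two computations above, starting from the raw averages rather than the rounded percentages:

```python
# Figures quoted in the thread.
a_off, a_opp_def = 276.0, 325.4   # Team A's offense vs. its opponents' defenses
b_def, b_opp_off = 319.7, 336.9   # Team B's defense vs. its opponents' offenses

a_factor = a_off / a_opp_def      # ~0.848, i.e. 15.2% below expectation
b_factor = b_def / b_opp_off      # ~0.949, i.e. 5.1% below expectation

est_from_a = a_off * b_factor     # ~261.9 yards
est_from_b = b_def * a_factor     # ~271.2 yards (271.1 in the post, from the rounded 0.848)
print(round((est_from_a + est_from_b) / 2, 1))  # 266.5 yards
```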

Working against the league average, instead of the opponents’ averages, as Chronos has done:

League average is 333.4 yards (both offense and defense, obviously).
So Team A gains only 82.8% of that, and Team B holds their opponents to 95.9% of that. Multiplied together, we get 79.4%.

League average 333.4 yards * 0.794 = 264.7 yards
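In code, the league-average version also collapses to a single closed form, offense × defense ÷ league, since one factor of the league average cancels; this may be the precise formula being sought, though it's only the algebra of the computation above, not a claim that it's the statistically correct model:

```python
league = 333.4   # league-average yards per game
a_off = 276.0    # Team A's offensive average
b_def = 319.7    # Team B's defensive average

a_factor = a_off / league   # ~0.828
b_factor = b_def / league   # ~0.959
print(round(league * a_factor * b_factor, 1))  # 264.7

# Algebraically, one factor of the league average cancels:
# league * (a_off/league) * (b_def/league) == a_off * b_def / league
print(round(a_off * b_def / league, 1))        # 264.7
```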

Clearly, the numbers are very similar, but this has triggered the anal mathematician portion of my brain and will bug me until I have a precise formula.

I don’t think you can use statistics just against a team’s opponents, if A and B have had different schedules. You need some way to control for the possibility that the reason A’s yardage is bad is that all of the teams they’ve played have unusually good defenses. That’s why I was considering the whole league, not just the opponents.

One game is too small a sample for the averages to come into play. One missed tackle can turn a 10 yard out into a 90 yard touchdown.