Really the only way to figure it out is to analyze actual last-digit distributions. A one-point score is also technically possible in the NFL (via a one-point safety), of course, but it has never happened. The thing is, though, that scoring patterns in the NFL or CFL (or college football, or whatever) aren't driven by the rules alone, but by overall offensive levels, trends in play calling, the capabilities of the specific teams on the field, and even the weather on game day; some kinds of bad weather tend to reduce scoring, which changes the last-digit likelihoods.
There are, for instance, several games enshrined in football legend as The Mud Bowl: the 38th Grey Cup, the January 1983 AFC Championship Game, and a few others. ALL were low scoring, and low-scoring games make it likelier that final scores end in 3, 7, or 0.
Do you think the distribution of last score digits is different across Super Bowls as opposed to regular-season games? If we want to predict the odds of winning squares for the upcoming Super Bowl, what set of previous games should we analyze?
Hmm, eight scores per game (both teams' cumulative totals at the end of each quarter) for 51 games to enter into Excel. I've built bigger databases. And yes, expanding the data set to include other playoff games would give more indicative results.
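If anyone would rather skip the hand entry, a few lines of Python will do the tally. This is just a sketch, assuming each game's eight numbers are stored as quarter-end (team A, team B) score pairs; the games listed below are made-up placeholders for illustration, not real Super Bowl results.

```python
from collections import Counter

# Each game is a list of cumulative (team_a, team_b) scores at the end of
# each quarter -- eight numbers per game, matching the Excel plan above.
# These few games are made-up placeholders, NOT real Super Bowl results.
games = [
    [(7, 0), (10, 3), (17, 10), (24, 17)],
    [(3, 7), (6, 14), (13, 14), (20, 21)],
    [(0, 0), (7, 10), (14, 13), (17, 20)],
]

# Tally how often each last digit shows up across all quarter-end scores.
digit_counts = Counter(
    score % 10
    for game in games
    for pair in game
    for score in pair
)

total = sum(digit_counts.values())
for digit in range(10):
    share = digit_counts[digit] / total
    print(f"last digit {digit}: {digit_counts[digit]:3d} ({share:.1%})")

# Squares pay on the *pair* of last digits, so tally those as well.
pair_counts = Counter(
    (a % 10, b % 10)
    for game in games
    for (a, b) in game
)
print("most common squares:", pair_counts.most_common(5))
```

Swap in the real quarter-end scores (Super Bowls only, or the wider playoff set) and the digit shares are what you'd use to judge which squares are actually worth having.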
But I suspect this work has already been done by someone. We are a nation chock full of crazy people.