Good book about history of American football

Can anyone recommend one? I'm not so much interested in the competitive side, meaning who won the championships or rundowns of the great teams/seasons and so forth. I'm more interested in how the game itself - the rules, equipment, culture - and its promotion and popularity have changed over time, and why.

They don't quite cover all of those bases, but How Football Explains America and America's Game: The Epic Story of How Pro Football Captured a Nation would be good places to start.