What is the best book on the history of Hollywood and American cinema, covering early developments, major milestones, major actors/directors/awards/scandals, and the general scene until, say, the '70s?
I have a simple recommendation for you. Go to the library. Find the movies section. Pick out several books on the history of movies. Be sure to include at least one written by Roger Ebert. Read a few and determine which author or authors give you the best feel for the areas you're most curious about. Then look for other works by that author or those authors.
Try A History of Narrative Film by David Cook (erm, I think it’s Cook).
It is a very comprehensive book that is a staple for film students. It is not just about Hollywood, but all of cinema. You'd need such a broad perspective anyway, because so much that has influenced Hollywood comes from very non-Hollywood sources (e.g. "Spaghetti Westerns" are so called because major Italian film directors re-invented the western genre).
I don’t think you’re going to find one book that covers the whole period. There is an excellent series, History of the American Cinema, wherein each volume covers a certain period. You can find a lot of them here.
The books listed in that series are by Bowser, Musser, Crafton, Koszarski, Prince, Balio, Jowett, and Schatz.
It's a terrific series with lotsa photos; it delves into all aspects of Hollywood and may in fact have more than you'll want to know. Make sure whatever you get is from that History of the American Cinema series: here is the publisher's link.
Martin Scorsese's book, A Personal Journey with Martin Scorsese Through American Movies, is a companion to his BFI documentary series. Both are extremely informative and have greatly enhanced my appreciation of American films, as well as film in general.