How much did computers contribute to the field of physics?

I can think of many aspects of biology that have ridden on the coattails of the significant rise of computing power in the past decades (genetics, neuroscience).

But what about pure physics (not applied physics / engineering)? Can anyone name any significant advances in the field?

They have been very useful in demonstrating and testing theories in physics; the Monte Carlo method is one example. It's like this in many areas: the computer doesn't develop the new ideas, it's used as a tool, just as a calculator or slide rule would be, but computers are much more useful, versatile, and efficient tools.
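Just to illustrate the Monte Carlo idea for anyone unfamiliar with it (this is a toy example, not any particular physics calculation): you can estimate π by throwing random points at a unit square and counting how many land inside the quarter circle.

```python
import random

def estimate_pi(n_samples=1_000_000, seed=42):
    """Estimate pi by sampling uniform random points in the unit
    square and counting the fraction inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # Area of quarter circle / area of square = pi/4
    return 4 * inside / n_samples

print(estimate_pi())  # roughly 3.14
```

The error shrinks like 1/sqrt(N), so you need a lot of samples for decent accuracy, which is exactly why this method only became practical with computers.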

A few years ago some scientists calculated the mass of hadrons “from the bottom up” in an attempt to support or debunk the Standard Model of physics (the results supported it).

Doing this required modern computing power. It would not have been possible (except in principle) to do it by hand, and it wasn't even possible until recently, since older computers were not up to the task either.

There are a great many problems physicists tackle nowadays that, before computers, we were forced to gloss over or hand-wave away. But I'm not sure what you consider the boundary between "pure" and "applied" physics. For instance, it's only in the past few years that we've been able to fully model the collision of two black holes, and the gravitational waves produced in the process. Now, we do anticipate being able to detect those gravitational waves soon, and being able to model them is crucial for that, so in that sense one might call it "applied". Then again, there's (as yet) very little practical use for detecting gravitational waves, so maybe that's "pure" after all.

As recently as 1980 problems in Mie scattering had to be solved using clever tricks – the infinite sums gave exact answers, but they converged with great slowness. I recall an issue of the Journal of the Optical Society from 1979 on Meteorological Optics in which the same problem was solved by two very clever means of summing the terms to make them tractable – even with computers, the solutions could only be obtained in this way. Prior to computers, it was hopeless to even try to solve the equations.

Now, thanks to Moore's law, optical calculations of this sort can be done on even a standard, off-the-shelf home computer without all that cleverness, by brute force. What were once tedious optical calculations can be breezed through with speed.
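A toy analogue of that shift (this is not Mie theory itself, just a slowly converging sum, namely Σ 1/k², which converges to π²/6): before cheap computing you'd reach for a clever acceleration trick, whereas today you can just add up millions of terms directly.

```python
import math

def basel_brute_force(n_terms=10_000_000):
    """Sum 1/k^2 term by term -- the 'brute force' approach.
    The tail after N terms is about 1/N, so millions of terms are
    needed for ~7 digits: hopeless by hand, trivial on a PC today."""
    total = 0.0
    for k in range(1, n_terms + 1):
        total += 1.0 / (k * k)
    return total

exact = math.pi ** 2 / 6
approx = basel_brute_force()
print(approx, exact, abs(approx - exact))
```

Ten million floating-point additions take well under a second on a modern machine, which is the whole point: the clever summation tricks of 1979 are now optional.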

Computers have also been useful in solving some classic problems, like the Four Color Map Problem, that had resisted "classical" methods of proof.

Most problems in physics require numerical solutions, and the more computational power we have, the more problems we can actually solve. More specifically, computers really advanced understanding in nonlinear dynamics (really, any time you enter a nonlinear regime).

Applied statistics has been changed so much by modern computing that it's not even remotely the same field it was in the early '90s. So anything that involves significant amounts of data analysis is going to look just as different. In physics, the big change has been the move to Bayesian methods that involve calculations you just can't do without modern computers.
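For the curious, the workhorse behind many of those Bayesian calculations is Markov chain Monte Carlo. Here's a minimal Metropolis sampler (a generic sketch on a toy posterior, not any particular physics analysis): propose a random step, and accept it with probability given by the posterior ratio.

```python
import math
import random

def metropolis(log_post, x0, n_steps=20_000, step=0.5, seed=1):
    """Minimal random-walk Metropolis sampler: propose a Gaussian
    step, accept with probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    x = x0
    lp = log_post(x)
    samples = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0, step)
        lp_prop = log_post(prop)
        # Compare in log space to avoid under/overflow
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy posterior: a standard normal (log density, up to a constant).
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0)
mean = sum(samples) / len(samples)
print(mean)  # near 0
```

The appeal is that you only ever need the posterior up to a normalizing constant, but the price is tens of thousands of evaluations per run, which is precisely the kind of thing that was out of reach before cheap computing.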

That’s not really physics, though.

Computers have allowed physicists to consider non-spherical cows; how pure that is is up to you. Also, much of experimental particle physics is not feasible without computers; again, how pure that is can be debated.

My physics Ph.D. thesis involved computers. I was studying the properties of a structure that consisted of hundreds of atoms. It would have been way too tedious to work out the math by hand, but with computers it was doable.

Fascinating! Thank you for all the replies!