This isn’t my field, and I can’t pretend to be up to speed with these developments; but from what little I know, there’s been a kind of revolution regarding the way computations are done in particle physics, and the ‘amplituhedron’ seems to be the culmination, or at least the latest development, in this direction.
So, basically, the classical way you compute the amplitudes (which give you the probabilities) for certain scattering processes, i.e. for certain ways particles interact with one another, is by means of Feynman diagrams, which are often read as symbolic representations of one particular kind of interaction (sometimes they are even presented as depicting ‘what really goes on’, but this is grossly misleading). In fact, Feynman diagrams are graphical representations of the terms in a certain kind of approximation to the ‘full’ expression (which in general can’t be calculated exactly), a so-called perturbation series. (The often-mentioned ‘virtual particles’ are things that show up in this approximation, by the way.)
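Schematically (this is just generic notation on my part, nothing specific to these developments), the perturbation series expands the full amplitude in powers of a small coupling constant $g$, with the diagrams of a given loop order making up one term of the expansion:

$$
\mathcal{A}(g) \;\approx\; g^{k}\,\mathcal{A}_{0} \;+\; g^{k+2}\,\mathcal{A}_{1} \;+\; g^{k+4}\,\mathcal{A}_{2} \;+\; \dots
$$

Here $\mathcal{A}_{L}$ stands for the sum of all Feynman diagrams with $L$ loops, and in a typical theory each extra loop order costs two more powers of the coupling; what one actually computes is this series truncated at some finite order.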
The problem is that the number of these terms grows very quickly with the ‘order’ of the approximation; that is, the better you want your approximation to be, the more diagrams you have to calculate, which in general becomes intractable quickly. The same goes for the number of interacting particles you have to consider. But in recent years, a couple of techniques have emerged that allow one to shorten these calculations dramatically, in a sense doing many Feynman diagrams at once; one is, for example, the so-called ‘unitarity method’ due to Zvi Bern et al., and another, and I believe more direct, precursor to this amplituhedron thing is the set of BCFW recursion relations.
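For what it’s worth, the basic structure of the BCFW recursion (very schematically, glossing over the complex momentum shift that makes it all work) is that an $n$-particle tree amplitude is assembled out of products of smaller on-shell amplitudes:

$$
A_{n} \;=\; \sum_{\text{channels}} \hat{A}_{L}\,\frac{1}{P^{2}}\,\hat{A}_{R}\,,
$$

where the sum runs over ways of splitting the external particles into a ‘left’ and a ‘right’ set, $P$ is the total momentum flowing between them, and the hats indicate that the sub-amplitudes are evaluated at suitably shifted (complex) momenta. The point is that one never has to enumerate individual Feynman diagrams.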
The big question is, of course: is this just a nifty trick, or a hint at something more fundamental? I think at the moment nobody really knows; there is an interesting relation between calculations using the unitarity method, which are fundamentally quantum field theoretical, and certain calculations in general relativity, a theory nobody knows how to bring into the mold of quantum field theory (it’s claimed that gravity can be seen as a ‘double copy’ of a QFT), but I’m unsure to what extent this has been explored yet.
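The prototypical example of such a ‘double copy’ statement (the old Kawai–Lewellen–Tye relation, here at four points and up to conventions for the overall factor) expresses a graviton amplitude as a product of two gauge-theory amplitudes:

$$
M_{4}^{\text{grav}}(1,2,3,4) \;=\; -\,i\, s_{12}\, A_{4}(1,2,3,4)\, A_{4}(1,2,4,3)\,,
$$

with $s_{12}=(p_{1}+p_{2})^{2}$ and the $A_{4}$’s being colour-ordered gauge-theory amplitudes; the modern unitarity-method calculations exploit generalizations of this kind of squaring relation.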
What gives me pause regarding this amplituhedron thing is that it’s (apparently) letting go of unitarity (I don’t care so much about locality). First of all, I’m not sure how this is possible; after all, unitarity is built into the very foundations of quantum theory. That you should need to let it go to calculate scattering amplitudes seems strange, at least conceptually. Second, I just can’t see how to make sense of a theory in which unitarity doesn’t hold: probabilities that don’t add up to one, i.e. that don’t satisfy the constraint that something always happens, just don’t really make sense to me. For some reason, however, this kind of speculation has become much more common in recent years; when Hawking proposed that unitarity might be violated in black hole evaporation, it was still sufficiently anathema that people immediately thought the idea couldn’t be right (which led to much brilliant theoretical work in order to smooth out this flaw, successfully so in my view, and likewise in Hawking’s, who famously conceded a bet on the matter).
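Just to spell out what letting go of unitarity would mean (this is standard quantum mechanics, nothing specific to the amplituhedron): the S-matrix satisfies

$$
S^{\dagger}S \;=\; \mathbb{1}
\qquad\Longrightarrow\qquad
\sum_{f}\,\bigl|\langle f\,|\,S\,|\,i\rangle\bigr|^{2} \;=\; 1\,,
$$

i.e. starting from any initial state $|i\rangle$, the probabilities of all possible final states $|f\rangle$ add up to one, which is exactly the ‘something always happens’ constraint I mean.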
One thing I don’t know anything about, and would like a passing particle physicist (Pasta?) to comment on, is in what sense, if at all, this kind of thing can be considered analogous to the old S-matrix programme; there seems to be at least a kind of conceptual similarity, in that the emphasis is placed on the calculation of scattering amplitudes above all else, and the S-matrix programme featured a similar rejection of the space-time picture (though of course the S-matrix was strictly unitary).