Using the Riemann Hypothesis as an example: it certainly appears to be true (if I recall right, it has been verified by computer for trillions of nontrivial zeros so far, and no exception has been found). However, that's not the same as an actual formal proof.

Likewise, with P vs NP, it certainly appears that P does not equal NP, but again, that is not formally proven.

My question is: while mathematicians wait for a proof of such problems (a proof that may never come), do they simply assume an answer ("let's assume the Riemann Hypothesis is true, and assume P ≠ NP") for the time being when they are doing any work that depends on the Riemann Hypothesis or on P vs. NP?

Yes, all the time. I recently read an article by a mathematician who was part of the team that found that an important conjecture was wrong, even though for decades it had commonly been used as an "if this is true" premise for further studies in the field.

I can’t seem to find it again, and my leaking memory provides no more clues than what’s given above. But the concept drives progress in fields that would otherwise be stalled.

Much of online cryptography (RSA in particular) is based on the assumption that no efficient algorithm for integer factorization exists. So far no such algorithm has been found, but if one is, the online economy will be in major trouble.
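To make that dependence concrete, here's a toy RSA-style sketch in Python (the numbers are far too small for real use, and this skips padding and everything else a real system needs): the public modulus n is a product of two secret primes, and anyone who can factor n can recover the private key immediately.

```python
# Toy RSA: security rests on the difficulty of factoring n = p * q.
p, q = 61, 53            # secret primes (tiny, for illustration only)
n = p * q                # public modulus: 3233
e = 17                   # public exponent
phi = (p - 1) * (q - 1)  # Euler's totient; computable only if you know p and q
d = pow(e, -1, phi)      # private exponent: modular inverse of e mod phi

msg = 42
cipher = pow(msg, e, n)          # anyone can encrypt with the public pair (n, e)
assert pow(cipher, d, n) == msg  # decrypting needs d, hence needs the factors
```

An attacker who factors n into p and q can compute phi and then d exactly as above, which is why an efficient factoring algorithm would break this scheme outright.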

Speaking of the RH, Gary Miller famously established back in 1976 that if the extended RH is true, then primality testing can be done in deterministic polynomial time.
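Here's a rough Python sketch of what that conditional result looks like. The explicit witness bound of 2(ln n)^2 is, if I'm remembering the history right, a later refinement due to Eric Bach under the generalized RH; treat the constant as illustrative. The point is that, assuming the conjecture, checking polynomially many small bases is enough to be *certain*.

```python
import math

def is_strong_probable_prime(n, a):
    # One round of Miller's strong test with base a.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    x = pow(a, d, n)
    if x == 1 or x == n - 1:
        return True
    for _ in range(s - 1):
        x = pow(x, 2, n)
        if x == n - 1:
            return True
    return False

def miller_erh(n):
    # Deterministic primality test, *correct only if* the extended RH holds:
    # under that assumption, checking all bases a <= 2*(ln n)^2 suffices.
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    bound = min(n - 2, int(2 * math.log(n) ** 2))
    return all(is_strong_probable_prime(n, a) for a in range(2, bound + 1))
```

If the extended RH were refuted, this routine could in principle misclassify some composite as prime, which is exactly the flavor of conditional result the thread is about.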

(I knew Gary for many years. It's a shame that some other person got their name attached to a variation of a test that he came up with alone first. Fun fact: at the last CS conference I ever attended, I went to dinner with Gary and a friend of ours.)

There are a lot of such conditional results in Computer Science.

E.g., if Graph Isomorphism is NP-complete, then the polynomial hierarchy collapses to the second level. It seems to sit in between P and NP-complete. (Subgraph isomorphism, by contrast, is NP-complete.)

No. There is a minor field of proving that if the Riemann hypothesis is true, then certain other things follow, and then another minor field of taking those results and proving them without the RH.

Now, applied mathematicians might assume that integer factorization is not in P and that P ≠ NP, and design encryption on those assumptions, but no one claims this is mathematics.

But there is one major assumption. Gödel showed that no consistent axiom system strong enough to express arithmetic can prove its own consistency. So if you don't suppose the consistency of your axioms, you can't do mathematics. On the other hand, since such a proof is known to be impossible, no one spends any time worrying about it.

There’s very little practical application for this, though, since there are already known tests for primality that are both extremely efficient and extremely likely to be correct. In practice, nobody worries about a 1 chance in 2^1024 or whatever that your “tested prime number” is actually composite.
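That's the probabilistic style of test being described. A minimal Python sketch of a Miller–Rabin-type test with random bases: each round catches a composite with probability at least 3/4, so the error probability shrinks geometrically with the number of rounds (the round count of 64 here is just an illustrative choice).

```python
import random

def miller_rabin(n, rounds=64):
    # Probabilistic primality test: a composite n passes a random base
    # with probability at most 1/4, so 64 rounds leave an error chance
    # of at most 4**-64 = 2**-128, which is negligible in practice.
    if n < 4:
        return n in (2, 3)
    if n % 2 == 0:
        return False
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False   # found a witness: n is definitely composite
    return True            # no witness found: n is almost certainly prime
```

Note the asymmetry: a "composite" answer is a proof, while a "prime" answer is only overwhelmingly likely, which is exactly the trade-off nobody loses sleep over.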

Testing whether a number is prime is known to be polynomial anyway (the AKS algorithm), which is theoretically important even if the published algorithm is not one you would use in practice. There have also been results of the form "if such-and-such a conjecture is true, then there is the following more efficient algorithm" (which is the subject of this thread), so people do sometimes at least consider such things.

I haven’t found an example yet, but I suspect the same is true of the Navier-Stokes existence and smoothness problem. It’s certainly the case in the sense that Navier-Stokes is widely assumed to describe fluid behavior without any pathological consequences, but that hasn’t been proven. Some partial bounds have been established, but no one seriously believes that solutions outside that narrow scope generally blow up.

Though no one has proved yet that a classical computer can’t factor a number rapidly.

Every so often it’s shown that a particular quantum algorithm actually offers no speedup over an optimized classical version. The problem is that many classical algorithms aren’t well optimized, and in any case it’s usually impossible to prove how close to optimal they are in the first place. So sometimes a quantum algorithm can seem advantageous until someone takes a hard look at the classical version it’s being compared with.

Right, classical computers can’t run Shor’s algorithm, and Shor’s algorithm is (in principle) much faster than the fastest known classical algorithm (if you can manage to build a quantum computer that can run it), but we don’t know that there’s no better classical algorithm.

Though in at least that case, it’s not for lack of trying. Coming up with an efficient classical factoring algorithm would give you your choice of prizes, either the Fields medal, or the contents of everyone in the world’s bank accounts.

Yes, but Navier-Stokes is an assumption about physics, not math. Do those DEs actually represent physical reality?

I might add to my previous post that one reason for proving theorems of the sort "If RH is true, then X" is that one can then try to refute X, which would refute the RH. That has never happened, of course. The very first such result was by Riemann himself, who showed that if the RH was true, then so was the prime number theorem (PNT). The PNT was finally proved 40 years later (1896) by two mathematicians (Hadamard and de la Vallée Poussin) with somewhat different arguments, but both using analytic function theory and based on a weak version of the RH*. It was then proved without using analytic functions by Selberg and, independently, by Erdős in the late 1940s.

The RH says that all zeroes of Riemann’s zeta function with positive real part lie on the line Re(s) = 1/2. What they proved, and showed sufficient for the PNT, was that all those zeroes lie in the strip 0 < Re(s) < 1. The PNT, by the way, says that asymptotically the number of primes less than n approaches n/log n. (That’s the natural log.) Equivalently, the odds of n being prime approach 1/log n.
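The PNT's slow convergence is easy to see numerically. A quick Python check, comparing a sieve-based prime count against n/log n (the ratio drifts toward 1, but only slowly):

```python
import math

def prime_count(n):
    # pi(n): the number of primes <= n, via a simple sieve of Eratosthenes.
    sieve = bytearray([1]) * (n + 1)
    sieve[0] = sieve[1] = 0
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray(len(range(p * p, n + 1, p)))
    return sum(sieve)

# The PNT says pi(n) / (n / ln n) -> 1 as n grows; the approach is slow
# (the ratio is still roughly 1.08 at n = 10**6).
for n in (10**4, 10**5, 10**6):
    print(n, prime_count(n) / (n / math.log(n)))
```

Better approximations such as the logarithmic integral Li(n) converge much faster, but n/log n is the form the theorem is usually stated in.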

You did see that this was a result from 1976? I guarantee you that this was a big result. And that result is deterministic. He then had a separate result that was poly time probabilistic and which was used in practice (and probably still is).

Another example: the so-called “Kepler conjecture” about the face-centered cubic lattice being the optimally dense sphere packing, finally proved about a quarter-century ago by Thomas Hales et al., was famously described (prior to its eventual demonstration) as something that “most mathematicians believe, and all physicists know”.

But yeah, AFAIK there is no official epistemological status of “provisionally assumed true” recognized in mathematical research. Mathematicians may study potential consequences and implications of a conjectured result, and as other posters have pointed out, that sort of contingent investigation often leads to a lot of important findings. But all those contingent results remain equally conjectural until an actual proof, or disproof, is found.

Depends on how you phrase things. You can have a theorem, completely proven to full satisfaction of all mathematicians, that says “if the Foo Conjecture is true, then the Bar Conjecture is also true”, even before the truth or falsehood of the Foo Conjecture is determined. And that theorem remains true (though much less interesting) even if the Foo Conjecture is later disproven, because a false statement implies everything. Or that theorem might be the conduit through which the Foo Conjecture is disproven, if someone else manages to show that the Bar Conjecture is false, or that Foo also implies Not-Bar.
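The "a false statement implies everything" point can even be checked mechanically in a proof assistant. A minimal Lean 4 sketch (the theorem name and the Foo/Bar propositions are hypothetical, matching the example above): once Foo is refuted, "Foo implies Bar" holds for any Bar whatsoever.

```lean
-- Ex falso quodlibet: given a refutation of Foo, the implication
-- Foo → Bar is provable for an arbitrary proposition Bar.
theorem foo_implies_bar (Foo Bar : Prop) (hnot : ¬Foo) : Foo → Bar :=
  fun hFoo => absurd hFoo hnot
```

This is why such a conditional theorem "remains true (though much less interesting)" after the hypothesis is disproven: the proof above never needs to know anything about Bar.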

So, kind of like saying, “If David owns a huge mansion, then he must be a rich man,” but then later finding independently that David is indeed a wealthy man even though he does not own a huge mansion? Where, basically, If A, then B, but B was later proven independently without any A?