You’re not intelligent and/or educated enough to understand why your paper is wrong. You’re the Dunning-Kruger effect in the flesh; enjoy your life tilting at windmills.
Anyone else care more about the validity of an argument than about whether the person promoting it is the smartest person in the world? After all, arguing one way or the other about a person’s intelligence constitutes either an appeal to authority or an argumentum ad hominem, which contributes approximately nothing. Establishing the validity of an observation or theory can be fundamentally important, however… and one method of doing so that’s been brought up is submitting the argument for actual peer review (i.e., field specialists, rather than self-proclaimed genius polymaths).
Edit: Oh yeah, Gardner also points out that a more useful definition of intelligence has to do with productivity, something that Chris Langan doesn’t regularly demonstrate, and on which I’d argue Vos Savant falls behind Paul Erdős, for example. So even from the perspective of an appeal to authority, one is better off with the collective intelligence of a journal than with an individual.
According to Don’s rejection letter, they had better things to use their space for.
It’s a conspiracy, I tell you!!!
If you check out his page, you also find that he claims that either this proof or another proof he’s discovered makes algebra obsolete. (It’s the third letter he wrote, counting from the top of the page)
:rolleyes: This is the internet, not a refereed journal. Nobody’s reputation matters, what matters is the content of the proof.
Almost 1.
(Okay, there was one Galileo, but the common understanding of that situation is distorted enough that even he doesn’t really count.)
My question: what happens when Marilyn Vos Savant, Chris Langan, and Cecil Adams all three post a column on it, just to get their $50,000? I think someone’s math has a flaw in it.
“Hell with those scientific journals! It’s all a racket anyway! I’m going to get the endorsement of a fictional person!”
Don:
I am not, nor do I believe others here are, qualified to look at your proof and judge it. We’re simply not math experts.
What you need to do is find a science chat forum and explain your proof there.
Wait a second! It looks like you already have. And, in several places.
Both of these threads have found what look like, to a layman like myself, problems with your proof. I keep seeing the terms contradiction and errors popping up in these threads. I keep seeing people give examples of how your assumptions are invalid. Again, I am not a mathematician. I never got much past basic calculus, and I would not be able to derive or take the integral of something if the fate of planet Earth depended upon it. However, if I have to judge between you and the dozens of people who responded to you and claim there are problems with your proof, I’d have to side with those dozens of people.
You obviously know more math than I do. People on the above forums recognized that you weren’t spouting utter nonsense. They took the time to look at your proof and to critique it. That’s more than they’d do for any of my posts. I admire your grit and determination. You certainly learned a lot of math on your own.
However, unless there’s a massive conspiracy out there denying you the recognition and credit you deserve, I would say that the simplest explanation is that your proof has some major issues and those who understand math have pointed them out to you over and over.
I understand why you find this hard to accept. You’ve put a lot of effort into this endeavor, and you have a lot of faith in yourself. That can be good. However, you are an intelligent man, and the time has come to recognize that maybe your proof isn’t all that you assume it is.
Think about how long you’ve been pushing this proof. The posts I’ve seen date back to 2009, and the earliest mentions are from 2007. That’s a long time to push this particular issue and get nowhere.
You know a lot of math, and maybe if you understood it a bit better, or could explain it in a more formalized way, you could correct the problems in your proof and actually win that prize. Why not see if you can get into a graduate program at a university and work toward a master’s or PhD? Maybe you’ll learn about new fields in math which could help you with some of the holes in your theory.
Otherwise, you will become famous as nothing more than this year’s crank and become another Internet meme that most people want to forget. I’d hate to see such a fate for someone with your intelligence and drive. Put your theory away for a while, and get the formal training you need. Spend a few years with that, then look at your theorem one more time. Maybe you will decide to discard it as rubbish. Maybe you will want to rewrite it and correct the errors that others claim to have seen.
Before you think I’m pushing you off as a crank, I want you to look at the story of Ron Mallett. Ron Mallett wanted to build a time machine in order to go back in time and save his father, who had died when Ron was a boy. He has had this dream since he was 10 years old.
Ron Mallett has actually become an expert in space-time theories and a brilliant physicist. Ron actually thinks a time machine is theoretically possible, and many other brilliant physicists think he might be right.
I wish you a lot of luck on your proof and don’t want to see you waste your life hitting your head against the wall trying to prove it’s correct. Take some time for formal training. You will be exposed to a lot of smart people, which in itself makes the endeavor worth it. You will learn new things, and maybe you will some day actually win the recognition you want.
Don Blazys: Throwing around personal insults is a fast-track way to get removed from this board.
If you can’t be civil, you’ll have to leave.
Same thing. “Attack the argument, not the poster.” You’ve been here a while and should know better.
Pssst…“Cecil” is not a real person.
Agreed. Of course, in that case, their “flaws” were either pointing out that the evidence presented so far was insufficient, or their “flaws” were incorrect.
Or are you talking about the theological issue, which isn’t science or mathematics?
Don, I read through the discussion linked by qazwart (the only one I can pull up right now). I see that you are trying to approach the Beal Conjecture from the reverse angle: you are trying to show that for cases where the three terms A, B, C do not have a common factor (prime or otherwise), the only solvable cases are those where at least one of the exponents is less than 3.
This would prove the Beal Conjecture in that if there are solutions with all three exponents greater than 2, then the terms must have a common factor.
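For anyone following along, the conjecture itself is easy to probe numerically. Here is a rough sketch (the bounds are arbitrary and `beal_hits` is just an illustrative name): search for A^x + B^y = C^z with all three exponents at least 3, and check whether every hit shares a common factor, as the conjecture predicts.

```python
from math import gcd

def beal_hits(max_base=20, max_exp=5):
    """Brute-force small solutions of A^x + B^y = C^z with x, y, z >= 3."""
    # Precompute perfect powers c^z for the right-hand side.
    powers = {}
    for c in range(2, max_base + 1):
        for z in range(3, max_exp + 1):
            powers[c ** z] = (c, z)
    hits = []
    for a in range(2, max_base + 1):
        for x in range(3, max_exp + 1):
            for b in range(2, max_base + 1):
                for y in range(3, max_exp + 1):
                    total = a ** x + b ** y
                    if total in powers:
                        c, z = powers[total]
                        hits.append((a, x, b, y, c, z, gcd(gcd(a, b), c)))
    return hits

for a, x, b, y, c, z, g in beal_hits():
    # e.g. 3^3 + 6^3 = 3^5, where A, B, C share the factor 3
    print(f"{a}^{x} + {b}^{y} = {c}^{z}, common factor gcd = {g}")
```

Every hit in this small range does share a common factor; of course, no amount of brute force proves the conjecture, which is exactly the distinction being argued in this thread.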
It seems to me the area of concern is where you invoke the logarithms. Now, I agree that the way you are working your proof, you are cancelling the log terms before plugging in the identity for C = T. If I understand the concern that keeps popping up, you are identifying a special case and showing that for that special case the equations solve, thus proving that the solutions work for that case. That case sets one of the exponents to 2, which is less than 3. Ergo, the only way a solution can be found with this method is if the Conjecture is true.
What I think is the concern is that you haven’t completely proven the Conjecture, because you haven’t completely disproven the case where z ≠ 2. Ergo the proof isn’t complete.
You have shown that the Conjecture appears to work, but you have not shown counterexamples don’t work, you’ve just shown that your method of proof cannot solve the counterexamples.
I don’t think it’s kosher to state “my method cannot solve the counterexamples, therefore the counterexamples cannot be true”.
One of the things they try to stress in formal algebra classes is that if you reach an expression that is undefined, then you cannot proceed with evaluating that expression, because the answer will be invalid. Undefined means exactly that - you cannot evaluate it, it can give you any answer.
Dividing by zero is one such undefined case. Once you hit a divide by zero, you have to quit. It won’t work.
One of the comments made was a reference to the “well known” proof that 1 = 2. There is a math gimmick/challenge/teaching example floating around: a “proof” that 1 = 2. I don’t recall it specifically (I could look it up, but don’t care enough), but it is a series of algebraic steps that all seem to work and end up showing that result. The challenge/lesson is to figure out where the proof is broken. Somewhere in the chain of logic there is an expression that evaluates to a division by zero, but it is masked by the format of presentation, so you don’t see “Here I’m dividing by zero.”
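A common version of that trick (not necessarily the exact one being referenced) runs like this; the hidden break is the division by a − b, which is zero since a = b:

```latex
\begin{align*}
a &= b \\
a^2 &= ab \\
a^2 - b^2 &= ab - b^2 \\
(a+b)(a-b) &= b(a-b) \\
a + b &= b \qquad \text{(dividing both sides by } a - b = 0\text{)} \\
2b &= b \\
2 &= 1
\end{align*}
```

Every step except the fifth is legitimate algebra, which is exactly why the trick works on people: the division by zero is dressed up as an ordinary cancellation.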
That is what the mathematicians are trying to explain to you. Your expression involves a step that is undefined. That means your proof is broken at that step. Dividing by zero gives an incalculable result, and using it in a proof yields an invalid step.
Of course if you want my endorsement for your proof, you can have it. The proper citation is “some guy posting on the internet under a pseudonym” and my credentials are “I read it somewhere” and “I said so”.
Is that not equivalent to “some guy with a newspaper column who posts under a pseudonym” with credentials “I bill myself as the World’s Smartest Human” and “I said so”? I forget. Maybe if I divide by zero…
Is the OP familiar with the pi guy?
When come back don’t bring pi.
This sentence is a perfectly concise explanation of why the proof is wrong.
No wait. One of those ‘letters’ is an email… that has been printed out… and scanned.
Well, yeah. The bellboy kept the $2. Hell, I ain’t a mathematician, and I knew that. Don’s theorem proves that HE kept the $1. :eek:
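For anyone who hasn’t run into the riddle being referenced: three guests pay $10 each for a $30 room; the clerk realizes the room is only $25 and sends $5 back with the bellboy, who pockets $2 and returns $1 to each guest. The “missing dollar” comes from adding numbers that shouldn’t be added. A quick sanity check:

```python
# Each guest paid $10 and got $1 back, so paid $9 net.
paid_per_guest = 10 - 1
guests_total = 3 * paid_per_guest   # $27 total out of pocket
hotel_keeps = 25
bellboy_keeps = 2

# The fallacy adds the bellboy's $2 *on top of* the $27, then asks where
# the 30th dollar went. But the $2 is already inside the $27:
# $27 paid = $25 to the hotel + $2 to the bellboy. Nothing is missing.
assert guests_total == hotel_keeps + bellboy_keeps
print(f"${guests_total} paid = ${hotel_keeps} hotel + ${bellboy_keeps} bellboy")
```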
Chris Langan, who I had to look up, has a logical proof that God exists.
Is there a need to say any more?
Chris Langan is Liberal and I claim my £5.
Charing Cross to Knightsbridge.
On second thought, forget I said anything. I see now that this way lies madness.
Eh? While I find it easy to believe, is this official? In other words… cite?