Newton a fraud?

Direct quote from the opening of an editorial in today’s (1/5) NYTimes:
“In his 1993 biography of Isaac Newton, Richard Westfall argues that parts of Newton’s watershed work, the Principia, are ‘nothing short of deliberate fraud.’ True or not, it is clear that Newton made compromises in service to his vision.”

Does anyone here know what that’s all about? Who is Richard Westfall? What compromises did Newton make? I know that he used calculus to make his great discoveries, but then avoided using the then-new and basically unknown calculus to present them, using strictly geometric arguments instead.

Yes, originally calculus was done using infinitesimals, which did look like “ghosts of departed quantities”. It took Cauchy in the early 19th c. to free calculus of that heritage and Robinson in the mid 20th to put infinitesimals on a sound basis. But the history hardly rises to the level of fraud.

The quote above was the opening lines of an op-ed piece about Elizabeth Holmes.

He more than just used calculus. It’s my understanding Newton essentially invented calculus in order to solve the problem of planetary orbits. And he invented calculus at about the same time as Leibniz.

The claim I’ve seen made is that Newton used “fluxions” rather than what we would recognize as calculus (although practically very similar), and that modern calculus basically all descends from Leibniz’ work.

I checked Google Books for “fraud.” Nothing relevant came up, but look at pages 733 and 734 of Never at Rest:

Newton was engaging in a form of quantitative polemics, bolstering his philosophy by pretending to a degree of accuracy beyond any legitimate claim.

Already in Book II, he had carried out another computational sleight of hand to give a similar pretense of precision.

I’m not going through 900 pages to dig out the exact meaning of this, but it seems to me to be the stuff that the column was talking about.

That’s very far from fraud.

Fluxions are derivatives under a different name. The main difference between Newton and Leibniz was notation. What Newton called dotted x (standard tex is \dot{x}, but markdown doesn’t do it right), Leibniz called \frac{dx}{dt}.
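To spell out the correspondence the post describes, here is the dictionary between the two notations, written out in LaTeX (a sketch for illustration; the second-law example is mine, not from the thread):

```latex
% Newton's fluxion notation vs. Leibniz's differential notation,
% for derivatives of x with respect to time t:
\dot{x} = \frac{dx}{dt}, \qquad \ddot{x} = \frac{d^{2}x}{dt^{2}}
% So, for example, the second law F = ma can be written
% F = m\ddot{x} (Newton) or F = m\,\frac{d^{2}x}{dt^{2}} (Leibniz).
```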

It’s unquestionably “compromises in service to his vision.”

Who was Richard Westfall? He died back in August of 1996 and was considered the foremost biographer of Newton. He had an M.A. and a Ph.D. and taught at several universities, according to Wiki.

From wiki

Westfall considered Newton a driven, neurotic, often humorless and vengeful individual. Despite these personal faults, Westfall ranked Newton as the most important man in the history of European civilization.

So I guess whoever wrote the article was looking for a provocative statement in order to provide a hook to pique the interest of readers.

Sometimes such writers are gracious enough to expand on the quote and provide a broader context somewhere towards the end of the article. But sometimes not.

I doubt whether Richard Westfall would be very impressed.

I thought it was more that modern calculus is closer to Newton’s approach than Leibniz, but we ended up mostly using Leibniz notation for the concepts.

Off-topic, since the OP has been answered:

If anyone is interested in the history of mathematics, I found this engrossing and excellently presented video the other day, about the invention of imaginary numbers.

There’s a lot about the solution of the cubic equation in the 16th century, and the geometric approach to what we would call algebra, before modern algebra was invented.

Don’t be put off by the graphic or the title – this is a quality video.

Newton was certainly an asshole (and a complete dick to Hooke, although the dickery may have been mutual) and had some odd ideas about alchemy, but he wasn’t a fraud.

I don’t know if everyone will be able to access them, but this is a wonderful little series of vignettes that introduces some of the key mathematical figures from Cantor to Ramanujan (the closest to A-Z that I could get).

Incredibly brilliant, but yeah, he was an odd dude. Very cold and antisocial. Didn’t seem to have much use for other people. Am guessing he had some kind of personality disorder.

Did Newton also introduce the ’ notation, as in u’ = du/dx, Hari?

Dunno. We use both of course, depending on the context. As I understand the dot is used mostly (maybe always) when the independent variable is time.

The upshot of this thread is that the original quote was so much BS. Shame on the failing NYTimes.

As to whether we do calculus mainly in the spirit of Newton or Leibniz, that is not obvious. I think we mostly teach it in the spirit of Newton, but most mathematical analysts think using infinitesimals and then write it in Newtonian language. But you would have to check with a math historian, which I am not. I do abstract algebra (category theory if you know what that is) anyway.

It sure is. It was fascinating. Of course I knew the solution of the cubic already, but I didn’t understand it geometrically. Curiously, even for the equation x^3-x=0, whose solutions are \pm1 and 0, complex numbers show up when you solve it using Cardano’s method.
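The post above can be checked numerically. Below is a minimal sketch in Python (my own illustration, not from the thread) of Cardano’s formula applied to the depressed cubic x^3 - x = 0, i.e. x^3 + px + q = 0 with p = -1, q = 0. The quantity under the square root comes out negative, so the intermediate values are genuinely complex (the “casus irreducibilis”), yet the root they produce is the real number 1. Sign conventions for Cardano’s formula vary between textbooks; this is one common form.

```python
import cmath

# Depressed cubic: x^3 + p*x + q = 0. Here x^3 - x = 0, so p = -1, q = 0.
p, q = -1.0, 0.0

# Cardano's radicand (q/2)^2 + (p/3)^3. For x^3 - x it equals -1/27:
# negative, even though all three roots (0, 1, -1) are real.
disc = (q / 2) ** 2 + (p / 3) ** 3

# Principal complex square and cube roots.
u = (-q / 2 + cmath.sqrt(disc)) ** (1 / 3)
v = -p / (3 * u)  # companion term, chosen so that u * v = -p/3

x = u + v  # one root of the cubic
print(u, v, x)  # complex intermediates, (essentially) real result
```

Running this, `u` and `v` are complex conjugates with nonzero imaginary parts, and their sum `x` is 1 up to floating-point error: the complex numbers cancel out, just as Cardano’s 16th-century successors found to their puzzlement.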

On the other hand, he was elected by his peers to lead the Royal Society for 24 years. Either they were equally bonkers so he didn’t stand out much among them (which, given the fumes and heavy metals that most were probably being exposed to regularly, isn’t entirely implausible) or they felt he was a reasonable enough person to serve in a position of power.

A person’s personality might be moderated more in face-to-face interaction than in the private journals, etc., that a historian would use to know them by?

The “fraud” label is specifically about some of his work in Principia. The pulled Westfall quote came from this line: “[Newton’s] use of the ‘crassitude’ of the air particles to raise the calculated velocity by more than 10 percent was nothing short of deliberate fraud.” So it was a little more specific than it appears without context.

Here’s the chapter from Westfall’s biography with the quote in question. It’s more about Newton being a master of the fudge factor and unjustified precision in his calculations. In context, it reads more like hyperbole used to describe some of the flaws in Principia.

https://web.archive.org/web/20150226043014/http://web.centre.edu/muzyka/articles/Westfall1973.pdf

I read about half of it. It seems that the real story was that Newton’s first edition did not rely so much on fudge factors, but created such a furor that he felt he had to claim greater precision to convince people, and so did fudge a bit. So what? Calling it fraud really misinterprets what was happening. And this by a man who commits an error that no physics 101 student should: confusing velocity with speed. I guess he thought it more erudite to do so. I cringed every time he used the phrase “velocity of sound”.

“Deliberate fraud” would be something like falsifying data. I remember such allegations concerning the Oil Drop experiment. Though, checking the Wikipedia article just now, it says that David Goodstein investigated the original lab notebooks and found that no data was excluded. Another type of fudging is simply fabricating data, like when Ninov “discovered” new elements in the 1990s. At any rate, if there are alleged discrepancies, an obvious move would be to examine Newton’s original notebooks.

ETA: as for pure mathematics, while there is no experimental data per se, people have been known to do things like take a paper that had been rejected for publication because the reviewers found mistakes, not bother to fix it, and submit it to another journal where the editors were not so careful. That sounds deliberate to me.