A bit off topic perhaps, but as your question seems to have been answered I would like to point out that when someone confuses speed and velocity in English it is often due to sloppy translation, mostly from a Latin language (French, Italian, Spanish…) or Latin itself. As an example: the proper translation of the speed of sound into Spanish is la velocidad del sonido, but the right translation of la velocidad del sonido into English is not the velocity of sound.
In Newton’s time many measurements and values of constants were inaccurate and/or disputed.
It seems – as best I can make out from a quick skim of parts of Westfall’s book – that in certain cases Newton chose only the specific data values that suited him, and then used them in calculations to ‘prove’ the accuracy of his theories.
It wasn’t that he was falsifying data, or even omitting data, it was that he was not taking into account the large margins of error in the data. He was claiming a degree of accuracy that he had no right to claim at the time.
(Of course, it turned out that he was right!)
Hmm, \dot{x}—\dot{x}—seems to work fine for me? (On the other hand, quoting posts with embedded MathJax doesn’t work for shit…)
That story is also told very well in The Quantum Astrologer’s Handbook by Michael Brooks, a scientific biography of Cardano and an introduction to some elementary concepts that went into the creation of quantum mechanics (mainly the probability calculus and imaginary numbers).
He gets credit for being the first person ever to realize apples fell straight down. All of physics is built upon that observation.
Read the whole thread. Just some observations…
Newton’s “dot” notation for calculus is still used today in physics, just as he invoked it – to identify derivatives with respect to time. Of course, for all other applications of the calculus, we lean towards Leibniz’s more elegant notation.
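For anyone following along, the two notations express exactly the same thing; the dot is just shorthand for a time derivative:

```latex
% Newton's dot notation vs. Leibniz's notation for time derivatives
\dot{x} \equiv \frac{dx}{dt}, \qquad \ddot{x} \equiv \frac{d^2x}{dt^2}

% e.g. the second law written both ways:
F = m\ddot{x} \quad\text{vs.}\quad F = m\,\frac{d^2x}{dt^2}
```

The dot form is compact when the independent variable is understood to be time, while Leibniz’s form makes the variable explicit, which is why it dominates everywhere else.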
I’ve seen no study on scientists who have falsified data in support of what turned out to be a correct hypothesis. For all we know, it’s the majority of correct new ideas. You can imagine the thinking: “I’m quite sure my idea is correct, but these data need my adjustments to help reveal this fact.”
Agreed: “the velocity of sound” is an unforgivable phrase for an elite, highly degreed biographer of a physicist.
Lastly, for the NYTimes Guest Essay to cite Newton in an opening sentence and in the same discussion as Elizabeth Holmes is nothing short of clickbait. And we’ve all created a vibrant thread because of it, so I guess it’s working. The author could have chosen among many, much better examples that span centuries to illustrate the comparison: cold fusion, N-rays, Piltdown Man, vaccines & autism. The list goes on and on.
Respectfully Submitted
Neil deGrasse Tyson, NYC
Who’s this guy??
Welcome, Neil! Thanks for joining, hope you stick around!
The term ‘velocity’ was used by Newton himself, and throughout the first English translations of the Principia.
The Latin word that Newton used was ‘velocitas’, and the English translations simply followed that, using the English equivalent. At the time, the term (in either English or Latin) did not imply a vector. It was interchangeable with ‘speed’.
The OED gives the earliest use of velocity specifically defined as a vector quantity as 1847.
Perhaps it’s natural that a biographer of Newton, immersed in the documents, letters, and books of his time, should follow the usage of the time.
Watch out, we got a badass over here!
Happy to grant a hall-pass to Westfall on those grounds.
-NDTyson
Several years earlier.
Well, Neil, I hate to contradict you, but let me introduce you to a book called Fabulous Science: Fact and fiction in the history of scientific discovery, by John Waller, a historian of science and medicine, published by the Oxford University Press (2002).
You may be surprised, Neil, that the entire first half of this book carries the section heading Right for the Wrong Reasons. The second half carries the section heading Telling Science as it [really] Was. Which gives you some idea of where this book is headed, and suggests you may find there some of what you’re looking for.
Here, Neil, take a look at this summary on Amazon:
The great biologist Louis Pasteur suppressed ‘awkward’ data because it didn’t support the case he was making. John Snow, the ‘first epidemiologist’, was doing nothing others had not done before. Gregor Mendel, the supposed ‘founder of genetics’, never grasped the fundamental principles of ‘Mendelian’ genetics. Joseph Lister’s famously clean hospital wards were actually notoriously dirty. And Einstein’s general relativity was only ‘confirmed’ in 1919 because an eminent British scientist cooked his figures. These are just some of the revelations explored in this book. Drawing on current history of science scholarship, Fabulous Science shows that many of our greatest heroes of science were less than honest about their experimental data and not above using friends in high places to help get their ideas accepted …
… distortions of the historical record mostly arise from our tendency to read the present back into the past. But in many cases, scientists owe their immortality to a combination of astonishing effrontery and their skills as self-promoters.
Thanks. I was thinking more of an academic/statistical study of the phenomenon, which I’ve never seen. But the Waller book is an excellent start – which I had clearly missed. Thanks.
-NDTyson.
I feel I should add a general comment for the sake of clarity. Neither my previous comments nor the book I referenced should be interpreted in any way as a criticism of the tremendous success of the modern scientific method and peer-review protocols. That would be totally the wrong interpretation. What it says, rather, is that we tend to see distant historical events through the prism of hindsight and subsequent developments, and this is not always an accurate lens through which to view these distant events and the people behind them.
In science, as in politics, we tend to venerate transformational pioneers to an extent that is often not justified. Our great scientists of the past, just like some of our nation’s founders and great national leaders, are often elevated by history to the status of infallible sanctity, whereas in fact in many respects they were just flawed people like everyone else, made mistakes like everyone else, and were sometimes motivated by less than honourable intentions. This broad historical perspective is not meant to be a criticism of the amazing progress and successes of modern science.
How would you know the results of such a study were accurate?
How do Romance languages distinguish between “velocity” the vector and “speed” the scalar?
And Leibniz’s notation is definitely more useful than Newton’s, but I’m not sure to what extent one can describe the “methods” outside of the context of the notation.
In French, speed is vitesse and velocity is vélocité. I assume they have the same scalar and vector meanings. I accept that Newton used the Latin cognate of velocity, but that is no excuse for a modern writer to.
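For clarity, the modern scalar/vector distinction the thread keeps circling is simply that speed is the magnitude of the velocity vector:

```latex
% velocity is a vector; speed is its magnitude (a scalar)
\vec{v} = (v_x, v_y, v_z), \qquad
v = |\vec{v}| = \sqrt{v_x^2 + v_y^2 + v_z^2}
```

So two cars moving at 60 mph in opposite directions have the same speed but different velocities, which is exactly the distinction that did not yet exist in Newton’s (or his translators’) usage.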
As an aside, this reminds me of an argument I had with a friend over whether it was correct to translate the Latin word probare (to test) by the English cognate prove. It is not.
Two or more independent studies that align with the original results are all that’s required here. The risk is that if the results of the first study are obvious or expected or not interesting, it doesn’t attract other researchers to verify it. That happens too.
-NDTyson
But the exception proves the rule!
This post would be even better if Chuck Nice were cracking gags around it.
I have it on good authority that he knew nothing.