So I’m a big anthropology geek and have just finished a book on Neanderthals. One thing the authors are keen to point out is how quickly the field is advancing and how large the paradigm shifts can be, depending on whether someone finds a tooth next week in Azerbaijan or somewhere, and what the implications of new discoveries turn out to be (more seriously, things like the Denisovans and Homo floresiensis being uncovered when we had no idea about them before).
Obviously it’s just a matter of time for all fields of science, but if an expert in each field authors a book for the layman, which field’s book stands the risk of becoming obsolete the fastest?
I have no idea how to measure the “speed” that a given science is “moving” at, so I can’t suggest anything based on that, but I have a related suggestion for you that is more practical.
If your concern is that you want to buy books that will retain their validity as long as possible, then ignore the pace of change of science. Instead, look for books whose primary focus is to list and describe evidence, and which spend less of their time and space speculating about what the evidence means.
Factual evidence does not change. Interpretation of that evidence does.
Purely in terms of the volume of data being acquired, astronomy is growing so rapidly that more data is coming in than all astronomers currently working have time to process and review. We literally have enough of a data backlog right now to keep the current working population of astronomers busy for decades, and that in an era when the number of astronomical facilities, especially space telescopes, is being reduced (though we are getting near-optimal use of the facilities we do have).
However, the data from genomics will eclipse that (if it hasn’t already), given the ability to sequence not just the genomes of known species but to make differential comparisons of the genomes of individual organisms. We will soon be running into thermodynamic problems with storing and accessing all of the data from genomics using conventional storage techniques, which are not fast enough to store and retrieve the volumes of data required to perform useful analysis.
We currently have the paradigm of “Big Data”, which doesn’t just mean handling a lot of data but finding new ways of performing analysis using non-frequentist (generally referred to as Bayesian) sampling methods just to try to understand what the data means. It is as if, instead of looking at ocean waves and calculating frequency statistics to determine significant wave height, we now have to look at each individual wave and tease out the underlying patterns even though they are constantly changing and fluctuating in nonlinear fashion, a task that is impossible using conventional counting methods. But now we’re dealing with such a volume of data (what I term “Colossal Data”) that the time and energy it takes just to handle and process it is overwhelming, requiring even more advanced methods of hyperparallel processing (dividing parallel processes into different classes or methods to more fully optimize processing effort) and the use of quantum superposition principles to find best-fit solutions that conventional analysis using linear transforms or filters would take lifetimes to produce.
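To make the wave analogy concrete: significant wave height is conventionally the mean height of the highest third of waves in a record, a single summary statistic that throws away almost all of the per-wave information. A minimal sketch in Python, with made-up sample data (the Rayleigh distribution and the numbers here are my own illustrative assumptions, not anything from an actual survey):

```python
import numpy as np

# Hypothetical record of individual wave heights (metres); the values are
# made up purely to illustrate the summary statistic described above.
rng = np.random.default_rng(0)
wave_heights = rng.rayleigh(scale=1.0, size=1000)

# Significant wave height (H_1/3): mean height of the highest third of waves.
highest_third = np.sort(wave_heights)[-(len(wave_heights) // 3):]
h_significant = highest_third.mean()
print(f"Significant wave height: {h_significant:.2f} m")
```

The “Colossal Data” problem described above is the opposite exercise: keeping and analysing every individual wave rather than collapsing them all into one number like this.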
So while astronomy and genomics are providing the largest volumes of data, the field that is ultimately going to have to develop fast enough to let us make use of all of this is information/data science, which will have to draw on revolutions in computation theory and quantum mechanics to keep up. Even at that, we are going to come to a point where the human mind simply cannot absorb enough material to do useful work without artificial augmentation. We’re arguably already there; no research physicist working today does the majority of their work using a whiteboard or pencil and paper, and access to a powerful computer and knowledge of data processing and simulation codes and packages is a sine qua non of doing any kind of research.
To address the specific question of the OP: all books intended for the general public are either obsolete, essentially trivial, or flatly wrong at the time they are published, and textbooks frankly aren’t much better although they go into far more detail. It simply isn’t possible to take the cutting edge of science and explain it in layman’s terms, primarily because the scientists themselves don’t have a comprehensive grasp of material that is ever evolving, and the best they can really do is the kind of summary article you see in publications like Scientific American, which are generally a combination of current research and informed speculation. If you are a professional scientist working in a hotly developing field, you are spending at least a few hours of your day reading articles in a handful of very specific technical journals and struggling to keep up with just the developments in your area of specialty within your field. The era when someone could make near-simultaneous contributions across multiple fields has passed.
If I had to guess, I’d assume a field with a lot of potential data where we just got the equipment to actually start identifying and measuring the data. But just because you can identify data doesn’t mean you can do anything with it.
I wonder how fast neuroscience is advancing. I thought we only recently got the tools to really advance the field (in the last 10-20 years).
Renewable energy seems to be advancing rapidly; its costs are declining quickly compared to other forms of energy generation.
But I really don’t know. Very interesting question.
And of course, even the whiteboard and the pencil are already artificial augmentations to the human mind. People talk about the coming “singularity” in technology, but in fact, we’ve passed the singularity (actually more of a horizon) hundreds of times in human history.
Sure, we’ve been using various tools, from the printing press to the computer algebra system, for centuries to communicate, codify, and organize knowledge, but we’re getting to the point where it is no longer possible for scientists to even fully comprehend the underlying mechanics without resorting to systems that perform analysis in an automated fashion that the users cannot independently reproduce. When I use Numpy to manipulate large matrices of data, it performs more calculations in a second than I could by hand in years, but in principle I know exactly what it is doing. But we’re coming into systems that an individual scientist couldn’t even reproduce or fully understand without spending the equivalent of a separate doctoral education learning about them. As a more concrete example, I know the basics of how a finite element code works, and have even written a solver and several different element formulations, but when I use a geometry optimization system it does a bunch of black-box operations that I can only guess at. On the other hand, the system often gives me “optimized” shapes that are unmanufacturable because it is an idiot machine that doesn’t know anything about fabrication methods, so I still have a job. However, I’m sure it is just a matter of time until someone figures out how to give it rules for manufacturability, and I’ll have to resort to my mutant healing powers, gruff manner, and animalistic fighting skills to survive.
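To put a rough number on the earlier point about NumPy doing “more calculations in a second than I could by hand in years”: a back-of-the-envelope sketch, where the matrix size and the assumption of one hand calculation per second are my own illustrative choices.

```python
import time
import numpy as np

n = 2000
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b                        # a single matrix product, dispatched to the underlying BLAS
elapsed = time.perf_counter() - start

flops = 2 * n**3                 # roughly 2*n^3 multiply-adds for an n x n product
print(f"~{flops:.1e} floating-point operations in {elapsed:.3f} s")
print(f"At one hand calculation per second, that is ~{flops / (3600 * 24 * 365):.0f} years of work")
```

On a typical laptop that product finishes in well under a second, while the equivalent hand arithmetic would run to centuries, which is the scale of the gap being described.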
In the title: “which fields are advancing the fastest”, and in the post: “books for laymen becoming obsolete the fastest”.
I think you’ll find that books for laymen more often talk about the basic stuff which has been understood for decades, not the esoteric stuff at the frontiers of research.
In the mid 80s I worked in a neurophysiology lab that was studying the cellular basis of learning. The field was advancing so quickly that papers were being edited in the time between acceptance and publication just to keep up.
I think that the field advancing the slowest is physics. Bigger and more expensive machines (the LHC, LIGO) are finding less and less. The Higgs boson and a handful of black hole mergers for many billions of dollars. I assume that there are other results (like the failure to find new particles at particular energies), but it seems to me the end of the line for this kind of apparatus.
Meantime, the theory is at a standstill. While string theory is great mathematics, it is making no testable predictions, which means it is not science. In that sense quantum gravity is no closer to having a testable theory than it ever was. And the argument over one universe or many seems to be untestable in principle.
Well, for all of LIGO’s cost, gravitational wave astronomy is still in its infancy. So far we’ve only found the sort of things that we expected to find, but that’s with the first instrument we’ve ever built that even worked at all. There’s still plenty of room for advancement there, and we’d need to explore more of that room before we can justify calling it a bust.
You’re right, though, that the LHC has been a disappointment. I’d heard it said, when it was just starting up, that the most boring possible outcome we could get from it would be to find the Higgs and nothing else… and that appears to be exactly what we’ve gotten.
Do you really? There are an awful lot of layers of abstraction in any computer program, and nobody’s really conversant with all of them. Ultimately, what it’s doing is channeling electrons and holes through impure silicon. As an incidental result of that, some portions of that silicon are acting like switches or valves, and those are in turn arranged into logic gates, and those are used to provide flow control constructs (controlling the flow of program execution, now, completely different from controlling the flow of electrons at the lower level of abstraction), and those are further organized into high-level languages, and so on. It may feel like you understand it, but that’s mostly just because the parts you don’t understand, you’ve gotten used to not understanding. Thus it has always been, and so nobody ever notices when they’re living through what was the Singularity of a few decades previous.
Archaeology, particularly how we understand human evolution and diversity, as the OP suggested, is undergoing a rapid change in fundamental understanding, driven by the ability to retrieve genetic material from ever older and more degraded remains.
Our knowledge of human evolution over the past ~3 million years is still pretty rudimentary. The existence of the Denisovan population was established from a genetic fingerprint, fittingly recovered from a single finger bone.
Any science that relies on computer power … my first computer 35 years ago (a Sinclair ZX81) had an entire kilobyte of memory, and I could store programs to cassette tape … though I suppose computer science is more of a technological thing than the sciences being mentioned here …
My comment was intended to be restricted to just the software component of Numpy, but you make a fair point that the overall system of computation, “from soup to nuts”, is so complex that no one individual could claim expertise in all aspects sufficient to independently reproduce the system, or even describe it in functional detail at the working level.
However, while modern digital computation systems have many different levels of abstraction that allow you to define and query the function “y = lambda x: x*np.sin(x)” without having to think about any of the various operations occurring in software, firmware, operating-system instruction sets, and the electrodynamics of transistors (or all of the very complex manufacturing methods used to produce an integrated circuit), all of the various layers can be conceptually defined and linked in a way that can be comprehended and explained by a reasonably intelligent adult. It’s complicated and detailed, but straightforward.
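For the record, that throwaway one-liner is runnable as written once NumPy is imported; the point is how many layers sit beneath a line this short. The sample inputs below are just illustrative.

```python
import numpy as np

# The one-liner from the post: each call fans out into vectorized C loops,
# compiled machine code, the OS scheduler, and ultimately transistor physics.
y = lambda x: x * np.sin(x)

print(y(2.0))                          # scalar input: ~1.819
print(y(np.linspace(0.0, np.pi, 5)))   # the same line silently vectorizes over an array
```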
On the other hand, if you look at the field of systems biology and specifically at metabolic reactions, while the common pathways such as oxidative decarboxylation, glycolysis, and the citric acid (Krebs) cycle are comprehensible, the entire set of metabolic pathways (respiration, nutrient transport, signalling and feedback interactions, vitamin and cofactor metabolism, protein synthesis and protease mediation, lipid and fatty acid decomposition, et cetera) is so complex that not only is it not possible for even a very smart person to fully understand and reproduce all of it, we don’t even understand all of the complex interactions well enough to simulate the system in toto using existing computational capabilities and methods. By the point at which we can build a system that could literally simulate the operation of the metabolism at a cellular level, the system itself will “know” more than the humans querying it, in the sense of being able to establish the influence of the various interactions.
While the Large Hadron Collider (LHC) is a “disappointment” in the sense of not really getting further into the mechanics of what we assume are fundamental particles and interactions beyond what was already speculated, I think it is important to put it into context; the LHC is arguably the most sophisticated machine that humanity has ever constructed and has brought more physicists and engineers together toward a “single” goal than any project in history, yet what it is essentially doing to examine physical interactions between particles is the equivalent of doing geometry by catapulting rocks into each other and evaluating their composition by the color of the sparks and dust produced by the impact. It is an incredibly crude way of doing physics, which we have resorted to because we don’t have any way to operate directly upon particles via the strong and weak nuclear interactions. We’re limited to electromagnetic interactions and slamming particles into one another to more or less randomly produce different interactions, and then sorting through the detritus for the resulting statistical distribution of decay particles.
Supposedly a lab-grown hamburger made with cultured meat cost $325,000 in 2013, but the price was only $11 in 2015.
However, I don’t think the price has declined much in the last two years. I’m not sure if the price decline was due to new technologies or just economies of scale in existing techniques and technologies.
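Taking the two figures quoted above at face value (purely illustrative arithmetic on the numbers in the post, nothing more), the implied rate of decline is striking:

```python
# Implied rate of decline for cultured-meat cost, using the figures quoted above.
cost_2013 = 325_000.0   # USD per burger, 2013
cost_2015 = 11.0        # USD per burger, 2015 figure quoted above

total_factor = cost_2013 / cost_2015       # ~29,500x reduction over two years
annual_factor = total_factor ** 0.5        # geometric average: ~170x cheaper per year
print(f"Total: {total_factor:,.0f}x; per year: ~{annual_factor:.0f}x")
```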
One way of measuring the rate of advancement of a field is how fast textbooks get outdated.*
In my field, Computer Science, the churn rate for textbooks used to be phenomenal. A compiler textbook could be outdated in a year or two. Some computer hardware texts would be outdated by the time they hit the bookstores. For many undergrad books, things have settled down some. But a five-year-old book for many grad courses is long in the tooth.
* And I don’t mean the tweaking publishers do to create “new” editions of textbooks to try and hold back the used-text market.