Archaeology is Unethical

Over the last few centuries, as humanity has grown interested in exploring its own history through archaeology, we have destroyed an incalculable quantity of data about that history, solely so that we could manipulate artifacts for analysis, on the assumption that doing so is the only way to learn anything.

While that assumption was justified in times past, it no longer is, and that has been clear for some decades now. It is true that, for the most part, we still need physical access to historical artifacts in order to get information out of them. But sensor technologies will clearly advance, within the next few decades, to the point where fully non-destructive analysis is available. Knowing that, there is no excuse for continuing our current digs and physical manipulation of artifacts except impatience.

It may well be that we have to give up all or most archaeology for a generation or two, but that’s as it should be. If we have reason to believe that a particular location contains, for example, samples of a plague that we need to protect ourselves against, or we know that a location will be destroyed by construction, a river, etc., then it is fair to dig and extract, on a fast schedule, the bare minimum of what needs to be recovered. But outside of that, history is a curiosity, not a need. It can wait 50 years while we develop scanners that can tell us every molecule and its high-resolution 3D coordinates. Going in and getting stuff without that technology, knowing that the technology will come, is just willfully destroying history. And while history may be just a curiosity, impatience is still not a viable excuse for destroying most of it.

We should have laws that ban destructive archaeology, except in case of emergency (as noted), and we should have test structures and materials constructed and layered under different materials, to act as a testing ground for sensor technology. Test archaeology should be the limit of most of our work for the next few decades.

I disagree with the premise of the thread.

About a decade ago I went to some old Roman and Neolithic sites. I was told by the rangers that they don’t do digs for just the reasons in the OP. They use ground penetrating radar to find things and then … go away. There is another problem though. Unscrupulous people will dig at night to get items to sell on the black market. The rangers have to maintain secrecy and keep watch or they’ll come by one morning and find holes. This is also a problem with old Civil War battlefields, I have read.

We have top men working on it right now. TOP men.

I sort of see the OP’s point, but there comes a period of time where learning about these past civilizations outweighs the respect due to the dead or to the long-past civilization. I mean, if 30 years from now someone desecrates my grandmother’s grave, that is bad, but if someone does it in the year 4125, that seems to me qualitatively different.

I mean, there is finite space on the earth. We don’t get to have our ruins possess property until the sun goes supernova.

The Society for American Archaeology has what they call Principles of Archaeological Ethics. Number one is Stewardship.

Exactly what benefit is there to leaving artifacts buried?

So we can dig up your grandma?

This is completely false, to the point of being outright fiction. I would not even place a probability on this occurring, nor would it obsolete existing archaeology even if it did occur, because actual real-world systems, unlike the fantasy technology shown in Star Trek and the like, have serious, practical physical limits.

I feel like neither of you read the OP.

The OP is about data destruction through using tools on and manhandling ancient artifacts. I did not mention nor allude to desecration anywhere in the OP, and I do not know how you arrived at that discussion.

But if data is all that’s important, once you’ve extracted all the data, why not destroy the source?

Have you?

Can you prove that you have extracted all of the information that could be gleaned, in all respects, for all fields of knowledge, and destroyed none of it?

If you can say that you have an atomic-level 3D scan of the entire subsurface of the Earth, 100 feet deep, and 20 square miles, then sure you might reasonably say that “It doesn’t get any better than this. Feel free to tear up the area and build a parking garage.”

But today, there are always new techniques coming out: new scanning technologies, new digging techniques, new cataloguing techniques, new preservation techniques. There are always new people wanting access to the materials for new reasons: recreating recipes, recreating the climate, recreating the history of disease, recreating construction techniques, etc. Just by digging something out of the ground and brushing it off…well, wait, what was in that dirt you just brushed off? Maybe there was powdered organic matter from millennia ago in it. Maybe there were striations or density ripples in how the dirt was packed that could have told us something about how the walls fell, and so where there might have been organic supports and other structures that were part of the original building.

The assumption that because we can’t detect it now, it doesn’t exist, is clearly wrong. We assumed for decades that we would never be able to detect an exoplanet - and then suddenly we gained fine enough granularity in our measurements, and sufficiently advanced math, that we’re not just detecting exoplanets but even measuring how dense they are and what sort of atmosphere they might have.

While we don’t know what the upper bound is for sensor technology and reconstruction of ancient data from the output of those sensors, we do know that we’re not at that point and that we’re permanently losing the original record of where the items were, in situ, and what the makeup was of the materials surrounding them. And there’s good reason to believe that later generations would be able to make more meaningful investigations, had we not removed that opportunity.

Once we know what the upper bounds are - whether that’s atom-level detection from 2 miles away, through solid rock, or just barely more advanced than we have now - we can make a determination of whether it makes sense to dig away or hang back and make periodic scans as sensor technology increases resolution. For the moment, the latter is clearly the right solution though.

Strongly disagree. This is tech that’s in the realm of sci-fantasy. There are so many tests you just can’t do remotely.

If we can learn about things now, we should, rather than hold out for some magic tech that may never materialise (I mean, scan for every molecule? That’s ridiculous when it’s used in Star Trek, and you think it’ll be here *in 50 years*?)

And none of your later examples are what you’re talking about there. They had to get the manuscript to scan it, for instance. And the rest are great techniques for finding artefacts (well, not so much the core thing; that’s not even archaeology and of little use there). Not so much for doing essential analysis like dating, or the kind of isotope analyses that let you tell from their teeth that someone wasn’t native to a particular locale…

Exactly. You do the job with the tools you have; you don’t sit on your thumbs for decades or centuries until better tools are invented.

Beyond that, the purpose of archaeology is not just to accumulate knowledge but also to disseminate it, by making it available to the public. Archaeology, at its best, can educate, inspire and bring pleasure to laymen exposed to it. Let me put it bluntly: I live in a country blessed with a greater variety of archaeological sites and artifacts than almost anywhere in the world. I love visiting these sites, and I love looking at artifacts in museums. If the archaeological establishment decides to deny me these pleasures, then I want none of my tax shekels going to support them.

Archaeological discoveries have, time and again, changed our perception of history; that in turn means changing our perception of ourselves, both as individual peoples and as humanity as a whole. The idea that we should stop doing any science, whether archaeology or any other, until we have a perfect way of doing it has two serious problems: on one hand, perfect according to whom? And on the other, how is the science itself going to advance if it’s not used? A science is a living body of knowledge: freezing it does not preserve it, but kills it.

waves to Alessan from another country with more archaeological sites than you can shake a stick at

The tech of 50 years from now will be fantasy to us, just as ours is to the people of 50 years ago.

Even bad sensor technology is sufficient for very good data, because the solution to noisy data is just more samples. If you take enough samples with poor sensors, and apply some math, you will get a high-fidelity image. That’s not science fiction, that’s just the mathematics of signal detection.
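That “just take more samples” claim can be sketched numerically. The signal, noise level, and sample counts below are all invented for illustration; the point is simply that the residual error of an average shrinks roughly as 1/√N, so 100× the samples buys about 10× less noise:

```python
import numpy as np

rng = np.random.default_rng(0)

# A made-up underlying "signal" standing in for whatever a real sensor measures.
signal = np.sin(np.linspace(0, 2 * np.pi, 100))

def residual_noise(n_samples):
    """Average n noisy readings of the signal; return the RMS error of the average."""
    noisy = signal + rng.normal(0.0, 1.0, size=(n_samples, signal.size))
    averaged = noisy.mean(axis=0)
    return np.sqrt(np.mean((averaged - signal) ** 2))

# Residual error shrinks roughly as 1/sqrt(n): 100x the samples, ~10x less noise.
err_small = residual_noise(100)
err_large = residual_noise(10_000)
print(err_small, err_large)  # err_large is roughly err_small / 10
```

This is ordinary signal averaging: the noise is independent between samples, so it partially cancels, while the signal adds coherently.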

The human brain, currently, does a poor job of reconstructing 3D data - like ground penetrating radar - and there are two main factors that account for that. The first is that our visualization technologies are horrible. They’re 2D, not 3D. When we look at a display of ground penetrating radar, we’re seeing a narrow slice of the Earth - with a noisy signal - rather than seeing the whole thing in 3D with some math applied to cluster layers and remove noise. It looks like nonsense, because it’s a nonsense way of visualizing the data. Secondly, our brains don’t operate on 3D material composition maps. Our vision system sees the surface shells of 3D objects and interprets the world on that basis. Infinitely thin 3D polygons wrapped around hollow cores are sufficient to represent how we view the world. So even with proper visualization technology, we’re poorly equipped to reconstruct an image from 3D sensor data.

But human brains are no longer the standard for what can be done. Computer intelligence would be very capable of clustering 3D composition maps and allowing us to refine and remove virtual layers in bulk, to reveal the shapes of things that we are interested in - pots, columns, etc. VR goggles and augmented reality are letting us start to visualize 3D data in actual 3D.
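A toy version of that “remove virtual layers in bulk” idea can be sketched in a few lines of NumPy. Everything here is synthetic - the volume size, noise level, threshold, and the buried “pot” are all invented for the example - but it shows how thresholding a 3D composition map strips away the background and reveals where the dense object sits:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "composition map": a 40x40x40 volume of soil-like background noise
# with one denser spherical inclusion (a stand-in for a buried pot).
vol = rng.normal(0.0, 0.1, size=(40, 40, 40))
zz, yy, xx = np.mgrid[0:40, 0:40, 0:40]
inclusion = ((zz - 25) ** 2 + (yy - 12) ** 2 + (xx - 30) ** 2) < 5 ** 2
vol[inclusion] += 1.0

# "Remove virtual layers in bulk": keep only voxels well above the background.
mask = vol > 0.5

# The centroid of the surviving voxels approximates the inclusion's true centre.
coords = np.argwhere(mask)
centroid = coords.mean(axis=0)
print(centroid)  # roughly [25, 12, 30]
```

Real ground-penetrating-radar reconstruction is of course far messier than a single global threshold, but the basic move - operating on the whole volume at once rather than eyeballing 2D slices - is the same.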

None of this is science fiction. It’s all stuff we have today, from which we can infer where the technology will be in 5 years - and we’re already looking pretty damn good for having some reasonable first stabs at reconstructing basic objects through ground penetrating radar.

Whether we can get down to detecting a single molecule through 20 feet of earth and stone, I don’t know. But we have high-speed footage of a pulse of light passing through water. What did it take to achieve that? Lots and lots of samples. If you take a billion samples, even from a poor-quality sensor, you can reconstruct down to just about any degree of precision you desire. You don’t really need higher-quality sensors - though they help, for sure - you just need to keep sampling.
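The sensors-versus-samples trade can be made concrete: the standard error of a mean is σ/√n, so hitting a target precision with a noisy sensor just takes n = (σ/target)² samples. A small sketch with invented numbers (the “true value”, sensor noise, and target are all made up):

```python
import numpy as np

rng = np.random.default_rng(2)

true_value = 3.7       # the quantity we want to measure (made up)
sensor_sigma = 5.0     # a deliberately poor sensor: huge per-reading noise

# Standard error of the mean is sigma / sqrt(n), so a target precision needs
# n = (sigma / target)^2 samples - more samples substitute for a better sensor.
target = 0.01
n = round((sensor_sigma / target) ** 2)  # 250,000 samples

readings = true_value + rng.normal(0.0, sensor_sigma, size=n)
estimate = readings.mean()
print(n, estimate)  # estimate lands within a few hundredths of 3.7
```

The quadratic cost is the catch: each extra digit of precision multiplies the sample count by 100, which is why “a billion samples” buys a lot but not arbitrarily much in practice.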

In fact, thinking about it, the En-Gedi scroll example counters your own argument - it was only because it was excavated and preserved for further study that the scanning tech could be used on it in the first place - the very process you think shouldn’t happen. It *wasn’t* destroyed. Who knows what might have happened to it had it been left in place, though.

I have not suggested perfection. I have suggested a moratorium on destructive techniques. That will set us back, but it also would set us back if we now wanted to go discover a new continent of people, because we would have to figure out ways to vaccinate them before we arrived, rather than just rowing over and sneezing on them, and saying, “Eh, someone else’s problem.”