Could 1960s scientists reverse-engineer a modern desktop?

Right, so are we to reject a piece of software based on prior bugs? You’ll forgive me for being obtuse, but this is the first time I’ve heard that Excel is unacceptable for scientific research. I have done research myself, and am friends with a PhD toxicologist and a renowned statistician, and all of us have published material that used Excel, SAS, and GIS software.

I’ll leave it to an engineer and/or a mathematician to respond to that. I was responding to your apparent surprise that your version of Excel did the right calculation.

Ah, it is tough to react to edits!

I fully understand that Excel may be prone to an occasional bug. However, is there research out there comparing the rate of error in Excel to that of other math/statistical programs? I’m not convinced by the argument that a discovered and fixed bug indicates a program is less reliable than another piece of software. Surely all software has some rate of failure?

To claim that Excel is unreliable for scientific research makes me think there is evidence of repeatable, common, and consistent errors. Not bugs.

Thanks, this is a great guide to what makes a good graph and I am saving it. I agree with many of the points made. However, it is possible to avoid many of those mistakes with a bit of care when designing the layout of a graph in Excel.

I feel at this point that I have derailed the thread from its purpose. In addition, I’m not really prepared to defend Excel and feel that I am venturing close to doing so. If anyone has comparative research I’d be extremely interested in seeing it. Thanks!

As I said in another thread, the ease of reverse engineering advanced tech is laughably overestimated. It took decades of consistent small improvements in manufacturing technology and numerous engineering breakthroughs to be able to manufacture your desktop. They might get ‘ideas’ from it, but they’d be no more capable of replicating it than the apes at the zoo.

It would be of almost no use to them except as a conceptual device, and if they started dicking with the machine, probing IC components and the like, the chances of them breaking something critical are pretty high.

Unraveling the hardware and software of something as massively and densely integrated and compiled as a modern PC without contemporary tools is effectively impossible. It would quite literally be almost like magic to them.

Re the ’60s: people smoked like chimneys in that era. It would be interesting to see whether they smoked around it until the tar from the smoke gummed up the fans and keyboard.

Indeed, our current machines have reached levels that some people back in the 1980s confidently predicted would be impossible.

See Clarke’s First Law.

It’s not just the graphs, though. Take a look at what they quoted from the help file on t-tests and see if you can find anything right in it.

Excel + SAS + GIS stuff is not the same thing as Excel.

This is a bit of a hijack, but it’s worth at least attempting to close off. I’m not going to claim that you should never do anything statistical in Excel–it’s fine for basic descriptive and inferential statistics, and simple graphics. However, for anything more complicated than that, there’s something better out there put together by people who really know what they’re doing.

I feel like that is moving the goalposts. SAS has some arcane and confusing bits, and I have no doubt I could find a SAS (or any software) tutorial that is awful. I am interested in whether correctly setting up a t-test in Excel returns an incorrect result.

So far no one here has demonstrated that Excel has a broader range of failure than other products. Instead, we have a single example of a bug that has since been fixed, and a dated overview complaining that the default settings of many Excel graphs are poorly suited to reporting science. As to the former, I am dubious that an old bug = unsuitable program. As to the latter, I have never used a graphing program that didn’t require tweaking to my own precise demands.

Why are those more trustworthy? What metric are you using to compare?

Microsoft Excel (and other spreadsheet programs) has a number of properties that make it undesirable for computationally intensive engineering or scientific processing. For one, its native handling of floating-point math is very poor. You can, of course, specify a variable type using VBA and control precision that way, but then you are no longer using just Excel. Its known problems with memory limits have been discussed at length elsewhere, but needless to say, for really large data sets it is a wholly inadequate tool. The way Excel generates plots is very quick, but they are not easy to edit or control consistently unless you get into VBA. Its native formatting of legends and text is very poor compared to Matlab or the Python matplotlib library, and it is not nearly as compact or flexible as R’s ggplot package.
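For what it’s worth, here is a minimal sketch of the scripted alternative being contrasted with Excel’s chart editor, assuming numpy and matplotlib are installed (the data and labels are invented for illustration). Every label, legend, and export setting lives in code, so the figure comes out identical every run with no hand-tweaking:

```python
# Minimal sketch (illustrative only) of scripted, repeatable plotting.
# Assumes numpy and matplotlib are installed; data and labels are made up.
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 10, 200)
y = np.exp(-0.3 * x) * np.sin(2 * x)   # made-up damped oscillation

fig, ax = plt.subplots(figsize=(6, 4))
ax.plot(x, y, label="damped oscillation")
ax.set_xlabel("time (s)")
ax.set_ylabel("amplitude")
ax.legend(frameon=False)
fig.tight_layout()
fig.savefig("plot.png", dpi=300)       # same output every run, no chart editor
```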

However, the worst thing about Excel is the way people tend to use it. That is, they attempt to develop it as a pseudo-programming tool. It starts with putting a few simple formulas in cells and copying the cell functions down. Then they start linking one calculating cell to another calculating cell. Soon you have a ginormous workbook with multiple sheets of interlinked cells that could have been a few hundred lines of easily documented S/R/SAS/Python code, but instead is an unintelligible, uncommented mush of overcooked spaghetti that no one else can untangle. And when there is an error, tracking back to find it is nearly impossible, because the cell references are rarely named or documented clearly. It permits, and even encourages, all of the bad practices that make software engineers throw their hands in the air when dealing with code written by electrical and mechanical engineers.
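To make the contrast concrete, here is a hypothetical sketch of what one of those interlinked-cell workbooks might reduce to as a few documented lines of Python/pandas (the file name and column names are made up for the example):

```python
# Hypothetical replacement for a chain of copied-down, interlinked cells.
# Assumes pandas is installed and a file "measurements.csv" with columns
# "raw" and "baseline" (all names invented for illustration).
import pandas as pd

df = pd.read_csv("measurements.csv")

# Each step has a name and can be inspected or unit-tested on its own,
# unlike an anonymous block of cell formulas spread across sheets.
df["corrected"] = df["raw"] - df["baseline"]
df["normalized"] = df["corrected"] / df["corrected"].max()

print(df["normalized"].describe())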

That being said, Excel has its place. If I need to handle a moderate-sized data set and do some simple transforms, it’s a decent tool. If I want to visualize data with a line or bar plot, it can be quick. To set up a matrix for a trade study or a one-to-one relationship table, it is just the thing, and way easier than using Nodebox or Visio. Basically, if you are performing some kind of bookkeeping task, it is fine…because that is what it is designed to do. But for doing any kind of complex calculation, it is bosh. I still wake up in sweats over the day a coworker came to me with her Excel spreadsheet in which she was attempting to perform some DSP operations and asked me to help “figure it out”. After a wasted hour of trying to work through iterative differential transforms like “D17*SQRT(A$121)+DIV(S479,J32)”, I had one of the guidance boys free up a seat of Matlab and showed her how to import data and load up the DSP module.

Next, I rail on people who attempt to use Microsoft Word for structured documents and desktop publishing…

Stranger

Make sure you say your name is John Titor.

Something sort of like this has actually happened - capacitor plague.

Does it have a development environment and documentation on it?

If it does, then I seriously doubt they’d ever try to take it apart. You’d get a huge team of people studying copies of the programming-language documentation, and the thing would be set up to run batch computations in a queue system, like mainframes were.

The most consistent problem I run into with Excel is with non-linear lines of best fit. Excel produces these by transforming the data so the model becomes linear (e.g. taking logs), then calculating a linear line of best fit. That is wrong: it weights the residuals incorrectly at different parts of the graph. The lines it calculates for me usually intersect the actual data at only two points, one at the beginning and one at the end. I get much more accurate results with simple dot-to-dot interpolation.
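For the curious, here is a minimal sketch of that effect, assuming NumPy and SciPy are available (the model and data are synthetic): fitting y = a·exp(b·x) by taking logs and doing a linear least-squares fit, roughly what an exponential trendline does, versus a direct nonlinear fit. The two weight the residuals differently, so the fitted curves can diverge noticeably when the scatter is large relative to the small y-values.

```python
# Sketch: log-linear trendline-style fit vs. direct nonlinear least squares.
# Assumes numpy and scipy are installed; data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
x = np.linspace(0, 5, 50)
y = 2.0 * np.exp(0.8 * x) + rng.normal(scale=5.0, size=x.size)
y = np.clip(y, 1e-6, None)              # keep values positive so log() is defined

# Trendline-style fit: ln(y) = ln(a) + b*x by ordinary least squares.
b_log, ln_a = np.polyfit(x, np.log(y), 1)
a_log = np.exp(ln_a)

# Direct nonlinear least squares on the untransformed data.
(a_nl, b_nl), _ = curve_fit(lambda t, a, b: a * np.exp(b * t), x, y, p0=(1.0, 1.0))

print(f"log-linear fit: a = {a_log:.3f}, b = {b_log:.3f}")
print(f"nonlinear fit:  a = {a_nl:.3f}, b = {b_nl:.3f}")
```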

I started a similar thread once on whether the people of 1983 could merely connect to my now five-year-old MacBook Pro.

The answer: maybe. Even since ’83, there has been enough change to make it difficult. And the latest model is head and shoulders above mine.

OK, real simple: during Apollo’s early years, the IC wasn’t really of interest. There was also that whole radiation-destroying-the-circuits thing going on, not to mention that germanium was the “biggie” at the time, not silicon as it is today. What was made of silicon could be examined visually, no need of a microscope unless one were nearing 50.
As for modern computers, their circuits would be barely visible to most electron microscopes of the time. I know, because a few of those models were donated to my high school.
Add to that the chemistry and quantum dynamics that didn’t exist back then. Nope, it’s a non-happener.
It’d be like a caveman learning how to forge stainless steel!
It’d be on the order of us trying to figure out a quantum supercomputer from its powered-down state, without its power source.
Better yet: like me handing you a current pocket calculator on a desert island and demanding total replication.

I love your use of a sexual metaphor. Was it intentional?

Preach it, brother!

Yes, but the process will take 40 years.

:slight_smile: