Could copying info from a database crash a computer?

This is a legitimate question, not a bash. Is it possible?

http://ap.tbo.com/ap/breaking/MGBDP9DE2WD.html

Anything’s possible.

But in my expert opinion, it’s a load of shit.

There’s nothing about this request that suggests that the search criteria would place a heavier load on the database than any other typical request. Regardless, the suggestion that a massive search could actually cause the loss of data is pretty suspect. Are we to believe this system is so poorly designed that a massive search would cause data to be overwritten or corrupted? Or that no backups exist?

Then again, this is the government we’re talking about, here.

From the same article cited by reeder:

My, what convenient timing! Are they sure the copies won’t be available by, oh, I don’t know, say, November 3?

My reaction is pretty much the same as friedo’s - sounds like a bald-faced lie, to me. The system on which these records are stored may be unstable, and may crash when it’s driven hard, but the idea that this would result in a permanent loss of the information is patent nonsense.

What’s next:

The ongoing attempt to do government as it’s really supposed to be done — which is something we the American citizens are always in the process of figuring out.

I don’t know that other nations’ citizens don’t have a similar sense of participatory responsibility and idealistic purpose with regard to their own nation and government, but I do tend to think that this sense of collective self is a big part of what it means to be an American.

Aww, dammit, wrong thread.

Substitute this instead, please:


• They don’t have a backup?

• They couldn’t do the search from the backup copy, for that matter?

Maybe the info is stored on Windows machines?

The GQ answer is: Yes. It is possible.

How likely it is depends on a host of variables, none of which are touched on in the article. What DB are they running? What hardware are they running it on? What is their DR (disaster recovery) solution? (Since they could just restore it to a spare and run the search on that.) Et cetera.

What I think is being said in the article is that if a large search were started, it could crash the system. And whenever a database crashes, there is a risk of data loss. Not a certainty, mind you. Again, depending on the variables, that sounds plausible enough to me.

I have gone from Exchange Admin to Oracle Admin and back to Exchange Admin (all without changing cubicles), and I would venture to say that if it is an Oracle DB, then the chances of data loss approach zero, assuming rampaging chimps aren’t allowed near the server. If it is a Microsoft DB (MSSQL, I am guessing. God save us all if the gov’t is using Access as a database), then I would certainly be concerned about loss from a crash.

Shrug

It’s possible, but a good indication of a very poorly maintained and overloaded system.

A massive search or a large import could require huge temporary files to be generated. Just finding somewhere to put those files may be enough to overload the system or its data storage. Or the process could max out the available memory. Either could cause a crash. And the worse shape the database is in, the greater the chance that the crash corrupts the data or makes it impossible to restart the database without corrupting it.
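For the curious, here’s a rough Python sketch of that memory failure mode (every name below is invented for illustration, not taken from the article). The dangerous pattern is the commented-out one, which materializes the entire result set at once; the streaming version touches the same rows but never holds more than one at a time.

```python
import sqlite3

# All names here (voter_records, the columns) are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE voter_records (id INTEGER, name TEXT)")
conn.executemany(
    "INSERT INTO voter_records VALUES (?, ?)",
    ((i, f"voter {i}") for i in range(100_000)),
)

cur = conn.cursor()

# Risky: fetchall() materializes every matching row in memory at once.
# Against millions of rows this can exhaust RAM, and on a fragile system
# an out-of-memory failure is exactly the crash described above.
# rows = cur.execute("SELECT * FROM voter_records").fetchall()

# Safer: iterate the cursor so rows stream through a small buffer instead
# of piling up in one giant list.
count = 0
for row in cur.execute("SELECT * FROM voter_records"):
    count += 1  # stand-in for real per-row work (export, copy, redact...)

print(f"streamed {count} rows without holding them all in memory")
conn.close()
```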

The fact they’re so concerned about this suggests the situation is so bad that either they can’t create a backup for the same reason (the backup process itself causing a crash), or they have no faith at all in the quality of their backups. It sounds to me like the most likely answer is that they have an ancient, creaking system that no one entirely understands, and no one is willing to say they could put it back together if it breaks.

This is pretty embarrassing. Basically they’re saying their database sucks and is held together with gum, string and sacrifices to the database god. So either it’s true, or they’ll admit anything rather than provide the data.

I agree that this is most likely the case. Many government offices and agencies are probably still using hardware and software from the ’80s. The hardware may be obsolete and irreplaceable, and the database software may be incompatible with newer software, making an export impossible. They’re probably in the process of transferring the database information manually. The article says:

It also states that the system “was not designed for mass export of all stored images” and said the system experiences “substantial problems.”

Another reason the office may be stalling is that the current system has no built-in way of excluding and/or redacting information, and the volume of information that would have to be redacted manually is absurd. The new system can probably handle such things better.

It certainly is possible that a huge array of queries, or one satanic query, could tax a creaky old system into overload. I’ve worked with utility databases containing billions or even trillions of records, where the operators told me tales of horror about how they carefully filter each query out of fear of crashing the system, and how their backup capabilities aren’t up to backing up the entire database easily.

Many large government systems are running on baling wire and a prayer, so to speak, and anyone who says running a query has no possibility of crashing a system, or even causing permanent data loss, is an idiot or else talking out of their ass, because it can and does happen. We have a whole team of people at work who make a tidy amount of money consulting for people whose databases crash for a variety of reasons. Even this Board has had posts and entire threads disappear when it gets overloaded.
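Here’s roughly what that kind of careful query filtering looks like in practice; a minimal Python/SQLite sketch, every name invented, that caps the row count and aborts anything that runs too long:

```python
import sqlite3
import threading

# Every name here is invented. The idea: never let an unbounded query loose
# on a fragile system; cap the rows and abort anything that runs too long.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE meter_readings (meter_id INTEGER, kwh REAL)")
conn.executemany(
    "INSERT INTO meter_readings VALUES (?, ?)",
    ((i % 1000, float(i)) for i in range(50_000)),
)

def guarded_query(sql, params=(), max_rows=10_000, timeout_s=5.0):
    """Run a query with a hard row cap and a wall-clock timeout."""
    # sqlite3's interrupt() may be called from another thread to abort
    # whatever the connection is currently executing.
    timer = threading.Timer(timeout_s, conn.interrupt)
    timer.start()
    try:
        cur = conn.execute(f"{sql} LIMIT {max_rows}", params)
        return cur.fetchall()
    except sqlite3.OperationalError as exc:
        raise RuntimeError(f"query aborted: {exc}") from exc
    finally:
        timer.cancel()

rows = guarded_query("SELECT * FROM meter_readings WHERE meter_id = ?", (42,))
print(f"got {len(rows)} rows back, capped and time-limited")
```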

It depends.

It depends on how the database management software is written. It has been a personal mantra of mine that data must never be treated internally as code, or be used to modify code. If that programming principle is adhered to, nothing in the data can alter program flow in unintended ways, although massive amounts of data could still exceed design limits or push execution times to unreasonable values.
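The classic database-world violation of that principle is splicing user-supplied data directly into SQL text, where it stops being data and starts rewriting the query. A minimal sketch, with a hypothetical table and input:

```python
import sqlite3

# Hypothetical table and input, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE voters (name TEXT)")
conn.execute("INSERT INTO voters VALUES ('Smith')")

user_input = "Smith' OR '1'='1"  # data trying to behave like code

# Bad: the input is spliced into the statement text, so it rewrites the
# query's logic -- the data has become code.
bad = conn.execute(
    "SELECT * FROM voters WHERE name = '" + user_input + "'"
).fetchall()

# Good: a parameter placeholder keeps the input as pure data, so it can
# never alter program flow, only match (or fail to match) values.
good = conn.execute(
    "SELECT * FROM voters WHERE name = ?", (user_input,)
).fetchall()

print(len(bad), len(good))  # 1 0 -- the spliced version matched everything
```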

It is also a good principle to design data handling routines to never assume that the input data is clean. For example, even if the expected value of a field is 0…5, always have a trap for when it comes up 6 or -2. Because, sometime in the lifetime of the program, it will, and spitting out a descriptive error message is a lot better than blowing up the computer.
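Something like this, as a quick sketch with an invented status field:

```python
# Invented status field with an expected range of 0..5.
STATUS_NAMES = {0: "new", 1: "open", 2: "pending",
                3: "on hold", 4: "closed", 5: "purged"}

def decode_status(value):
    """Map a status code to a name, refusing anything out of range."""
    if value not in STATUS_NAMES:
        # A descriptive error beats silent corruption or a crash downstream.
        raise ValueError(f"status code {value!r} outside expected range 0..5")
    return STATUS_NAMES[value]

print(decode_status(3))  # "on hold"
try:
    decode_status(6)     # the value that will, someday, show up
except ValueError as exc:
    print(exc)
```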

If government programs are anything like government cheese, beware. That said, I do think the government spokespeople are blowing it out their ass.