EVERYTHING You Do On Your PC......

If that’s the question, your wife wins.

If that’s the question, you win.

Given these additional parameters to your argument, I say your wife wins. You seem to be arguing that there is a permanent record of everything you have ever done on your computer. This is not true. At best, there are remnants of a portion of the things you have done. A file you deleted a year ago is likely completely gone now and nobody could get it back, not even the FBI or NSA.

I do not have any particular expertise in the matter, but I am fairly sure that even for the expensive, fancy equipment available to the FBI or NSA, reading data that has been overwritten more than once or twice is a very hit and miss affair.

I believe that these methods rely on slight inaccuracies in the positioning of the head of your hard drive, so that when it comes back to write a new 1 or 0 over the 1 or 0 that was written there before, it does not come back to exactly the same position as before, and writes the new bit at a position that overlaps, but does not completely cover, the tiny spot on the disc where the old bit was written. The machines used by law enforcement and national security agencies can read the small edge that did not get overwritten inasmuch as they can position their read head much more accurately than that of a normal disc drive, and step it in smaller increments.

However, this must be getting more and more difficult to do successfully as consumer disc drive technology improves, with drives packing ever more information into smaller areas of disc, which itself relies on ever more accurate head positioning. Also, each time the old data is overwritten, the chances that some small part of the original bit's area has escaped being overwritten, and so obliterated, decrease rapidly. One overwrite is very likely to leave a sliver of the original bit behind, but the chances are very good that that sliver will be overwritten the next time that part of the disc is written to (leaving, perhaps, a sliver of the first overwrite), and by the third overwrite the chances of anything of the original bit remaining must be very low. You sometimes see claims that data needs to be overwritten something like 9 times to be truly secure, but I am fairly confident that in reality, even with the very best equipment, you would need to be pretty darn lucky to read something that had suffered just 2 or 3 overwrites, and very lucky indeed to read something that had been overwritten 8 times.

Incidentally, you do not really need special applications to securely erase (i.e., multiply overwrite) data. Windows XP has a utility built in. If you open up a DOS box and enter cipher /w:xxxx at the prompt (where xxxx is the name of any directory immediately under your root directory - don’t ask me why you need this, but you do), it will overwrite all the “free space” (i.e., anywhere where files deleted from your recycle bin might live) three times: with zeros, then with ones, and then with random bytes. It will take a while if you have a lot of free space, but it should be pretty secure, although if you are really paranoid I guess you could always run it again to get 6, or 9, or whatever multiple of 3 overwrites you might want.
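For what it's worth, the basic trick is simple enough to sketch in a few lines of Python. This is a toy imitation of the idea behind cipher /w, not its actual implementation; the function name, the pass order, and the limit parameter are all my own invention for the demo:

```python
import os
import tempfile

def wipe_free_space(directory, chunk=1024 * 1024, limit=None):
    """Toy free-space wipe: fill the free space with a throwaway file,
    once per pass (zeros, ones, then random bytes), deleting the file
    after each pass. `limit` caps how much we write, for demo purposes;
    a real wipe would write until the disk is full."""
    for fill in (b"\x00", b"\xff", None):
        path = os.path.join(directory, "wipe.tmp")
        written = 0
        with open(path, "wb") as f:
            try:
                while limit is None or written < limit:
                    data = os.urandom(chunk) if fill is None else fill * chunk
                    f.write(data)
                    written += chunk
            except OSError:
                pass  # disk full: the free space has now been overwritten
            f.flush()
            os.fsync(f.fileno())
        os.remove(path)

# Demo on a temp directory, writing only 2 MB per pass:
wipe_free_space(tempfile.mkdtemp(), limit=2 * 1024 * 1024)
```

Note the random bytes come from os.urandom here; as discussed further down, a real tool almost certainly uses a pseudo-random source, and it makes no practical difference.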

Put it this way: The Defense Department rules say that, once a hard drive has classified data on it, that drive can never be sanitized. It must be physically destroyed, to fairly demanding specs of “destroyed”, when it is retired from service.

Do you think they would go to this trouble if overwriting data X number of times would get rid of it?

Sure, it is their job to be paranoid (and they have to take into account the remote possibility that the “enemy” might have invented some sort of data recovery technology that we have yet to conceive of). We are still living with the residue of the Cold War mentality with this sort of stuff, where screwing up certain national security issues might really have meant the end of the world. A few overwrites will give you 99.99% certainty of data destruction, but those guys want a full 100%.

As far as your browser is concerned, those are just webpages, and webpages get saved (at least temporarily) to the cache. They might not stay there long, but they certainly touch your hard drive.

A number of you have noted that the more sophisticated file “shredding” programs overwrite the files-to-be-deleted with random data (I presume that means essentially overwriting them with random sequences of 0’s and 1’s.)

My question, then, is what is the ‘source’ of those programs’ randomness? Do they use truly random data, or pseudo-random data? And, if the latter, does that mean that the original data may be recoverable?

What about the new solid state drives on the market that use non-volatile RAM chips?

Not just “sophisticated” ones. The cipher /w command built into the XP command prompt does this too, on its third pass.

I do not have direct knowledge of this, but I am confident that it is pseudo-random. True randomness would be very hard to implement and quite pointless.
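To illustrate the difference (a minimal Python sketch, nothing to do with any particular shredding tool): a seeded pseudo-random generator is completely determined by its seed, so anyone who knew the seed could regenerate the exact overwrite pattern, whereas the operating system's entropy source is not reproducible:

```python
import random
import os

# A seeded PRNG produces the same "random" bytes every time -- knowing
# the seed would, in principle, let you reconstruct the overwrite pattern:
a = random.Random(42).randbytes(16)
b = random.Random(42).randbytes(16)
assert a == b  # pseudo-random: fully determined by the seed

# os.urandom pulls from the OS entropy pool instead, so two draws
# should never match:
assert os.urandom(16) != os.urandom(16)
```

That said, as the next post argues, even a fully predictable overwrite pattern does not make the *original* data any easier to recover.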

I do not think it makes any difference whatsoever. To make the data less likely to be recoverable, you just overwrite it more times; it does not matter what with. The randomness is not to remove the data better, it is to make it less obvious that data has been removed. The easiest and quickest way to remove the data is to overwrite it with all zeros, and then with all ones (or vice-versa). The trouble with that is that if anyone does examine your disc and discovers that all the unused areas are filled with all zeros or all ones, it is going to be pretty obvious that you erased something, and thus that you have something to hide. Writing (pseudo)random data on the final pass makes it a lot less obvious. At least to a casual inspection, it is just going to look like the sort of innocent debris of old deleted files that you would find on any well-used hard disc.
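The three-pass pattern described above can be sketched at the level of a single file, too. This is my own toy code, not the internals of any real shredder; the function name and the pass order are just illustrative:

```python
import os

def shred_file(path):
    """Toy three-pass shredder: overwrite the file's bytes in place with
    zeros, then ones, then pseudo-random data, then delete it. The random
    final pass is there so the freed space does not obviously look wiped."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for fill in (b"\x00" * size, b"\xff" * size, os.urandom(size)):
            f.seek(0)
            f.write(fill)
            f.flush()
            os.fsync(f.fileno())  # push each pass out to the disc
    os.remove(path)
```

One real-world caveat this sketch ignores: modern filesystems and drive controllers may not write back to the same physical location, so overwriting "in place" at the file level is not a guarantee that the old sectors were touched.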

Interesting question. I am fairly sure that you would still need to overwrite the data to truly remove it, but just one layer of overwriting would probably be enough to make it totally unrecoverable.

The way it was explained to me, as an analogy (I am not claiming this is the exact process): a spot on a hard drive has a charge value from 0 to 1. Anything over .8 reads as a one, anything under .2 reads as a zero.

The software looks at the drive and sees that most ones are .9 for example, and that a .95 is probably a 1 overwritten with another 1 or a .85 is probably a one written over a zero.

Same thing on the low end.

In this way you can extrapolate what was most likely there before a first overwrite. By using finer charge variations you can theoretically dig deeper, but you introduce a greater probability of being wrong, because knowing what that sector held 2, 3, or 4 overwrites ago is much harder to figure out. You might know that three ones and a zero were likely written there, but was it 1-1-0-1 or 1-0-1-1? Harder to tell, and it requires a lot more computer power to run through all the possibilities looking for a coherent file.
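Following that analogy (and only the analogy: the .9/.1 targets and the idea that a write leaves a small residual offset are assumptions lifted straight from the numbers above, not how real drives behave), a toy model of one level of recovery might look like:

```python
def infer_previous_bit(level, current_bit):
    """Toy model of the analogy above: a fresh 1 writes a level of ~0.9
    and a fresh 0 writes ~0.1, but the write doesn't fully erase what
    was there before. The small leftover offset from the target level
    hints at the previous bit: e.g. 0.95 is probably a 1 written over a
    1, while 0.85 is probably a 1 written over a 0."""
    target = 0.9 if current_bit == 1 else 0.1
    residue = level - target           # what the write failed to erase
    return 1 if residue > 0 else 0     # positive residue => old bit was 1

assert infer_previous_bit(0.95, 1) == 1  # 1 over 1
assert infer_previous_bit(0.85, 1) == 0  # 1 over 0
```

Digging two or three overwrites deep would mean splitting that residue into ever-finer offsets, which is where the exploding uncertainty (and the combinatorial search for a coherent file) comes from.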

For your secure computer data disposal enjoyment.

The NIST has another view, as you can verify by chasing down this article’s links:

The DoD is a little more paranoid, but the DoD is apparently living in a world where destroying a drive is safe for the person doing it. If they were assuming a world where destroying a drive was going to expose the person to Definitely Not Torture, their tune would change.

I believe they organise themselves internally in a similar way to magnetic storage - that is, they delete a file by removing its information from the allocation table (or equivalent), but don’t overwrite the contents until the space is required by something else. Even ordinary flash removable storage such as SD card works this way.
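A toy model of that delete-by-unlinking behaviour (purely illustrative, with made-up class and method names, not how any real filesystem or flash controller is organised):

```python
class ToyFlash:
    """Tiny model of delete-by-unlinking: deleting a file just drops its
    entry from the allocation table; the block contents stay put until
    something else claims the space."""
    def __init__(self):
        self.table = {}    # filename -> block index
        self.blocks = []   # raw block contents

    def write(self, name, data):
        self.table[name] = len(self.blocks)
        self.blocks.append(data)

    def delete(self, name):
        del self.table[name]   # only the table entry goes away

fs = ToyFlash()
fs.write("secret.txt", b"password123")
fs.delete("secret.txt")
assert "secret.txt" not in fs.table   # the file looks gone...
assert b"password123" in fs.blocks    # ...but the bytes are still there
```

This is exactly why "undelete" tools work at all, on flash or magnetic media alike.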

With dedicated flash disks designed to replace hard drives, I believe there is some kind of optimising algorithm in there that tries to ensure that all parts of the flash memory get used evenly (to stave off the limited write-cycle life that flash memory has) - this might actually mean that a deleted file on a flash hard drive remains recoverable (by normal means) for longer than it might have done on a magnetic hard drive.

Unlike magnetic media, I don’t think that flash memory internally retains any kind of residual traces of previous states that could be recovered by the advanced methods discussed in previous posts though.

I think this just means they’ve implemented a procedure they can be confident will always be more than adequate - it doesn’t actually mean that data on an intensively-overwritten hard drive is or ever will be recoverable - it’s just a deliberately-excessive action that can’t fail.

By analogy - there’s a fly buzzing around the room - spraying a can of Raid at it will kill it - and is perfectly reliable for any reasonable purpose, but if you want to be absolutely, 110% sure, you could take off and nuke it from orbit.

And they may get paged out of memory to the disk as well, though this is less likely with today’s huge memories.

On a related topic, there has also been some research done with extracting data from volatile RAM. The idea is that when powered down, the state of the bits in RAM decay somewhat slowly and if the RAM modules are removed from the target computer fast enough, information like passwords can be read out. Spraying the modules with a freeze spray or can of air can extend this time.

This is a very marginal technique: It requires, as you said, being right there when the machine is powered down, before the RAM modules have time to lose enough charge to erase the data, and being able to get the modules into another computer capable of reading from them without wiping them itself.

Finally, if the program overwrites its sensitive data in RAM even once, the technique is not going to work.