I don’t know if this is true for everyone, but if I put a full data CD into my computer and copy all the files to my hard disk, it takes a bit longer than it then takes to burn a copy of that CD to a new CD. I’m using Roxio software, but I doubt that’s the issue. I could see some software taking longer to burn, but I don’t see how any burning software should be able to beat the native read for speed.
This is Windows XP if that matters. I’ve let the software pick its own burn speed, and as far as I can see there is no way to adjust the read speed.
I do not believe this is a bad CD that takes extra time to read, because this happens repeatedly, and the CDs are burned with the same CD drive I’m using for the subsequent read and write. I suppose a badly aligned drive might burn a CD that it later had trouble reading itself, but I just don’t know.
One possibility I can think of: if your hard disk is heavily fragmented, the read/write head could spend a lot of time jumping around finding empty blocks. When the CD is burned, the data goes down as one continuous stripe.
There’s a lot more “housekeeping” activity going on when you copy files. For each file that is transferred, the system has to create an entry in the file system with the name and all of the file properties, open the new file, transfer the data from the file on the CD to the disk (possibly scattered across multiple sectors if it’s a large file and the disk is fragmented), then set all the proper attributes, etc. All of this causes the heads on the disk drive to move to various sectors of the disk. Although it seems “instantaneous” with only a few files, the extra disk seeks add up.
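That per-file housekeeping can be demonstrated with a quick sketch (the file names and sizes here are invented for the demo, not taken from anyone’s actual setup): copy the same total number of bytes once as 200 small files and once as a single large file. Each small file costs a directory entry, an open/close, and attribute bookkeeping on top of the data transfer.

```python
import os, shutil, tempfile, time

def timed_copytree(src, dst):
    """Copy a directory tree and return the elapsed time in seconds."""
    t0 = time.perf_counter()
    shutil.copytree(src, dst)
    return time.perf_counter() - t0

with tempfile.TemporaryDirectory() as work:
    many = os.path.join(work, "many")
    one = os.path.join(work, "one")
    os.makedirs(many)
    os.makedirs(one)

    # 200 small files of 4 KB each...
    for i in range(200):
        with open(os.path.join(many, f"f{i:03d}.bin"), "wb") as f:
            f.write(os.urandom(4096))

    # ...versus one file with the same total size (800 KB)
    with open(os.path.join(one, "big.bin"), "wb") as f:
        f.write(os.urandom(200 * 4096))

    t_many = timed_copytree(many, os.path.join(work, "many_copy"))
    t_one = timed_copytree(one, os.path.join(work, "one_copy"))

    n_copied = len(os.listdir(os.path.join(work, "many_copy")))
    big_size = os.path.getsize(os.path.join(work, "one_copy", "big.bin"))
    print(f"200 small files: {t_many:.4f}s  one big file: {t_one:.4f}s")
```

On most systems the many-small-files copy comes out slower even though the byte count is identical, which is the overhead being described.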
When you copy the CD, the entire CD image is copied to the hard drive and then to the other CD without all of the intervening file system overhead.
I understand error checking (though I’d hope the CD burn would check for errors).
I understand it takes some time to create the file structure, though doesn’t the computer have to create a file structure on the CD as well?
I don’t understand why hard disk fragmentation would matter at all. If the HD is fragmented and it has to skip all over to find blank spots to store the data, it would have to skip all over to read that data back when creating the CD copy. I’m measuring CD write time from when I click Go, so the time spent accessing the data would count. Or does the computer actually start reading my hard disk as I’m building the CD description and put the data into memory? (Though it’s simply one master folder I use in the description.)
Or is it possible the computer is “smart” enough to realize, “Hey, I just copied those files to the hard disk; all of it is still in memory right over here. I don’t need to go get it”?
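For what it’s worth, that is roughly what an OS file cache does. A small sketch (generic, not specific to XP or Roxio): write a file, then read it back twice. The second read is typically served from the cache in RAM rather than the physical disk, though exact timings vary by system, so the sketch only prints them.

```python
import os, tempfile, time

# Create a 1 MB scratch file to stand in for freshly copied data.
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "wb") as f:
    f.write(os.urandom(1 << 20))

# First read: may touch the disk (or the cache, since we just wrote it).
t0 = time.perf_counter()
with open(path, "rb") as f:
    first = f.read()
t_first = time.perf_counter() - t0

# Second read: almost always served from the OS file cache in RAM.
t0 = time.perf_counter()
with open(path, "rb") as f:
    second = f.read()
t_second = time.perf_counter() - t0

same = first == second
os.remove(path)
print(f"first read {t_first:.6f}s, second read {t_second:.6f}s, identical={same}")
```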
The difference is that when you burn a CD from another one, the system doesn’t care about the file structure. It simply makes an exact copy of the data bits on the first CD to the same locations on the second CD. No worrying about files, folders, or anything else. That information is transferred as part of the data stream.
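A toy model of “copy the data bits” might look like this (in-memory buffers stand in for the two drives; 2048 bytes is the user-data payload of a CD-ROM Mode 1 sector): raw sectors move from source to target with no notion of files or folders at all — the file system just rides along as part of the data.

```python
import hashlib, io, os

SECTOR = 2048  # user-data payload of a CD-ROM Mode 1 sector

source = io.BytesIO(os.urandom(SECTOR * 100))  # stand-in for the source disc
target = io.BytesIO()                          # stand-in for the blank disc

while True:
    block = source.read(SECTOR)
    if not block:
        break
    target.write(block)  # bit-for-bit; no file system interpretation

identical = (hashlib.sha256(source.getvalue()).hexdigest()
             == hashlib.sha256(target.getvalue()).hexdigest())
print("images identical:", identical)
```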
But that’s not what I do. I’m making 3-4 copies of a CD. So first I copy the CD to my hard drive. Then I use Roxio to create the copies. So I don’t see it being any different from just copying 600+ MB from my hard disk.
Download a copy of ImgBurn and try that to rule out Roxio as the problem. I stopped using Roxio several editions ago - same with Nero. Not sure what either product is like these days. ImgBurn is pretty close to being an industry standard for freeware. It handles any non-copy-protected disk perfectly and will report if there are a lot of read errors on the original, which might be slowing you down. I never need anything else, but I stick to ISO images of disks and rarely create anything with custom menus and such.
Disk fragmentation can make a difference because you may not be buffering your disk writes (unbuffered writes are slower but safer - I can’t remember the default for XP). When writing to the CD, you have to have a buffer or some kind of “under-run” protection. So writing to the CD is, as pointed out already, a smooth continuous process.
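The under-run protection idea can be sketched as a producer/consumer pair (names and sizes here are made up for illustration): a reader thread fills a FIFO buffer from the hard disk while a writer thread drains it at the burner’s steady pace. As long as the buffer never runs dry mid-burn, the laser’s continuous stream is never interrupted.

```python
import queue, threading

SECTORS = 50
buf = queue.Queue(maxsize=16)   # the burn buffer between disk and burner
burned = []

def reader():
    # Simulates reading sectors off the hard disk into the buffer.
    for i in range(SECTORS):
        buf.put(f"sector-{i}")  # blocks if the buffer is full
    buf.put(None)               # end-of-data marker

def writer():
    # Simulates the burner draining the buffer at a steady pace.
    while True:
        block = buf.get()       # if this ever starved mid-burn: under-run
        if block is None:
            break
        burned.append(block)

t1 = threading.Thread(target=reader)
t2 = threading.Thread(target=writer)
t1.start(); t2.start()
t1.join(); t2.join()

in_order = burned == [f"sector-{i}" for i in range(SECTORS)]
print(f"burned {len(burned)} sectors, in order: {in_order}")
```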
When burning to a disk there is no correcting a mistake. If the data going on is goofed, then that is that. Even a re-writable disk will not check as the data goes on (while you can re-write those disks, that process is a whole other thing, so it does not flip back and forth that I am aware of). On a write-once disk you are SOL no matter what if bad data goes on.
You can tell the software to check that the data is all good after the write, and I suspect if you tell it to do so and add that time to the write, it’ll take longer than copying the disk to your PC did.
When you copy to the hard drive, the system can check on the fly that the data was faithfully copied. If an error occurs, it can read the data again and overwrite the erroneous copy. Overall this will be slower than the write to the CD because it checks as it goes.
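That read-back verification can be sketched like this (file names invented for the demo): after copying, re-read both the source and the copy and compare checksums; on a mismatch, the copy step could be retried.

```python
import hashlib, os, shutil, tempfile

def sha256_of(path):
    """Hash a file in chunks so large files don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

with tempfile.TemporaryDirectory() as work:
    src = os.path.join(work, "original.bin")
    dst = os.path.join(work, "copy.bin")
    with open(src, "wb") as f:
        f.write(os.urandom(1 << 20))  # 1 MB of stand-in data

    shutil.copyfile(src, dst)
    # Verification pass: re-read both files and compare checksums.
    verified = sha256_of(src) == sha256_of(dst)
    print("copy verified:", verified)
```

The extra full read of both files is exactly the added time being described: verification roughly doubles the amount of reading done.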