Excel - VBA Programmers - Has this happened to you?

I’ve been crawling through programming forums and have been Google-happy all morning, but I haven’t found a satisfactory answer.

Here’s the situation.

I write a lot of Excel VBA Macros. More than once I’ve written macros that do the following -

  1. Open a second Excel File
  2. Modify it (add, subtract, compute data)
  3. Save and Close the second Excel file

Now with the magic of for/next looping these programs may sometimes modify hundreds of files at a time.
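
For context, the skeleton looks roughly like this - a minimal sketch, where the file list in column A and the edit itself are just placeholders for whatever the real job does:

```
Sub BatchUpdate()
    Dim wb As Workbook
    Dim r As Long

    ' File paths listed in column A of the sheet running the macro (example layout)
    For r = 1 To 400
        ' 1. Open the next file in the batch
        Set wb = Workbooks.Open(ThisWorkbook.Sheets(1).Cells(r, 1).Value)
        ' 2. Modify it (whatever that particular job needs)
        wb.Sheets(1).Range("B2").Value = wb.Sheets(1).Range("B2").Value + 1
        ' 3. Save and close it, then move on to the next file
        wb.Close SaveChanges:=True
    Next r
End Sub
```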

The problem - The program errors out randomly on a save. If I choose Debug, it shows that it halted on the save statement. When I hit Resume, the program saves the file and then chugs along like nothing ever happened.

This has happened with files stored on a network drive and with files stored locally on C: (though network drives appear to have a higher rate of occurrence). It’s occurred on computers running Windows XP and Windows 7. It happens with both Save and SaveAs code.

This happens so much that I will no longer write a Save or SaveAs statement without wrapping it in an error-trapping routine that allows multiple attempts to save. The programs don’t bomb if I give them 10 chances to save a file; I haven’t tested whether 2 chances would be enough.
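
The wrapper itself is nothing clever - roughly this, where 10 attempts is an arbitrary number that has been enough so far:

```
Sub SaveWithRetry(wb As Workbook)
    Dim attempts As Long

    For attempts = 1 To 10
        On Error Resume Next
        wb.Save
        If Err.Number = 0 Then
            On Error GoTo 0
            Exit Sub            ' saved cleanly - no delay on a first-try success
        End If
        Err.Clear
        On Error GoTo 0
    Next attempts

    wb.Save                     ' still failing after 10 tries - let it error where I can see it
End Sub
```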

So it’s a problem with a fairly painless workaround, but I still can’t figure out what causes it.

I suspect there is some kind of latency issue, since it happens more often when writing to a network drive than to a local drive. It might also be some kind of buffering problem. Maybe the previous file isn’t finished writing when the next file is queued.

Anybody else have this problem?

I’ve done a lot of VBA… can’t say I remember this specific problem but this sort of thing happens all the time :mad:

What is the specific error message?

You might insert a DoEvents wait loop before you open the next file. Say, for 5 seconds.
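
Roughly this, between closing one file and opening the next (5 seconds is just a guess at how long the drive needs to catch up):

```
Dim finish As Single
finish = Timer + 5              ' wait about five seconds
Do While Timer < finish
    DoEvents                    ' let Excel and the OS finish any pending work
Loop
```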

Surprisingly, it’s not always the same error message. The message usually says that the file is “in use” or doesn’t exist.

My error trap is basically an “if error then” with a counter. This works nicely in that there are no delays if the file saves on the first try.

I don’t want to use a “wait” statement because that would just slow the batch down (adding a few seconds to each save across 400 files can add half an hour to the run). Sometimes I have to run these things during the day, and it’s nice to get the job done so that I can get the night batch set up.

I’d guess that this happens when the file is in use. Some other person or process is trying to access the file, and that’s stopping you from writing to it. If somebody else has the file open, it may put a lock on the file that prevents you from writing it. Or it may be an automated process that creates a backup copy of all files.

Is Excel configured to autosave? Since you’re holding the second file open as you change it, it might be autosaving at the time you try to do a save. And yes, that will cause a hiccup.

I like your error handling with multiple attempts. That’s probably the best way around it. I’d include a sleep function for a second or two in the error handler as well.
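
Something along these lines, on top of the retry loop you already have - the pause only fires after a failed attempt, so a clean first save costs nothing (the second-and-a-half figure is just a guess):

```
' Declared once at the top of a standard module
#If VBA7 Then
    Private Declare PtrSafe Sub Sleep Lib "kernel32" (ByVal dwMilliseconds As Long)
#Else
    Private Declare Sub Sleep Lib "kernel32" (ByVal dwMilliseconds As Long)
#End If

' Called from inside the error handler, after a save attempt has failed
Sub PauseBeforeRetry()
    Sleep 1500                  ' a second and a half, then let the loop try the save again
End Sub
```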

Yes it is! I hadn’t thought of this. I may try to test it later this week. I’m getting a second computer today or tomorrow, and I might just be able to give this a with-or-without test.

Thanks for the suggestion.

What difference does it make? You’ve solved your problem by adding error trapping, which you should have locally for all I/O operations anyway. Do you think that is the only possible cause of an error?

The difference is that I will understand why the error happened. I’m an engineer by training and a programmer by necessity so sometimes common sense programming habits elude me. :smiley:

You’ve got the right idea, though. ‘Engineer’ is curable. Record the unknown errors that you trap.
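
Even something as crude as appending a line to a text file inside the trap would do it - a sketch, where the log path and fields are just examples:

```
Sub LogSaveError(filePath As String, errNum As Long, errDesc As String)
    Dim f As Integer
    f = FreeFile
    ' Append one tab-delimited line per trapped error
    Open "C:\Temp\save_errors.log" For Append As #f
    Print #f, Now & vbTab & filePath & vbTab & errNum & vbTab & errDesc
    Close #f
End Sub
```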

‘Engineer by training’ :smiley: Good one!