I’ve been maintaining a character creation spreadsheet for my d20 Star*Drive game. It’s pretty much entirely free of macros, because I don’t know how to code in Visual Basic. I wanted to expand the sheet so that it could also support a Dawning Star campaign, but at some point I noticed that adding the new data was really bogging down the sheet: it took several seconds to recalculate after every change.
I’ve tried to develop ways to minimize the number of mathematical operations needed by the sheet, but I’m starting to get the impression that what really slows the calculation down is having more data in the cells.
I’m wondering if anybody has advice about how to maximize the efficiency of an Excel sheet. Is there a significant gain to be had if I learn Visual Basic?
Here’s the sheet as it was before adding all the new data:
http://www.ashmead.net/~angel/Star*Forge/
Thanks.
I don’t see a spreadsheet at the link you provided.
The speed at which sheets are computed is highly dependent on what you’re doing, and your post doesn’t say what kind of formulas you’re using, how much data vs. formulas, chains of dependencies through the formulas, how many worksheets, etc., etc.
I had a very complex spreadsheet set up to project staffing for a group supporting several projects over more than a year. The spreadsheet was 12 MB and linked to another spreadsheet. It had about 30 worksheets and several macros (to program buttons). Each worksheet had at least 1,000 cells calculated with formulas.
When I changed a data cell, it seemed to determine very quickly which cells were affected and updated them quite quickly, with response measured in seconds. BTW, “several seconds” is not a long time for a complex spreadsheet to recalculate.
Whether using VB would help you or not depends entirely on what you’re doing. If you try again with a link to the sheet, I could spend a few minutes browsing it and see if I can help.
The actual spreadsheet is linked under Download Now. Here is a direct link.
Most of the actual logic of the sheet uses the INDEX, VLOOKUP, and COUNTIF functions. There are ten or so sheets actually used for the interface, and as many holding data or doing the actual crunching of data.
I would have thought that my Rube Goldberg method for removing blanks from a list would be the thing slowing the system down, but I tried using a macro to sort the lists with the built-in sort function, and it was noticeably slower.
Wow, that’s pretty impressive.
INDEX, VLOOKUP, and COUNTIF are all expensive because they have to scan many cells and make decisions to do their job. But there’s not much in the way of alternatives. At first glance you seem to be doing things straightforwardly and appropriately. You have a lot of volume but only medium complexity.
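To illustrate why those lookup functions add up, here’s a rough sketch in Python (not Excel, and the feat data is invented): an exact-match VLOOKUP has to scan the table row by row, so a column of n lookup formulas over an n-row table does on the order of n × n comparisons, while indexing the data once makes each subsequent lookup cheap.

```python
def vlookup_exact(key, table):
    """Linear scan, like VLOOKUP(..., FALSE): O(n) work per call."""
    for name, value in table:
        if name == key:
            return value
    return None

# Hypothetical lookup table, just for illustration.
table = [("Dodge", 1), ("Mobility", 3), ("Spring Attack", 6)]
print(vlookup_exact("Mobility", table))  # 3

# Building an index once (a dict here; in Excel, sorting the table and
# using approximate-match lookups plays a roughly similar role) turns
# each later lookup into a cheap direct access instead of a rescan:
index = dict(table)
print(index["Mobility"])  # 3
```

The point is that the cost grows with both the number of lookup formulas and the size of the ranges they scan, which matches your observation that adding more data slows things down.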
Only one alternative I noticed: you have programmed dropdowns as controls. An alternative would be to use data validation instead. To do that, you have to define names for the lists you refer to (because you are referring to data on other worksheets). However, I don’t think that will make your calculations any faster, so it’s not a recommendation, just a noted alternative.
I didn’t pick up on what you meant by a “Rube Goldberg method for removing blanks from a list”. I haven’t done a very exhaustive look yet; I just scanned it for obvious things. It’s pretty darn big.
The functions I’m performing often require me to remove the blanks from a list, so that, for example, I can take a common list of items and generate two lists whose entries are culled by different criteria: you get a different sublist of feats for a given level than you do for a given character class.
Instead of leaving a lot of blanks in the list, I shift the culled items up in the sequence through an elaborate structure that finds each applicable item in the common list in the order it appears, ranks each item by its position in the culled list, then looks up each rank to see which entry corresponds to it. I don’t see any other way to do this. I figured this was a processor hog, but I swear just adding more data (or perhaps the problem is bigger arrays to process) is what’s slowing the thing down.
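For anyone trying to follow the rank-then-lookup scheme described above, here is a small Python sketch of the same idea (the feat names and the “applicable” test are made up for illustration): first rank each applicable row in order of appearance, then walk ranks 1..k and look up which source row holds each rank.

```python
# Hypothetical master list with blanks and a stand-in cull criterion.
master = ["Dodge", "", "Power Attack", "", "Toughness", "Mobility"]
applicable = lambda x: bool(x) and x != "Toughness"

# Step 1: rank applicable rows in order of appearance (0 = culled).
ranks = []
count = 0
for item in master:
    if applicable(item):
        count += 1
        ranks.append(count)
    else:
        ranks.append(0)

# Step 2: for each rank, find its source row and read the item. This
# rank-to-row search is the part that INDEX/MATCH-style formulas end
# up rescanning for every output cell, once per entry in the result.
compacted = [master[ranks.index(r)] for r in range(1, count + 1)]
print(compacted)  # ['Dodge', 'Power Attack', 'Mobility']
```

Each output entry triggers its own scan of the rank column, so the work grows with the list size squared, which fits the symptom that the sheet got dramatically slower as more data was added.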