Statistics on the difference between projected and actual costs?

It’s well known that the cost of any plan rarely fits its initial budget. But on average, how much, in terms of percentage, is the inflation? I’ve tried quite a variety of keywords and phrase combinations, but my searches yielded no results with that % value. Could you help? I wonder whether there has been any research on the increase of final costs over planned / estimated values, using big data?

Personally I think having a somewhat definitive number will help tremendously in ballparking a project’s budget.

Look for “cost overruns”; they are well documented and researched.

They depend on the industry and types of projects.

One big area of cost overruns is IT projects, and you can find lots and lots of information on that on the internet.

Here’s just one example.

It works the other way around: it’s not how much the final cost exceeds the estimate (and inflation is the wrong word to use here), it’s how great a contingency you should be including at the beginning of the process to cover unknowns and risks as part of your estimate.

In complex engineering projects, like bridge construction, you start with an estimate including a contingency factor that could be 100% or more, i.e. more than double the likely end cost. This is narrowed over time through further design refinement and planning until the actual expected cost is known.

The base cost of building, e.g., a bridge is readily calculable and can be worked out to $X million per metre (most industries have estimator tools based on end data). What is not known at the initial planning stage is whether there are other necessary costs that need to be taken into account, say environmental, geotechnical, or hydrological for your bridge. To cover this, a contingency is added to the initial budget (if not, you’ll never find the money later when it’s actually got to be spent).

The contingency is reduced or refined as risks are investigated and their effects on the design understood as it progresses from strategic → concept → detailed (20% - 50% - 80%) design. Your actual cost is the cost to build, having accommodated risks and complexities. Cost to build is usually the fee set in a competitive contract tender, plus anything the contractor can still show was necessary but not costed in their bid (which means the builder is also pricing in their own risk of bidding high and not getting the job, versus going cheap and hoping to recoup costs otherwise, so it’s not the actual ‘real’ cost of the work).

Much depends on the process within which the projection is produced.

Public attention/concern about cost overruns is mostly focused on publicly-financed projects. But the process there is pretty much designed to produce large cost overruns.

In most developed countries, public procurement laws will require commissioning authorities to accept the lowest bid, unless there are demonstrable compelling reasons for accepting one of the others. Thus bidders are strongly incentivised to submit a bid that is as low as possible.

Since all the bidders compete in the same market to hire labour, source supplies and materials, etc., the opportunity for reducing costs by comparison with other bidders is limited. If you can genuinely come up with a new and more efficient way to carry out the project that other bidders haven’t thought of or cannot access, great, but that is not a common state of affairs. So how do you lower the amount of the bid?

By assumptions and exclusions, is the answer. You make aggressively optimistic assumptions about all the variables in the project, and then you say that if these assumptions are not borne out you will charge more. Individually, each assumption is (just about) credible, but the chances that they will all be borne out are vanishingly small. Thus cost overruns are all but inevitable. And the bidder who submits the lowest bid, with the maximal chance of cost overruns, will win the tender.

Thanks guys.

That’s a problem with the system, I agree. Using only one criterion, bid value, to make a big decision is pretty stupid even in a capitalist economy. But that’s a bit too broad for this topic; I’m looking for something I can apply to my own projects.

Does that mean it’s refined entirely in the design process, not during execution? And yes, I do want to use an approach similar to what you described. That is, after trying my best to think of every possible cost in every scenario and arrive at a preliminary number, I want to multiply it by a certain overrun% to come up with a final budget for, like, everything I plan.

Ah, the proper keyword helps! The paper you linked is really to-the-point, with nice graphs to assist too. Yet I find such papers to be rare in my searches, and the “general %” is still elusive, so I’m interested in how you did it. I assume you used Google? Could you share the details of how you arrived at these ‘better’ results? As @UDS1 put it, the papers are skewed toward IT & public construction projects, but surely there must be statistics on personal, familial, and (many) small-company projects? What’s the word for research that collects data from other studies to come up with a “big summary” type of paper? I definitely came across that term before, but can’t recall it. Trying “meta research” led to Facebook R&D…

It sounds like you’re trying to do the right thing, which is to pull the project apart and look at what could be a problem in the many moving parts that contribute to your new build. Although it’s hard to quantify, every dollar spent sorting a problem before you start building will be a lot cheaper than trying to fix it once you’ve started. If you can, pull up carpets, get into tight spaces, and look at the condition of everything beforehand. Don’t pull up the carpet on Construction Day 1 and find the floorboards were replaced with papier mâché.

For what it’s worth, a builder (who is unusually rigorous about his cost control and therefore will never be an overnight millionaire) told me that half of his jobs come in within 10% of his quote, another 20% within 20%, and almost all within 50%. He said that if a job has a 50% over-run in his line of work, the problem almost certainly has nothing to do with the condition of the building or anything that money could have fixed more cheaply, but is a problem between the client’s brain and their mouth.
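Anecdotal figures like that can be treated as a rough empirical distribution and turned into a contingency choice. A minimal sketch in Python, with the numbers loosely transcribed from the builder’s anecdote (illustrative only, not real data):

```python
# Rough empirical CDF of cost overruns, loosely based on the
# builder's anecdote above (illustrative numbers, not real data).
OVERRUN_CDF = [
    (0.10, 0.50),  # 50% of jobs finish within 10% of the quote
    (0.20, 0.70),  # a further 20%: cumulatively 70% within 20%
    (0.50, 0.95),  # "almost all" within 50%
]

def contingency_for(confidence):
    """Smallest overrun bucket that covers at least `confidence` of jobs."""
    for overrun, cum_prob in OVERRUN_CDF:
        if cum_prob >= confidence:
            return overrun
    return OVERRUN_CDF[-1][0]  # beyond observed data: use the largest bucket

print(contingency_for(0.70))  # -> 0.2 (budget +20% to cover ~70% of jobs)
```

With data from your own past projects in place of the anecdote, the same lookup gives a contingency matched to how much risk you’re willing to accept.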

Another interesting statistic would be the relationship of cost overrun to the process by which those are paid. If a project is guaranteed to be fully funded by the funding party, and the contractor does not eat any portion of the overrun, then instinct tells me the overruns will be many and much.

Mitigated, usually/sometimes, by the requirement that the bidder show they have a track record of being able to do the job. I can’t just make up a bridge bid, submit it, and then when I win the bid start looking for people who know how to design and build bridges. Usually some demonstrated capability is needed.

But yes, most allowance for contingency is based on past experience and is factored into the estimate. Also, most major projects are done in stages: the design for the new aircraft or bridge is put out for bid and completed before the actual construction starts, and as Banksiaman points out, at each successive stage the risk is assessed and the possible contingencies identified, so they should be less vague.

Are you looking for “metadata?”

Do you have a specific project you are budgeting for, or is this just a general interest activity?

I’m not sure this approach is optimal if you have a specific project, because what’s the next step if you find the average cost overruns for (insert your project here)?

As others have noted, there are averages, but there are also projects which are much higher.

It may be helpful to look for more specific advice, which can then help narrow down potential problems.

What TB said. The average might not be very useful information here, because averages can mask huge variations and because, while the occurrence of contingencies may be randomly distributed, how well they are factored into project budgets is not. Much will depend on the nature and likely timescale of the project — some are much more susceptible to contingency-driven blowouts than others — and even more will depend on the quality of the cost projection process and the risk tolerance of the person conducting it.

Thanks again, informative as always. I have a lot of projects both in practice and in my mind, ranging from featherweight to staggeringly huge. Lazy as I am, I just planned to slap that overrun% on and call it a day; but you guys convinced me that medium and big projects need specialized case studies. Still, I’m of the opinion that for small and personal stuff, where research is not always cheap or available, a general % is more beneficial. The concern about random contingency spikes could be alleviated by adding yet another layer of one standard deviation over that overrun%, and maybe setting a cutoff where, if costs rise over, say, twice the calculated budget, then it’s off: canceled. What do you think?
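For what it’s worth, that heuristic (mean overrun plus one standard deviation, plus a hard cancellation threshold) is easy to sketch. All the percentages below are placeholders; real values would have to come from data on projects like yours:

```python
def plan_budget(base_cost, mean_overrun=0.25, overrun_sd=0.125, cutoff_mult=2.0):
    """Pad a preliminary estimate by (mean overrun + 1 SD), and set a
    kill threshold at cutoff_mult times the padded budget.
    All default percentages are illustrative placeholders, not statistics."""
    budget = base_cost * (1 + mean_overrun + overrun_sd)
    kill_threshold = budget * cutoff_mult
    return budget, kill_threshold

budget, kill = plan_budget(10_000)
print(budget, kill)  # 13750.0 27500.0 -> cancel if actual spend passes 27,500
```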

Anyway, to heed your advice, here’s a case in point that’s most immediate to me: I’m going abroad for education. Of course, the university lists the COA for a typical foreign student on its website, but already I have experienced many types of ‘auxiliary’ fees that seem not to be accounted for in that number. The cost of this whole thing is significant enough to worry me that, if the overrun% is above a certain threshold, I’ll not be able to complete it.

I understand your dilemma, but I’m not sure that an average will be helpful to you. Suppose you find an average cost overrun on household projects with a budget of $10k-$25k of (picking a figure out of the air) 25%. That tells you next to nothing at all about what the cost overrun on your two-year Master’s degree in Paraguay will or might be. It gives you no reassurance whatsoever that you can complete the Master’s at a cost you can afford. Deciding that you will abandon the Master’s if the costs blow out by (say) 50% is something you can do now, if you want, but you can do it without knowing what the average blowout is. The salient datum is not the average blowout; it’s what you can afford/what the Master’s is worth to you.

In theory, what knowing the average blowout might do is enable you to get a sense of how likely it is that you will have to abandon. But, really, I think there are better ways to evaluate that. Rather than average cost overruns, I would try to find people who have completed a similar degree in Paraguay and talk to them about what it cost them — not what they budgeted and what the overrun was, but simply what was the cost. And I’d do other research (which you’re obviously already doing) on what things (that will be relevant to you) cost in Paraguay.

Looking for a general answer for cost overruns and then applying that to your specific project seems like a classic example of garbage in = garbage out.

Good luck with that and let us know how it goes.

Big public projects usually require political support. To persuade legislators to approve the billions required to build your high-speed rail link (HS2 - UK) or massive highway interchange (WTC Rail Station - USA) the projected cost is minimised.

Once approval is granted, all kinds of unforeseen costs come to the surface. These can be anything from the need to tunnel under ancient woodland and seriously underestimating the costs of acquiring land, for the former, to “a morass of politics and government” with conflicting demands, for the latter.

I think that most of us assume that the projected cost of these projects will be less than half the final cost.

Whether it’s a large public works or simple home reno, quite often “overruns” are a result of changes in the project requirements and specs. (I recall one fellow on a home renovation show talking about “money pits” and said one of the bigger problems was owners who start asking for modifications - “make the shower bigger”, “Can I replace that with a longer granite counter, move the sink” -once the project was well under way, then are shocked at what it cost to make changes).

For IT, it was known as “scope creep”. One project I was involved in was tracking lab samples analyzed. Then: “can it also track analysis machine calibration?” and “can it track machine maintenance history…”. No! It’s a database of samples analyzed.

For reasons I don’t understand, many estimations seem to follow Hofstadter’s law which says “it will take longer than you expect, even when you take into account Hofstadter’s law”.

My best guess is that the reason Hofstadter’s law often applies is that consciously or subconsciously everyone shaves estimates and semi-expects over-runs. Meaning that estimates are always somewhat under, and actuals are always allowed to be somewhat higher than the estimate, because underneath it all, that’s what we expect.

Others have already pointed out that there is probably no general answer to your question. But on top of that, even if there was some usefully generalised average over-run percentage, adding that percentage to your projects will probably just result in the over-run increasing.

My appreciation.

I assume that if a general overrun% exists, then the average blowout (spike / deviation, if I understand correctly?) will also be available. As for the last point, I have to completely agree. The problem (for me only) is that the MA’s worth is also really hard to calculate due to many unclear variables. I’d love to have another pair of objective eyes to shine a light on that, too; but it’s uncomfortable to share here; perhaps some PMs are more appropriate. Would you mind helping me with this? I must say beforehand that you might encounter frustratingly vague concepts, and the conversation might inflate to a surprising proportion, so I’m perfectly OK if you say no.

I think that’s a pretty common heuristic that some people use. The issue with this manual data-collecting approach is that the sample size is usually small, so any spike could really skew (and screw) it. Meanwhile, enlarging the database is very costly in and of itself. Nonetheless, I’ll take your advice and ask some people I’m already in contact with :slight_smile:

Haha, shame on me. In this case, not finding out a general % might be a blessing in disguise.

That I can relate to, for I’m in an IT project as a customer. I partly prompted some scope creep myself, lol. Thankfully the programmer is pretty cool with it.

That’s… thought-provoking. My knee-jerk reaction is that the “new” overrun will be smaller. Using UDS1’s example above, if one thinks of all scenarios, calculates the preliminary number, and slaps 25% contingency on top, then the ‘unavoidable’ new overrun would likely be 25% of that extra 25%, or 6.25%, a more acceptable range for face-saving or whatever purpose.
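Spelling out that arithmetic (the 25% figures are the hypothetical ones from the posts above, not real statistics):

```python
base = 100_000                    # preliminary estimate after scenario analysis
contingency = 0.25                # 25% contingency slapped on top
budget = base * (1 + contingency)             # 125,000 planned budget

# If the residual overrun is again ~25%, but applied only to the
# contingency slice rather than the whole budget:
residual = contingency * contingency          # 0.0625, i.e. 6.25% of base
final = base * (1 + contingency + residual)   # 131,250 worst-case spend
print(budget, final)  # 125000.0 131250.0
```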

I’m certainly not saying you are wrong because I really don’t know the answer to all of this but I suspect it may not work like that.

If I’m right about why over-runs occur even when we allow for over-runs in estimation (a big “if”) then I suspect “anchoring” applies - we subconsciously anchor off the estimate and consider a certain degree of over-run to be acceptable. And if that’s right (lots of “ifs” here…) then the issue is whether we anchor off the overall estimate or the estimate of over-run.

In other words, if we estimate that a project is going to cost $100,000 but then we add on an allowance for overruns of $50,000, do we “anchor” off the $50,000 when subconsciously letting ourselves overrun, or do we anchor off the $150,000 overall estimate?

If I had to guess, I’d guess the latter. Meaning that by increasing the overall estimate all we have done is increase the extent to which we will overrun.

I’m happy to help if I can (and feel free to PM me) but I’m not sure that I can actually be much help. The budgeting/projecting process for any project is hugely specific to the nature of the project, so general information about budget blowouts on projects of all kinds is basically useless to you here. Running with my example, it seems to me you are looking for information about budget blowouts for people who do Master’s degrees in Paraguay. OK, we could broaden that slightly (postgraduate degrees in Latin American universities?) but, still. What you’re looking for is a study that:
(a) looks at what Latin American universities tell prospective postgraduate students from overseas that it is likely to cost them to get their degrees, and
(b) looks at what it costs overseas students to get postgraduate degrees in Latin America, and
(c) identifies the average variance or range of variances between (a) and (b).
You can then apply that variance to the projection you have received to get a sense of the likely actual cost.
You’d be lucky, obviously, to find research of that kind.
But I think your approach is overengineered. What you are actually interested in is (b), what it typically costs. That information might well be available other than in the very specific studies of variance between budget and actuality that you are focussing on. I think you’ve adopted this focus because you’ve got a projection from your prospective university, and you are trying to make it more useful to you. But it may be better to accept that, actually, it’s of limited use to you and to supplement it with other data about what postgraduate degrees in Latin America cost. It seems to me, intuitively, that that data will be easier to source than data about variance between projections and actual costs for postgraduate degrees in Latin America.

Yes, one of the biggest drawbacks of IT projects is the constant creep - but the project won’t go live until all those pieces are implemented and tested. IT has over time compensated for this with versioning: the first version is X, the next version adds some Y capability, the third version refines it and adds Z, etc. Unfortunately, this simply turns a project into an open-ended endeavour.

To quote the Frantics album Boot to the Head, it is “not a path leading to a door but a way leading forever to the horizon.”

I have to agree that, for public construction especially, announcing an already inflated number is quite a bad move. I hope that, for individuals, if we’re well aware of the prepared overrun amount, then our conscious minds will be stronger than our subconscious biases.

I PM’ed :slight_smile: