I think I have a fair sense of what ‘opportunity cost’ means. What I don’t understand is how an accountant, or other financial type, actually goes about estimating the dollar value of the opportunity cost for a given scenario. In other words, how do you determine the “cost” in opportunity cost?
Do you estimate it by assuming the capital expenditure was, instead, simply invested at current (and predicted) rates of return? Do you use historical comparisons? I am certain it’s not that simple, so I would be interested to learn how it’s really done.
Theoretically, you compare the (expected) benefit of the expenditure with the (expected) benefit of the next best use of the resource. As you guessed, determining the next best use is the genuinely challenging part.
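To make that concrete, here is a minimal sketch of the comparison. The alternatives and dollar figures are entirely hypothetical; the point is only that the opportunity cost of the chosen use is the expected benefit of the best alternative you gave up:

```python
# Hypothetical expected benefits for competing uses of the same capital.
candidate_uses = {
    "buy_new_machine": 120_000,   # the project under consideration
    "expand_warehouse": 95_000,   # an alternative use of the same funds
    "index_fund": 80_000,         # a passive "default" investment
}

chosen = "buy_new_machine"

# Opportunity cost = best expected benefit among the alternatives NOT chosen.
opportunity_cost = max(v for k, v in candidate_uses.items() if k != chosen)
net_advantage = candidate_uses[chosen] - opportunity_cost

print(f"Opportunity cost of '{chosen}': {opportunity_cost:,}")
print(f"Net advantage over the next best use: {net_advantage:,}")
```

The hard work, of course, is not the subtraction at the end but producing credible expected-benefit numbers for each alternative in the first place.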
To keep things simple, an organization’s management can just set a Minimum Acceptable Rate of Return (MARR). Ideally, the MARR is set with an eye toward what the organization could readily earn from a default investment, i.e. the opportunity cost, with some fudging for risk and risk tolerance. It will vary with market conditions and industry.
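One common way this plays out is using the MARR as the discount rate in a net present value (NPV) calculation: if the project’s NPV at the MARR is positive, it beats the default use of the capital. The sketch below assumes a hypothetical 12% MARR and made-up yearly cash flows, just to show the mechanics:

```python
def npv(rate: float, cash_flows: list[float]) -> float:
    """Discount yearly cash flows (year 0 first) at the given rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

marr = 0.12  # hypothetical 12% minimum acceptable rate of return

# Initial outlay followed by expected yearly returns (all figures hypothetical).
project = [-100_000, 30_000, 40_000, 45_000, 35_000]

value = npv(marr, project)
print(f"NPV at a {marr:.0%} MARR: {value:,.0f}")
print("Accept" if value > 0 else "Reject", "relative to the default use of capital")
```

In this framing the opportunity cost never appears as a separate line item; it is baked into the discount rate itself.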
In reality, how much the person doing the analysis (or their boss) likes the project in question seems to affect what opportunity cost is assumed.