Thanks for the replies. Perhaps it would help to explain the intent of the exercise. We are trying to assign a theoretical value to commercial property. One accepted way of doing this is to build a multiple-regression model from the properties that have sold over a period of time, use it to determine the effect of various characteristics on price, and then apply that model to each property.
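For anyone unfamiliar with the approach, here is a minimal sketch of what that multiple regression might look like. All of the data, the choice of characteristics (square footage, age, months since first sale), and the subject property are invented for illustration; real work would use the actual sales records and a proper model specification.

```python
import numpy as np

# Hypothetical sales data. Columns: square footage, age in years,
# months since the first sale in the study period.
X = np.array([
    [2000, 10,  1],
    [3500, 25,  4],
    [1500,  5,  7],
    [4200, 30, 10],
    [2800, 15, 14],
    [3100, 20, 18],
], dtype=float)
prices = np.array([180000, 260000, 150000, 300000, 240000, 270000], dtype=float)

# Add an intercept column and solve the ordinary least-squares problem.
A = np.column_stack([np.ones(len(X)), X])
coefs, *_ = np.linalg.lstsq(A, prices, rcond=None)

# Apply the fitted model to an unsold property: a hypothetical
# 2,500 sqft, 12-year-old building valued as of month 18.
subject = np.array([1.0, 2500, 12, 18])
estimate = subject @ coefs
print(f"theoretical value: ${estimate:,.0f}")
```

The coefficients then tell you how much each characteristic contributes to price, and applying them to an unsold property yields its theoretical value.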

First, however, we need to determine how the market has changed over a specific period of time. I have gathered a set of sales, scatter-plotted them, and fitted a best-fit line. According to the authorities on these matters, this is one of four accepted methods of estimating how the market has changed over time.

So, I have a scatterplot of all of these sales with zeroed time on x (that is, the x-axis starts at zero right around the date of the first sales record, with subsequent months counted from that point). Dollars per square foot is on y.
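The best-fit line through such a scatterplot can be computed with an ordinary least-squares fit. A sketch with invented data (months since first sale on x, dollars per square foot on y; the actual sales figures are not in the post):

```python
import numpy as np

# Hypothetical sales: months since the first sale (x) and
# price per square foot in dollars (y).
months = np.array([0, 2, 3, 5, 7, 9, 11, 13, 15, 18], dtype=float)
price_psf = np.array([84, 88, 91, 95, 101, 105, 110, 114, 120, 127], dtype=float)

# Degree-1 polynomial fit = the ordinary least-squares best-fit line,
# y = slope * x + intercept.
slope, intercept = np.polyfit(months, price_psf, 1)
print(f"slope = {slope:.4f} $/sqft per month, intercept = {intercept:.4f}")
```

The slope is the fitted change in dollars per square foot per month, and the intercept is the fitted price level at month zero.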

What I’m trying to determine is the average percentage increase over this 18-month period, so that it can then be applied in later calculations. I believe the per-month percentage increase is the slope divided by the average of the y values, and the total percentage increase over the period is the last fitted value, (2.3582)(18) + 84.407, minus the first fitted value, (2.3582)(0) + 84.407, divided by the average of the y values.
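The arithmetic with the fitted coefficients from the post can be laid out like this. One caveat: the average of the actual y values isn’t given in the post, so the sketch stands in for it with the fitted value at the midpoint of the period (for a least-squares fit, the mean of y equals the fitted value at the mean of x, so this is exact only if the sales are spread evenly over the 18 months):

```python
# Fitted line from the post: y = 2.3582*x + 84.407, over 18 months.
slope, intercept, months = 2.3582, 84.407, 18

first = slope * 0 + intercept              # fitted $/sqft at month 0
last = slope * months + intercept          # fitted $/sqft at month 18
avg_y = slope * (months / 2) + intercept   # stand-in for the mean of y

total_pct = (last - first) / avg_y         # total change over the period
monthly_pct = slope / avg_y                # per-month change

print(f"total: {total_pct:.1%}, per month: {monthly_pct:.1%}")
```

With these coefficients the total increase comes out to roughly 40% over the period, about 2.2% per month; note that the two formulas differ by exactly a factor of 18, since the change over the period is the slope times 18.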

I’m pretty sure that this formula works. I’m just not sure why I’m using the average of the y values. Any thoughts on that?