Math Q: slope and average % change

I have to explain this to my boss. I’m not sure how, and I haven’t been able to Google someone else’s explanation.

I’ve got a bunch of data points, (time on x, dollars on y) to which I’ve applied a best-fit line using Excel (y=2.3582x + 87.407). To find the average percentage increase over time, I believe that I must divide the slope by the average of my data points’ y values.
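For concreteness, here's a rough sketch of what I'm computing (the data points are invented; numpy's polyfit stands in for Excel's trendline):

```python
import numpy as np

# Invented data standing in for the real sales points:
# months on x, dollars on y.
x = np.array([0, 3, 6, 9, 12, 15, 18], dtype=float)
y = np.array([88.0, 95.0, 101.0, 110.0, 115.0, 124.0, 130.0])

# Excel's linear trendline is an ordinary least-squares fit;
# numpy's polyfit with degree 1 does the same job.
slope, intercept = np.polyfit(x, y, 1)

# The formula in question: slope divided by the mean of the y values.
avg_pct = slope / y.mean()
print(f"y = {slope:.4f}x + {intercept:.4f}")
print(f"slope / mean(y) = {avg_pct:.4f} ({avg_pct:.2%} per month)")
```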

2 questions: Am I doing this right? Why is this the formula?

Is it a function or a scatterplot? If the best-fit line fits well enough (r > 0.5), just use the slope… unless I'm missing something. I don't really understand what you mean by “average %”. A percentage of what?

The slope is, by definition, the average increase over time. But if you're doing a linear fit, that increase is not a constant percentage. In your example, your slope is 2.36 dollars per period against a starting value of 87.4 dollars, which is about 2.7%. By the time you're making 1,000,000 dollars, though, the same 2.36-dollar increase is a negligible percentage.

This is an extreme case, of course.
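To make that concrete, here's a quick sketch plugging the fitted line from the original post into a few time values:

```python
slope, intercept = 2.3582, 87.407  # the fitted line from the original post

# A linear trend adds the same number of dollars each period,
# so the *percentage* increase shrinks as y grows.
for t in [0, 18, 120, 1200]:
    y = slope * t + intercept
    print(f"t = {t:>4}: y = {y:>10.2f}, increase = {slope / y:.2%} per period")
```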

So, yes, you can get an average by using the average of the y values, but it’s going to be a pretty meaningless number. It would be more meaningful to figure out each data point’s increase, and take an average of those numbers. That still won’t give you what you’re looking for, though. Give me a minute, I’ll try to find a compounding interest formula (it’s not compounding, but it’s similar).
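Something like this, with made-up endpoints; it finds the constant per-period rate that would carry the first value to the last:

```python
# Compound-growth analogue: the constant per-period rate r that satisfies
#   y_last = y_first * (1 + r) ** n,  i.e.  r = (y_last / y_first) ** (1 / n) - 1
y_first, y_last, n = 88.0, 130.0, 18  # made-up endpoints over 18 periods

r = (y_last / y_first) ** (1 / n) - 1
print(f"equivalent constant rate: {r:.2%} per period")
```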

If you’re actually looking for an average percentage change, you should take the logarithms of all the points, then fit a straight line to them. The slope of the line, when interpreted properly, tells you an average change in percentage terms.
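A rough sketch of that approach, reusing the invented data from above:

```python
import numpy as np

# Same made-up data as before: months on x, dollars on y.
x = np.array([0, 3, 6, 9, 12, 15, 18], dtype=float)
y = np.array([88.0, 95.0, 101.0, 110.0, 115.0, 124.0, 130.0])

# Fit a straight line to log(y) rather than y itself.
log_slope, log_intercept = np.polyfit(x, np.log(y), 1)

# log_slope is the average growth rate in log (continuous) terms.
print(f"slope of log fit: {log_slope:.4f} per period")
```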

Thanks for the replies. Perhaps it would help to explain the intent of the exercise. We are trying to apply a theoretical value to commercial property. One accepted way of doing this is to generate a multiple-regression model of the properties that have sold over a period of time to determine the effect of various characteristics on price, and then to apply that model to each property.

First, however, we need to determine how the market has changed over a specific period of time. I have gathered a bunch of sales, and scatter-plotted them. I’ve then applied a best-fit line. According to the authorities on these matters, this is one of four methods of estimating how the market has changed over time.

So, I have a scatterplot of all of these sales with zeroed time on x (that is, time is set to zero right around the first sales record, and months from that point are marked off along x). Dollars per square foot is on y.

What I'm trying to determine is the average percentage increase over this 18-month period, so that it can then be applied in later calculations. This average percentage increase, I believe, is the slope divided by the average of the y values (that gives a per-month figure). Or, over the whole period, it can be calculated by taking the last fitted value, (2.3582)(18) + 87.407, minus the first fitted value, (2.3582)(0) + 87.407, and dividing that difference by the average of the y values.
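To illustrate with numbers (the actual mean of the y values isn't given, so this sketch uses a placeholder for it):

```python
slope, intercept = 2.3582, 87.407
mean_y = 108.6  # placeholder: the real mean of the sale prices isn't given

first = slope * 0 + intercept   # fitted value at month 0
last = slope * 18 + intercept   # fitted value at month 18

total_pct = (last - first) / mean_y   # increase over the full 18 months
monthly_pct = slope / mean_y          # increase per month (= total_pct / 18)

print(f"total over 18 months: {total_pct:.2%}")
print(f"per month:            {monthly_pct:.2%}")
```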

I’m pretty sure that this formula works. I’m just not sure why I’m using the average of the y values. Any thoughts on that?

This. You'll have to take the exponential of the resulting slope to turn it back into an ordinary percentage change, though.
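For example, if the slope of the line fitted to the logged data came out to 0.0218 (yours will depend on your data):

```python
import math

log_slope = 0.0218  # slope of the log fit; illustrative value only

# exp(slope) - 1 converts the log-scale slope back into an
# ordinary percentage change per period.
print(f"average change: {math.exp(log_slope) - 1:.2%} per period")
```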