Lagrange Error Bound Confusion

Suppose you have e^x and you want to make sure that the error between the Taylor polynomial (centered at 0) and the actual function is less than 0.000001 at x = 2. What degree must the Taylor polynomial be?

I understand that the error bound uses the max value of the (n+1)st derivative, but how would you find the max value if e^x has a range of (0, inf)? Would you use e^2 as the max value?

This is what I have been looking at.

P.S. I got the 14th-degree polynomial as my answer, and I am uncertain whether it is correct.

e^x is an increasing function. Therefore we know that, if c is between 0 and 2, then e^c is between e^0 and e^2, with a maximum value of e^2. So yes, use e^2, and the Lagrange bound becomes |R_n(2)| <= e^2 * 2^(n+1)/(n+1)!.
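If you want to check the degree numerically, here is a quick sketch in Python (the function name is mine, not from the thread) that finds the smallest n for which the Lagrange bound e^2 * 2^(n+1)/(n+1)! drops below 10^-6:

```python
import math

def min_degree(x=2.0, tol=1e-6):
    """Smallest n with e^x * x**(n+1)/(n+1)! < tol,
    i.e. the Lagrange error bound for e^x centered at 0."""
    M = math.exp(x)  # max of the (n+1)st derivative e^c on [0, x]
    n = 0
    while M * x ** (n + 1) / math.factorial(n + 1) >= tol:
        n += 1
    return n

print(min_degree())  # -> 14, matching the 14th-degree answer above
```

This confirms the 14th-degree guess: the bound at n = 13 is about 1.4e-6 (too big), and at n = 14 it is about 1.9e-7.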

Oh, okay, thank you. Although it's a small point, that clarified a lot.

Your problem, such as it is, is that expanding exp(x) into a Taylor series at 0 is not going to give a particularly good estimate of exp(2). That is why you have to take it to such a high degree.

If you allow yourself to use a different polynomial, you can approximate the function over your entire interval [0,2] using a polynomial of much lower degree. 7th degree ought to do it in your case.