Math: Integrals harder than derivatives, why?

Well, yes, of course the function is non-analytic at the origin and I’m stuck with a Taylor series that only converges over a finite radius. But then, the same is true when I expand (1-x)[sup]-1[/sup], which is certainly a nice happy function to expand in series when the series is convergent.

I guess I was reading Chronos as making a stronger statement than he meant (this happens at 1:30 in the morning). Or else my interpretation of “represent a function as a Taylor series” as “represent the function as a Taylor series in the disk in which the Taylor series converges” was too naive.

I have seen an infinitely differentiable, non-zero function whose Taylor series about the origin converges to zero everywhere, but I don’t have those notes anymore.
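The standard example of such a function (a sketch, assuming sympy is available) is f(x) = exp(-1/x[sup]2[/sup]) extended by f(0) = 0: every derivative vanishes at the origin, so its Maclaurin series is identically zero even though the function itself is nonzero away from 0.

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(-1 / x**2)  # extended by f(0) = 0

# The function and its first few derivatives all tend to 0 at the origin,
# so every Maclaurin coefficient is 0 -- yet f is nonzero for x != 0.
g = f
for n in range(4):
    assert sp.limit(g, x, 0) == 0
    g = sp.diff(g, x)

print("first four derivatives all vanish at the origin")
```

The same pattern continues for all higher derivatives: each one is exp(-1/x[sup]2[/sup]) times a rational function, and the exponential decay wins every time.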

My two cents, on a rather unrelated note.

It has to do with the fact that the set of functions that we deem elementary is fairly limited and totally arbitrary. Elementary functions’ antiderivatives are not necessarily elementary, like they said before…

And I’m not buying the whole notion that derivative is, somehow, “less complex” than the original function. How is exp(x) less complex than exp(x)? :slight_smile:

g8rguy, I would take “represent a function as a Taylor series” to mean that the entire function is represented by one Taylor series, not that restrictions of the function to certain disks can be represented by one Taylor series.

That’s what I thought at first too, but it happens with a lot of stuff. Say we just call polynomials our elementary functions and leave it at that. At that point, everything’s happy. The polynomials are closed under both differentiation and integration. But that’s the last time that happens.

As soon as we consider anything else to add to the mix, we break out of our set, always through integration, never differentiation. Rational functions? No good, because 1/x forces us to define a new function. Composing with exponentials gives us exp(x[sup]2[/sup]), whose integral has no elementary form at all. What about trig or hyperbolic trig? Again no good. The integral of sec(x) involves a logarithm. Then there’s sin(x)/x, sqrt(cos(x)), or a whole host of others.
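A quick way to watch these examples break out of the elementary set (a sketch assuming sympy; the special functions a CAS falls back on stand in for “non-elementary” here) is to ask for the antiderivatives directly:

```python
import sympy as sp

x = sp.symbols('x')

# integrate(exp(x**2)) comes back in terms of the imaginary error
# function erfi -- a special function, not an elementary one.
assert sp.integrate(sp.exp(x**2), x).has(sp.erfi)

# sin(x)/x integrates to the sine integral Si(x), again non-elementary.
assert sp.integrate(sp.sin(x) / x, x).has(sp.Si)

# sec(x) does integrate in elementary terms, but only by dragging
# a logarithm into the answer, exactly as said above.
assert sp.integrate(sp.sec(x), x).has(sp.log)

print("each antiderivative escapes the function class it started in")
```

Differentiating any of these, by contrast, stays comfortably inside the original class: the chain and product rules never force a new function into existence.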

Is there any set of functions which is closed under integration but not differentiation? I think I’m missing something by asking this…

Fair enough, then.

The set of functions asymptotically bounded below by x[sup]2[/sup].

Could you explain what you mean by that, in a little more detail?

Any function f such that the limit as x goes to infinity of f(x)/x[sup]2[/sup] is equal to infinity is asymptotically bounded below by x[sup]2[/sup].
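A small sanity check of that definition (a sketch assuming sympy, with x[sup]3[/sup] as a convenient member of the set):

```python
import sympy as sp

x = sp.symbols('x')
f = x**3

# x**3 is in the set: x**3 / x**2 -> infinity.
assert sp.limit(f / x**2, x, sp.oo) == sp.oo

# Its antiderivative x**4/4 stays in the set...
F = sp.integrate(f, x)
assert sp.limit(F / x**2, x, sp.oo) == sp.oo

# ...but its derivative 3*x**2 falls out: the ratio tends to 3, not infinity.
assert sp.limit(sp.diff(f, x) / x**2, x, sp.oo) == 3

print("closed under integration here, but not under differentiation")
```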

Alright, that makes sense. I would have been surprised if the answer was no. Wouldn’t the set of functions asymptotically bounded below by 1 also qualify?

Note that asymptotically speaking, the set of functions bounded below by one is the same as the set of functions bounded below by any function which is bounded above by a constant.

The answer’s yes. I think you’d find that the antiderivative of f is always an asymptotic upper bound for f (or that they’re asymptotically incomparable).
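One way to spot-check that last claim (a sketch assuming sympy): for f = x[sup]3[/sup] the antiderivative dominates f outright, while for f = exp(x) the ratio only tends to 1, so “upper bound” has to be read up to a constant factor.

```python
import sympy as sp

x = sp.symbols('x')

# For f = x**3, F = x**4/4 and F/f = x/4 -> infinity: F strictly dominates f.
assert sp.limit(sp.integrate(x**3, x) / x**3, x, sp.oo) == sp.oo

# For f = exp(x), F = exp(x) and F/f -> 1: an upper bound, but only
# up to a constant, since integration fixes exp(x) in place.
assert sp.limit(sp.integrate(sp.exp(x), x) / sp.exp(x), x, sp.oo) == 1

print("the antiderivative never falls asymptotically below f in these cases")
```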