Math help please?

How do you prove that d/dx ∑_{k=0}^∞ f_k(x) = ∑_{k=0}^∞ f_k'(x)? Thanks in advance!

Have you tried anything so far? This sounds like a homework question.

Any proof is going to depend on your axioms. What axioms are you using about limits and sums?

Same question was asked last night on Yahoo Answers.

LOLz at asking a limits proof question on Yahoo Answers. I’d be amazed if they could handle multiplying fractions.

Awesome internet stalking skills.

Apparently the answer is "v". You're welcome, OP.

Identical, right down to the sign-off: "Thanks in advance!"

If I were you, I’d look for a good analysis textbook. If this is a homework question, hopefully you already have one. Failing that, you might be able to google something up. I found PDFs here and here that may be of some help, but I didn’t have time to study them carefully.

The OP’s statement isn’t true in general, so there can be no proof.

Yeah, that’s what I wanted to say.

Here’s a trivial counterexample…

Let f_k(x) = 1. LHS is divergent while RHS is zero.

Is it? The fact that ∑_k f_k(x) diverges doesn't imply that d/dx ∑_k f_k(x) diverges; strictly speaking, the derivative of a divergent sum is simply undefined.
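For what it's worth, here's the counterexample written out in symbols (standard notation, nothing beyond what's already in the thread):

```latex
% Take f_k(x) = 1 for every k. The inner sum diverges, so the
% left-hand side is not defined:
\frac{d}{dx}\sum_{k=0}^{\infty} f_k(x)
  = \frac{d}{dx}\sum_{k=0}^{\infty} 1
  \quad \text{(undefined: the sum diverges),}
% while each term differentiates to zero, so the right-hand side is
\sum_{k=0}^{\infty} \frac{d}{dx} f_k(x)
  = \sum_{k=0}^{\infty} 0
  = 0 .
```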

This is all about the proper formal way to define a limit. It looks a lot like the kind of exercises I had 20 years ago (and since my formal mathematics skills have gotten rusty, I'll leave it to others).

This is obviously some exercise and some conditions are needed. Maybe boundedness; I don’t recall. I am an algebraist.

At the risk of doing someone's math homework: the required condition is, roughly, uniform convergence. More precisely, the standard theorem asks that the series of derivatives ∑ f_k'(x) converge uniformly (and that ∑ f_k(x) converge at at least one point); uniform convergence of the original series alone isn't enough.
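Here's a quick numerical sanity check of the theorem on a case where it does apply (my own illustration, not from the thread): for the geometric series ∑ x^k = 1/(1-x) on |x| < 1, the differentiated series ∑ k·x^(k-1) converges uniformly on any [-r, r] with r < 1, so differentiating term by term should reproduce d/dx 1/(1-x) = 1/(1-x)^2.

```python
def partial_sum_derivative(x, n):
    """Sum of the first n terms of the term-by-term differentiated
    geometric series: sum_{k=1}^{n-1} k * x^(k-1)."""
    return sum(k * x ** (k - 1) for k in range(1, n))

x = 0.5
exact = 1.0 / (1.0 - x) ** 2          # derivative of the closed form 1/(1-x)
approx = partial_sum_derivative(x, 200)
print(abs(exact - approx) < 1e-9)     # partial sums converge to the exact value
```

The tail terms decay geometrically, so 200 terms is far more than enough at x = 0.5; the point is just that the two sides of the OP's identity agree when the uniform-convergence hypothesis holds.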