Summing from 1 to n^x for various values of x

I’ve been working on and off on a formula that will generate the sum of n^x from 1 to n, given n and x. (This morning, I borrowed a classmate’s TI-92 and got the formulas for x = 4 through 10. I must have this calculator!) I looked on Wikipedia, and they said that the function to do that is recursive (indirectly). Is that proven?
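(For concreteness: the closed form being asked about is usually called Faulhaber’s formula. With the Bernoulli-number convention B_1 = +1/2, it reads

$$\sum_{k=1}^{n} k^x \;=\; \frac{1}{x+1} \sum_{j=0}^{x} \binom{x+1}{j} B_j \, n^{x+1-j},$$

which for x = 2 recovers the familiar n(n+1)(2n+1)/6.)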

Every Turing-computable function in N -> N is recursive.

Note the double meaning of “recursive.”

Def. 1: Something defined or computed in terms of itself.

Def. 2: Computable by a Turing machine (or equivalent) that halts on all inputs.

(In the 1920s, some people thought all computable functions could be defined in terms of simple recursion. Hence the historical basis for using “recursive” to mean “computable.”)

The first definition applies here.

The Bernoulli numbers can be computed recursively. But they don’t have to be. In fact, nothing has to be computed recursively at all, so there can never be a proof stating that recursion is required to compute anything. (And it is usually a Very Good Thing to avoid recursion for efficiency reasons.)
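To make that concrete, here is a minimal C sketch (an illustration, not anyone’s posted code) that evaluates the standard Bernoulli recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0 (for m >= 1, with B_0 = 1) using a plain nested loop: the definition is recursive, but the computation need not be.

#include <stdio.h>

#define NMAX 12

int main(void)
{
    /* B[m] holds the Bernoulli number B_m; this recurrence yields the
       B_1 = -1/2 sign convention. Odd entries past B_1 are exactly zero,
       though floating point will show them as ~1e-17 here. */
    double B[NMAX + 1];
    B[0] = 1.0;
    for (int m = 1; m <= NMAX; m++) {
        double acc = 0.0;
        double c = 1.0;                     /* c = C(m+1, j), starting at j = 0 */
        for (int j = 0; j < m; j++) {
            acc += c * B[j];
            c = c * (m + 1 - j) / (j + 1);  /* advance c to C(m+1, j+1) */
        }
        B[m] = -acc / (m + 1);              /* solve, using C(m+1, m) = m + 1 */
    }
    for (int m = 0; m <= NMAX; m++)
        printf("B_%d = %g\n", m, B[m]);
    return 0;
}

(Exact work would use rational arithmetic instead of double, but the shape of the computation is the same.)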

Define “simple recursion”. The fact that Lambda calculi and Turing machines are equivalent proves that their classes of computable functions are identical, and (AFAIK) nobody’s come up with a more general sense of computability.

One (of several) ways to define a subset of the computable functions using recursion is the class of primitive recursive functions. A computable function that is not primitive recursive was exhibited by Ackermann in 1928.
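For reference, the two-argument variant (due to Rózsa Péter) that is usually meant today is short enough to sketch in C; it is total and computable, yet eventually outgrows every primitive recursive function:

/* Ackermann-Peter function. Total and computable, but not primitive
   recursive. The recursion depth explodes quickly: keep m <= 3, since
   already ack(3, n) = 2^(n+3) - 3 and ack(4, 2) has 19,729 digits. */
unsigned long ack(unsigned long m, unsigned long n)
{
    if (m == 0) return n + 1;
    if (n == 0) return ack(m - 1, 1);
    return ack(m - 1, ack(m, n - 1));
}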

I still think your implication, that using the term “recursive” for “computable” is less than accurate, is somewhat misleading. The Ackermann function may not be primitive recursive, but it is recursive. Every computable function is, unless you’ve got a revolutionary discovery you’re keeping from the rest of us.

Yes, but you asked him to define ‘simple recursion’ and he presented primitive recursion, which seems to fit the distinction he was implying earlier on.

Can we agree to handshake and make nice at this point? :wink:

Oh, I agree that he defined the term I was asking about. That’s not the point.

This parses as “It’s inaccurate to use ‘recursive’ to mean ‘computable’, but we do anyway because someone made a mistake way back”. If the assertion was that it’s inaccurate to use “primitive recursive” to mean computable, that would be fine. “Recursive” is more general than “primitive recursive”, however.

The basis for using “recursive” to mean “computable” is
(a) the well-established equivalence of the general recursive functions and the lambda calculus with Turing machines (the Turing-computable functions), and
(b) the Church-Turing thesis that these equivalent classes of functions are the only ones that can be computed in any meaningful sense.

Bringing up the mistake people made about primitive recursion can only serve to slam recursion in general, when in fact it’s a particularly elegant mode of computation. ftg may not like it in practice for various reasons, but it runs rings 'round Turing’s approach in a number of theoretical uses, such as proving fixed-point theorems.
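For instance, in the untyped lambda calculus the fixed-point theorem is witnessed by a single explicit term, Curry’s Y combinator:

$$Y \;=\; \lambda f.\,(\lambda x.\, f\,(x\,x))\,(\lambda x.\, f\,(x\,x)), \qquad Y\,f \;=_\beta\; f\,(Y\,f),$$

so every term has a fixed point, with nothing but substitution doing the work.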

ftg’s not talking about recursion as a model of computation. He’s talking about recursion as a programming practice, where it really is best to avoid it wherever possible.

He was at the end of his post, but his claim about “historical basis” was blatantly false. “Recursive = Computable” is not about programming practice. Further, the OP was questioning that a given function was called “recursive” in the computability-theory sense, not in a practical sense.

In some languages, perhaps :wink:

Mathochist: You have completely and totally misread both my posts. What you think I said and what I actually said are two different things. Note in particular your use of the word “implication”. You definitely are making a mistake there.

The mistakes you are “finding” are in your misreading, not in my posts.

I wish I’d known about the other meaning of recursive when I first posted this thread, to avoid this mess. I meant that (it looked like) they said it could only be done recursively.

I’ve used Mathematica and Excel to generate the summation formulas for x up to 16, and I’ve noticed a whole lot of patterns in the coefficients of the completely expanded formulas. (They all come from Pascal’s triangle.)
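As a sanity check on closed forms like these, a throwaway C snippet (an illustration, not the Mathematica/Excel work described above) can verify, say, the standard x = 4 formula against brute-force summation:

#include <stdio.h>

/* Checks sum_{k=1}^{n} k^4 == n(n+1)(2n+1)(3n^2 + 3n - 1)/30 for small n.
   The product is always divisible by 30, so the integer division is exact. */
int main(void)
{
    for (unsigned long long n = 1; n <= 50; n++) {
        unsigned long long direct = 0;
        for (unsigned long long k = 1; k <= n; k++)
            direct += k * k * k * k;
        unsigned long long closed =
            n * (n + 1) * (2 * n + 1) * (3 * n * n + 3 * n - 1) / 30;
        if (direct != closed)
            printf("mismatch at n = %llu\n", n);
    }
    printf("x = 4 formula checked for n = 1..50\n");
    return 0;
}

The Pascal’s-triangle patterns are real, incidentally: expanding k^x in the basis of binomial coefficients C(k, j) reduces the whole sum to the hockey-stick identity sum_{k=j}^{n} C(k, j) = C(n+1, j+1).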

Okay, then would you more clearly explain yourself? You assert that the fact that “in the 1920s, some people thought all computable functions could be defined in terms of simple recursion” is “the historical basis for using ‘recursive’ to mean ‘computable’”, do you not? What is the purpose of this parenthetical other than to explain a (false) distinction between your two given definitions?

In fact, let’s look at those, which were really the core of your post:

Now, definition 1 is rather fuzzy, but I don’t see how it differs from the method of an untyped lambda calculus. If that’s the case, then it is equivalent to a Turing machine, so there’s no difference in your definitions at all.

Really it comes down to this: do you assert that there’s any difference between the classes of recursive functions and (Turing-)computable functions? If so, what exact difference do you see?


int Forever(int n)
{
    return Forever(n);   /* unconditionally calls itself: never halts */
}

Forever is recursive in the sense that it calls itself (Def. 1). It is not recursive in the sense of being computable by a Turing machine/lambda calculus/Post system/mu-recursive system, etc., that halts on all inputs (Def. 2).

The Arithmetical hierarchy itself is defined recursively!

Words can have two meanings, in this case, very close meanings that make sense if you know the history.

Ah, there’s the distinction. My mistake was reading your “functions defined in terms of themselves” the way everyone seems to use it casually, with various unspoken riders about either well-definedness or bottoming-out.

Also, I assumed that you meant both definitions as technical senses applicable to the situation at hand rather than one a technical definition and one a general one.