Probability formula for number of rolls to get a particular result

Sure, that’s one approach, and gives some intuitive feel for the result. But you could also consider the sum:
1 + 2x + 4x^2 + 8x^3 + \cdots = \frac{1}{1-2x}
That only converges for |x|<\frac{1}{2}, but you can use analytic continuation to get around that.
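For a concrete payoff of that continuation (my own aside, and hopefully I have the 2-adic part right): plugging x=1 into the closed form assigns
1 + 2 + 4 + 8 + \cdots = \frac{1}{1-2} = -1
which is also the 2-adic limit of the partial sums 1, 3, 7, 15, \ldots = 2^n - 1, since 2^n \to 0 in the 2-adic metric.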

As for p-adic numbers, I thought this to be a good intro:

I didn’t realize that some of them contained roots of -1 without having a special “i”.

Well, there are definitely various methods of summing divergent series, some of which do have applications in analysis as well as physics, engineering, and so on.

I am no expert in probability, but I think that much of the time it does not really matter whether a series, like a generating function, actually converges, as long as you can work with it as a formal series.

On the other hand, if X is a discrete random variable, its moment generating function \operatorname{E}(e^{sX}) (which is essentially a Laplace transform) will not always converge.
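One concrete case (my own example, just to illustrate the failure): take P(X=n)=\frac{6}{\pi^2 n^2} for n=1,2,3,\ldots. Then
\operatorname{E}(e^{sX}) = \sum_{n=1}^{\infty} \frac{6}{\pi^2 n^2} e^{sn}
is infinite for every s>0, because e^{sn} eventually swamps the \frac{1}{n^2}, so the moment generating function exists only for s \le 0.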

N/2 is a good approximation for large values of 2.

TL;DR
I’ve been thinking about the divergent series
1-2+3-4+\cdots and how it might sum to 1/4. There is a process called Cesaro summation that can be used to assign a value to certain divergent series. If you are not interested, stop here, but I wanted to describe it for any who are.

First consider the series
\frac1{1-x}=1+x+x^2+x^3+\cdots
Letting x=-1, we “get” \frac1 2=1-1+1-1+1-\cdots
which is of course nonsense. The sequence of partial sums is
1,0,1,0,1,\cdots, which obviously does not converge. Cesaro’s idea was to replace the n^{th} partial sum by the average of the first n partial sums. When this is done, you get a new sequence, called C_1 (the reason for the subscript will become clear): 1,1/2,2/3,1/2,3/5,1/2,4/7,\cdots, which evidently converges to 1/2. Coincidence? Probably not, as we will see.
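In case anyone wants to watch that happen numerically, here is a quick throwaway script (it just averages the partial sums as described above, nothing rigorous):

```python
from itertools import accumulate

# Grandi's series 1 - 1 + 1 - 1 + ...
terms = [(-1) ** n for n in range(1000)]
partial_sums = list(accumulate(terms))    # 1, 0, 1, 0, ...

# Cesaro's averaging: replace the n-th partial sum by the mean of the first n of them
c1 = [s / (n + 1) for n, s in enumerate(accumulate(partial_sums))]

print(c1[:7])    # 1.0, 0.5, 0.666..., 0.5, 0.6, 0.5, 0.571...
print(c1[-1])    # exactly 0.5 with an even number of terms
```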

The case at hand is the series (\frac1{1-x})^2=1+2x+3x^2+4x^3+\cdots which, by letting x=-1, gives 1/4=1-2+3-4+5-\cdots. In this case the sequence of partial sums is 1,-1,2,-2,3,-3,\cdots and C_1=1,0,2/3,0,3/5,0,\cdots, which does not converge, so no help? But note that this sequence looks like it is made by splicing a sequence that is constantly 0 with one that converges to 1/2. Using this fact, it is not hard to show that if we repeat the Cesaro process, we get a new sequence C_2=1,1/2,5/9,5/12,\cdots, which converges to 1/4. Looking like less of a coincidence.
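Same throwaway averaging, applied twice, for 1-2+3-4+\cdots (again just a numerical sanity check, not a proof):

```python
from itertools import accumulate

def average(seq):
    """Replace the n-th entry by the mean of the first n entries."""
    return [s / (n + 1) for n, s in enumerate(accumulate(seq))]

terms = [(-1) ** n * (n + 1) for n in range(2000)]   # 1, -2, 3, -4, ...
c1 = average(accumulate(terms))                      # 1, 0, 2/3, 0, 3/5, 0, ... (no limit)
c2 = average(c1)                                     # 1, 1/2, 5/9, 5/12, ...

print(c2[:4])
print(c2[-1])    # about 0.25
```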

Here’s a conjecture (I have not looked at it seriously). It is not hard to show that (\frac1{1-x})^3=1+3x+6x^2+10x^3+\cdots, the n^{th} term being t_nx^{n-1}, where t_n=\frac {n(n+1)}2 is the n^{th} triangular number. Again, letting x=-1 gives 1/8=t_1-t_2+t_3-t_4+\cdots. This time, I conjecture, you have to go to C_3 to get convergence to 1/8.
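For what it's worth, a quick (and entirely unrigorous) numerical poke at that conjecture, using the same repeated averaging:

```python
from itertools import accumulate

def average(seq):
    """Replace the n-th entry by the mean of the first n entries."""
    return [s / (n + 1) for n, s in enumerate(accumulate(seq))]

# t_1 - t_2 + t_3 - t_4 + ... with t_n = n(n+1)/2, i.e. 1 - 3 + 6 - 10 + ...
terms = [(-1) ** n * (n + 1) * (n + 2) // 2 for n in range(4000)]

seq = list(accumulate(terms))    # ordinary partial sums
for k in (1, 2, 3):
    seq = average(seq)
    print(k, seq[-1])            # C_1 blows up, C_2 keeps oscillating, C_3 lands near 1/8
```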

Just one further comment. Benford’s law states that in any collection of numbers chosen at random and unlimited in the number of digits, the fraction (base ten) that start with a 1 is log 2, the fraction that start with a 2 is log 3 - log 2,…, and the fraction that start with a 9 is log 10 - log 9. Cesaro summation gives an explanation of sorts.

Consider, for example, the sequence a_1,a_2,a_3,\cdots, in which a_n is the fraction of the numbers between 1 and n that start with 3. Then a_1=a_2=0, a_3=1/3,…,a_{29}=1/29, a_{30}=1/15,…,a_{39}=11/39, and so on. You get a jagged sawtooth graph that certainly doesn’t converge to anything. Neither does C_1, nor C_2, nor any C_k. But mirabile dictu, if you let k\to\infty, you get a sequence, call it C_\infty, that converges to log 4 - log 3.
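Here is a small script in case anyone wants to see how jagged a_n really is (it only computes a_n and one round of averaging; the C_\infty claim itself is more than a quick script can check):

```python
from itertools import accumulate

def leading_three_fractions(limit):
    """a_n = fraction of the integers 1..n whose decimal expansion starts with 3."""
    fractions, hits = [], 0
    for n in range(1, limit + 1):
        hits += str(n).startswith("3")
        fractions.append(hits / n)
    return fractions

a = leading_three_fractions(100_000)
c1 = [s / (n + 1) for n, s in enumerate(accumulate(a))]

print(min(a[1000:]), max(a[1000:]))   # even past n = 1000 it swings between roughly 0.04 and 0.28
print(c1[-1])                         # C_1 is smoother but still drifts decade by decade
```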

Well, the faster the terms and partial sums of your divergent series grow, the higher k will have to be in order to possibly make it converge. But if C_k converges for some k, then it can only be to \lim_{x\to1^-}\sum_na_nx^n, where a_n are the original terms.
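To tie that to the example upthread: for the terms a_n=(-1)^n(n+1), i.e. 1-2+3-4+\cdots, that limit is
\lim_{x\to1^-}\sum_{n=0}^{\infty}(-1)^n(n+1)x^n=\lim_{x\to1^-}\frac{1}{(1+x)^2}=\frac14
which matches the C_2 value found above.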

I wondered about that. Can you give a cite?

Incidentally, I should have mentioned that Cesaro summation leaves a convergent series convergent and to the same value.

Cesaro means are described in section 5.4 of Hardy’s book on Divergent Series. A simple limitation theorem is Theorem 46, which states that if \sum a_n is C_k-summable, then the Cesaro partial sums satisfy A_n^{k'}=o(n^k) for every order k'<k, and in particular a_n=o(n^k).

Theorem 55 proves that if a series is C_k summable for some k, then it is Abel summable to the same limit. (Hardy then gives an example of an Abel-summable series for which a_n is not O(n^k) for any k.)

Further limitations of some of these methods as well as necessary and sufficient conditions for summability are described in the following chapters; for instance if the terms of a series grow too slowly then it cannot be Cesaro summable or Abel summable without being convergent in the ordinary sense.

Is it, though? I see no reason to believe that an infinite sum of integers should give an integer result. Any finite sum, sure, but this isn’t finite.

\lim_{x \to -1^+} \sum_{k=0}^{\infty} x^k is obviously \frac{1}{2}. Why shouldn’t we just say that the value of the sum is equal to the limit? We do that everywhere else in math. It doesn’t seem any more unusual than assigning a value to \frac{0}{0}.
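Spelled out (approaching -1 from inside the interval of convergence):
\lim_{x\to-1^+}\sum_{k=0}^{\infty}x^k=\lim_{x\to-1^+}\frac{1}{1-x}=\frac{1}{2}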

We seem to be simul-posting :slight_smile: That is called Abel summation by Hardy. The reason it is not “the” sum is that there are lots of different summation methods for divergent series; however both Abel and Cesaro sums are “regular” in the sense that they leave convergent series convergent to the same value, and even “totally regular” in the additional sense that if the ordinary partial sums diverge to infinity then so do the Cesaro/Abel sums. So if you want to sum 1+2+3+\cdots you will have to employ some other summation method (and give up some nice property, e.g. linearity).

Yeah, easy to see that you have to give up some nice property:
S = 1 + 2 + 3 + \dots
S = 0 + 1 + 2 + 3 + \dots
S-S = 1 + 1 + 1 + \dots
0 = 1 + 1 + 1 + \dots
1 + 0 = 1 + (1 + 1 + 1 + \dots)
1 = 0

Oops.