How do I plot the curve of a random number distribution?

…without having to resort to heavy duty simulation software.

I am currently designing a game (aren’t I always?) and wish to see the curve of random numbers determined by 1 to 10 + 1, then 2 to 20 + 2, all the way till 10 to 100 + 20.

I can program in C# - I know how to generate a large enough sample group and write them as a delimited text file. But how do I plot the graph? I am thinking that Excel may be able to handle this; no idea how to do it though.

Any ideas would be welcome!
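For what it’s worth, the generate-and-export step might look something like the following C# sketch (the sample size, the 3d10 + 6 configuration, and the rolls.csv file name are placeholder choices, not anything fixed by the question). Excel can open the resulting file directly and chart the column.


using System;
using System.IO;

class RollSampler
{
    static void Main()
    {
        var rng = new Random();
        const int samples = 100000;                // a "large enough sample group"
        using (var writer = new StreamWriter("rolls.csv"))
        {
            for (int i = 0; i < samples; i++)
            {
                int total = 6;                     // the flat +6 modifier
                for (int die = 0; die < 3; die++)  // roll 3d10
                    total += rng.Next(1, 11);      // Next(1, 11) returns 1..10
                writer.WriteLine(total);
            }
        }
    }
}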

In MATLAB, once you build the matrices, for simplicity let’s say:

x = [2 3 4]
y = [5 6 7]

instead of running the command

plot (x,y)

to plot y with respect to x, you can simply type

plot (y)

and it will plot y versus its index. In other words, the first command would plot the points (2,5), (3,6), and (4,7), while the second would plot (1,5), (2,6), and (3,7).

Hope that helps; I’ve never programmed in C.

ETA: Excel will handle it too; you simply select the data you’d like to plot, and don’t select anything to plot it against. It will then just plot the data against its index.

I’m confused by your nomenclature here… Are you just “rolling” some number of dice and looking for a histogram of the results? If so, it’s probably easiest to write a program to tally up the results appropriately and then just plot it in Excel. The brute-force way to do it would be the following: If you ran your program and got the results

1,3,5,3,4,2,5,6,7,2,6,7,2,8,3,6,3,7,2,9,2,2

then you’d just run a loop that cycled you through the entire list, incrementing the elements of another array appropriately. In pseudo-code,


' histogram starts out as an array of zeros, indexed by die result
For x = 1 To Length(results)
  histogram(results(x)) = histogram(results(x)) + 1
Next x

This would result in the list “histogram” containing the values

1,6,4,1,2,3,3,1,1

which are, respectively, the number of 1’s in “results”, the number of 2’s in “results”, etc.
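In C#, that tally is only a few lines; here’s a minimal runnable sketch using the example results above:


using System;

class Tally
{
    static void Main()
    {
        int[] results = { 1, 3, 5, 3, 4, 2, 5, 6, 7, 2, 6, 7, 2, 8, 3, 6, 3, 7, 2, 9, 2, 2 };
        int[] histogram = new int[10];   // indices 1..9 used; C# zero-fills the array

        foreach (int roll in results)
            histogram[roll]++;           // one increment per occurrence

        for (int value = 1; value <= 9; value++)
            Console.WriteLine(value + ": " + histogram[value]);
    }
}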

For the record, Excel does have some histogram-generating abilities of its own, but they’re kludgy and not all that straightforward. If you’re already using a programming language, better to generate the histogram data directly.

I think you might want to look at the open-source (and thus free) gnuplot. It’s used a lot in academia.

If you’re using such small distributions, I wouldn’t bother with sampling the distribution (that is, simulating a large number of rolls). Instead, loop through all the possibilities and keep a running total of how often each result occurs. In pseudocode, something like this, for 3d10+3:


int hist[1000]   // assume every entry starts at zero
for die1 = 1 to 10
  for die2 = 1 to 10
    for die3 = 1 to 10
      hist[die1+die2+die3+3] ++
    end
  end
end

This computes the exact distribution. Add more loops for more dice - though note that ten nested d10 loops means 10[sup]10[/sup] combinations, which will keep even a modern computer busy for a while.
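If the nested loops get unwieldy, the same exact histogram can be built by folding in one die at a time (a convolution), which handles ten dice near-instantly; here’s a C# sketch along those lines (the method and parameter names are my own):


using System;

class ExactDistribution
{
    // counts[t] = number of ways the dice alone can sum to t
    static long[] Distribution(int dice, int sides)
    {
        long[] counts = { 1 };                           // zero dice: one way to total 0
        for (int d = 0; d < dice; d++)
        {
            long[] next = new long[counts.Length + sides];
            for (int total = 0; total < counts.Length; total++)
                for (int face = 1; face <= sides; face++)
                    next[total + face] += counts[total]; // add one more die
            counts = next;
        }
        return counts;
    }

    static void Main()
    {
        int dice = 10, modifier = 20;               // 10d10+20
        long[] counts = Distribution(dice, 10);     // 10^10 outcomes, fits in a long
        for (int t = dice; t < counts.Length; t++)  // totals run from dice*1 to dice*10
            Console.WriteLine((t + modifier) + "\t" + counts[t]);
    }
}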

To graph very simple histograms, have a loop print a row of asterisks or octothorpes for each bin before emitting a CR/LF. Output this to a text file, to a multiline text box, or to the command prompt. You don’t need to go outside C# to do this, and it sounds like you can already do all the hard parts inside C#. You sure don’t need to add heavy-duty simulation software to create the visual structure of a histogram!
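A minimal C# sketch of that trick, using the small example histogram from earlier in the thread as stand-in data (for large frequencies you’d want to divide by a scale factor first):


using System;

class TextHistogram
{
    static void Main()
    {
        // counts[i] = frequency of result i; the example tally from above
        int[] counts = { 0, 1, 6, 4, 1, 2, 3, 3, 1, 1 };

        for (int result = 1; result < counts.Length; result++)
            Console.WriteLine("{0,3}: {1}", result, new string('#', counts[result]));
    }
}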

Some of you caught on to what I am trying to do…

I am actually interested in finding out the distributions of 1d10 + 2 (1d10 being one ten-sided die), 2d10 + 4, and finally all the way to 10d10 + 20, so the possible results range from 3 up to 120.

Yep, I probably will be doing all the hard work in C# and just exporting the 120 numbers (frequencies) - however, the trick is that I need to do it once for 1d10 + 2, once for 2d10 + 4, and so on all the way to 10d10 + 20…

Though I would probably just try 1d10 + 2, 3d10 + 6, 5d10 + 10, 7d10 + 14 and 10d10 + 20…

The reason for Excel is, well, so that I can gauge what values count as low, average, and high when setting difficulty ratings for a typical RPG skill check.

ETA: It’s a computer-based RPG, so don’t worry about the number crunching :smiley:
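One way to turn those frequency tables into low/average/high cutoffs is to read percentiles off the cumulative distribution. Here’s a hedged C# sketch using 2d10 + 4 as the example and arbitrary 25/50/75 cutoffs (nothing in the thread fixes those choices):


using System;

class DifficultyThresholds
{
    // Smallest total whose cumulative probability reaches pct percent
    static int Percentile(int[] counts, double totalRolls, double pct)
    {
        double cumulative = 0;
        for (int t = 0; t < counts.Length; t++)
        {
            cumulative += counts[t];
            if (100 * cumulative / totalRolls >= pct)
                return t;
        }
        return counts.Length - 1;
    }

    static void Main()
    {
        // Exact counts for 2d10+4 (totals 6..24), enumerated directly
        int[] counts = new int[25];
        for (int d1 = 1; d1 <= 10; d1++)
            for (int d2 = 1; d2 <= 10; d2++)
                counts[d1 + d2 + 4]++;

        foreach (double pct in new[] { 25.0, 50.0, 75.0 })
            Console.WriteLine(pct + "th percentile: " + Percentile(counts, 100, pct));
    }
}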

By the time you get up to three or four dice, you’ve basically got a Gaussian distribution. The mean of a single n-sided die is (n + 1) / 2, and the means of the dice (plus any constant modifier) add to give the mean of the total. The variance of a single n-sided die is (n[sup]2[/sup] - 1) / 12, and when you roll multiple dice and add (or subtract) them, you add the variances to find the variance of the total. The standard deviation of the total is just the square root of the variance. Adding a constant doesn’t change the standard deviation.

So, for instance, 10d10+20 is a Gaussian with a mean of 75 and a standard deviation of 9.083. Now you just need to plot a Gaussian, which should be easy in any plotting program.
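To make the arithmetic concrete: a d10 has mean 5.5 and variance (100 - 1) / 12 = 8.25, so 10d10 has mean 55 and variance 82.5, and the square root of 82.5 is about 9.083. A short C# sketch that computes the parameters and samples the normal density at each possible total, ready to paste into Excel:


using System;

class GaussianApproximation
{
    static void Main()
    {
        int dice = 10, sides = 10, modifier = 20;            // 10d10+20

        double mean = dice * (sides + 1) / 2.0 + modifier;   // 55 + 20 = 75
        double variance = dice * (sides * sides - 1) / 12.0; // 82.5
        double sd = Math.Sqrt(variance);                     // ~9.083

        Console.WriteLine("mean = " + mean + ", sd = " + sd.ToString("F3"));

        // Normal density at each possible total, for plotting the curve
        for (int t = dice + modifier; t <= dice * sides + modifier; t++)
        {
            double z = (t - mean) / sd;
            double density = Math.Exp(-0.5 * z * z) / (sd * Math.Sqrt(2 * Math.PI));
            Console.WriteLine(t + "\t" + density.ToString("F5"));
        }
    }
}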