Plea for mathematical help

Started by Black Iris Dancer, January 20, 2005, 02:10:21 AM


Black Iris Dancer

Okay, so, some friends and I have been playing with dice mechanics for a system we're developing, and we've stumbled onto a mathematical problem that is, honestly, beyond us.

Here's the problem in English:

In our game, players roll handfuls of d10s against a target number, and each die that meets or beats it counts as a success. Say the target number is 8 and 10s are re-rolled (pleasant probabilistic result: each die has an expected value of 1/3 successes, so 3 dice expect 1 success). For the most part, only n, the number of dice rolled, varies. Good so far? We have the probabilities for all basic dice rolling, so that isn't an issue.
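(The 1/3-per-die figure follows from a one-line recursion: each roll succeeds on 8, 9, or 10, and a 10 grants another roll, so E = 3/10 + (1/10)*E. A quick sketch in Python, assuming the re-rolled 10 itself counted as a success:)

```python
from fractions import Fraction

# One exploding d10 against target number 8: a face of 8, 9, or 10
# is a success (prob 3/10), and a 10 also grants another roll
# (prob 1/10).  So E = 3/10 + (1/10) * E.
p_success = Fraction(3, 10)
p_reroll = Fraction(1, 10)
expected_successes = p_success / (1 - p_reroll)
print(expected_successes)  # 1/3
```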

Here's where things get interesting: for some kinds of actions, characters may roll multiple times and choose the best roll (canonical example: spending two months in magical ritual). Now, obviously, this changes the probability distribution. The question is, how much, and in what way?

In math, then, here's the problem:

We have a set of k random variables X1, X2, ..., Xk. Each Xi represents a dice roll. Your pool doesn't change, so they have the same probability distribution P. (I'd like to solve for an arbitrary P, but for the sake of this particular problem, we could take P to somewhat resemble a normal-ish curve with a right tail out to infinity.)

We want to find the random variable Y, where

Y = Max(X1, ..., Xk)

Specifically, I'm interested in ||Y|| (the expected value of Y), and somewhat interested in its variance. Also, I'm very curious as to how ||Y|| varies with k. If I were a betting girl, I'd bet that it's sub-linear (positive first derivative, negative second derivative), since our P falls off pretty sharply. Put another way, if you roll three times, you're expected to get a slightly better roll than you otherwise would have. If you roll a million times more, however, you're not expected to be a million times better, or anything even remotely like it.

I can write something to solve this with brute force (go through and figure out all possible sets of rolls that lead to the best one having one success, two successes, etc... and generate the probabilities of each), but I'm not terribly happy with the idea. I'd like something analytic, at least in terms of P. Also, I'd ideally like to find a discrete PDF model for P. I know there are various stats and probability packages out there that could help, but I don't really know which one to toy with; even a nudge in the right direction would be much appreciated.
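(For what it's worth, the brute-force/Monte Carlo route is only a few lines. A sketch in Python, assuming the exploding-d10-vs-8 reading above; the pool size of six dice and the trial count are just illustrative:)

```python
import random

def roll_die(rng, tn=8):
    """Successes from one d10 against tn, re-rolling 10s."""
    successes = 0
    while True:
        face = rng.randint(1, 10)
        if face >= tn:
            successes += 1
        if face != 10:
            return successes

def pool_roll(rng, n_dice, tn=8):
    """Total successes from one roll of an n_dice pool."""
    return sum(roll_die(rng, tn) for _ in range(n_dice))

def best_of(rng, n_dice, k, tn=8):
    """Best total over k independent rolls of the pool."""
    return max(pool_roll(rng, n_dice, tn) for _ in range(k))

rng = random.Random(0)
trials = 20000
averages = {}
for k in (1, 2, 3, 5, 10):
    averages[k] = sum(best_of(rng, 6, k) for _ in range(trials)) / trials
    print(k, round(averages[k], 3))
```

(With six dice, k = 1 should hover around 6 * 1/3 = 2 successes, and the averages should climb ever more slowly as k grows — the flattening you're betting on.)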

(Also, you might be wondering at this point: why does she care? Curiosity, mostly. I suppose it would be nice to know how many times, at most, we should have players roll their dice pool [that is, before the change in the PDF becomes irrelevant], but we can pretty much cut that off somewhere arbitrary, or even not cut it off at all—hey, if you spend two years of your life in ritual of some kind, then you can roll a bunch of times, who cares? Also, I'm almost totally sure ||Y|| with respect to k flattens really sharply, so I'm not too worried about having ridiculous quantities of success. Really, I'm just curious as to how this works out.)

eef

My first answer is to actually go do the gnarly crunching.  You'll get a full distribution, you can watch the distribution change with P, all kinds of good things.  Especially since your system doesn't quite have linear dice.


More analytically:

To get Y=y you need no Xi greater than y, and at least one Xi = y; that is, all Xi =< y but not all Xi < y. Since the Xi are independent, "all k of them" is a kth power:
Prob(Y=y) = Prob(All Xi =< y) - Prob(All Xi < y)
          = Prob(X1 =< y)^k - Prob(X1 < y)^k

Sanity check on the formula:  if k=1 we get
Prob(Y=y) = Prob(X1 =< y) - Prob(X1 < y) = Prob(X1 = y),
which checks out.

Now plug in the distribution for X and Bob's your uncle. :-)
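(Plugging in, a sketch of that formula in Python: Prob(Y=y) = F(y)^k - F(y-1)^k, with F the cdf of one pool roll. The single-die pmf here — 0.7 chance of zero successes, 0.27 * 0.1^(s-1) for s >= 1 — is my own derivation for the exploding-d10-vs-8 die, so treat it as an assumption to check:)

```python
def convolve(p, q):
    """pmf of the sum of two independent integer-valued variables."""
    out = {}
    for a, pa in p.items():
        for b, qb in q.items():
            out[a + b] = out.get(a + b, 0.0) + pa * qb
    return out

def pool_pmf(n, die):
    """pmf of total successes from n independent dice."""
    pmf = {0: 1.0}
    for _ in range(n):
        pmf = convolve(pmf, die)
    return pmf

def best_of_pmf(pmf_x, k):
    """Prob(Y=y) = F(y)^k - F(y-1)^k, F the cdf of one roll."""
    out = {}
    cdf = 0.0
    for y in sorted(pmf_x):
        prev = cdf
        cdf += pmf_x[y]
        out[y] = cdf ** k - prev ** k
    return out

# Exploding d10 vs. 8 (truncated at 24 successes; the tail is ~1e-25):
die = {0: 0.7}
for s in range(1, 25):
    die[s] = 0.27 * 0.1 ** (s - 1)

pool = pool_pmf(6, die)
for k in (1, 2, 3, 5, 10):
    ey = sum(y * p for y, p in best_of_pmf(pool, k).items())
    print(k, round(ey, 3))
```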

Like I said, Monte Carlo sims may be the way to go after all.

shaheddy

It's easiest with the cumulative distribution functions. Let C_X(n) be the probability that X<n. Then

C_Y(n) = C_X1(n) C_X2(n) ... C_Xk(n) = C_X(n)^k

since, for Y to be less than n, all of the Xi must be less than n — and your rolls are identically distributed, so the product collapses to a kth power. Then you take first differences on both sides, or derivatives. In your case, I believe the C_Xi aren't too difficult to find.

Another idea you might want to consider is approximating using Lp-norms. That is, if

Y_p=(X1^p+X2^p+...+Xk^p)^(1/p)

then as p goes to infinity, Y_p should converge to Y. You will have to assume that the Xi are nonnegative, which if you are rolling dice, should not be a problem! I don't know enough probability to make this idea more explicit, unfortunately, and it might not be worth the extra complication.
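(The convergence is easy to see numerically: max(x) =< Y_p =< max(x) * k^(1/p), and k^(1/p) goes to 1. A tiny sketch with a made-up triple of nonnegative roll totals:)

```python
# Y_p = (x1^p + ... + xk^p)^(1/p) squeezes down onto max(x) as p grows.
xs = [2.0, 3.0, 5.0]  # hypothetical roll totals
norms = {}
for p in (1, 2, 4, 8, 16, 32, 64):
    norms[p] = sum(x ** p for x in xs) ** (1.0 / p)
    print(p, round(norms[p], 4))
```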

shaheddy

I just realized that your exact system is basically an exponential distribution, ie the pdf is e^(-x), so that the first method can actually be worked out explicitly. You just have to integrate by parts to get the expected value, which turns out to be the kth partial sum of the harmonic series, 1 + 1/2 + ... + 1/k - so the expected value goes to infinity as k does.

Well, it grows like ln(k), so it is sublinear at least ...

You should definitely check my work, but if you want to see all my computations, let me know.
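(One way to check the claim without redoing the integration by parts: E[Y] = integral from 0 to infinity of 1 - F(x)^k dx, with F(x) = 1 - e^(-x), which a crude midpoint rule can compare against the harmonic number H_k. A sketch:)

```python
import math

def harmonic(k):
    """H_k = 1 + 1/2 + ... + 1/k."""
    return sum(1.0 / i for i in range(1, k + 1))

def expected_max_exponential(k, hi=60.0, steps=200000):
    """Midpoint-rule estimate of E[max of k Exp(1) draws],
    via E[Y] = integral of (1 - (1 - e^(-x))^k) dx."""
    dx = hi / steps
    return sum((1.0 - (1.0 - math.exp(-(i + 0.5) * dx)) ** k) * dx
               for i in range(steps))

for k in (1, 2, 5, 10):
    print(k, round(expected_max_exponential(k), 4), round(harmonic(k), 4))
```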