Algorithms, complexity and what any of this has to do with R

Started by Don Lag, June 13, 2001, 04:30:00 AM

james_west

Being intellectually lazy when it comes to these issues, I've always approached the probability distribution of a game (when I cared that much) by just having the computer do a million rolls and give me the result distribution. You can write that code in about five minutes, it'll run in a minute or so, and it doesn't break your head, no matter how byzantine the rolling procedure (and some games have -darned- complex rolling procedures). It's also gonna be pretty darn close to the 'real' probability distribution.
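
A minimal sketch of that brute-force approach (Python here purely for illustration, with a hypothetical "sum of 3d6" standing in for whatever byzantine procedure a given game actually uses):

Code
import random
from collections import Counter

def roll_once():
    # Stand-in for the game's actual rolling procedure;
    # here, a hypothetical "sum of 3d6".
    return sum(random.randint(1, 6) for _ in range(3))

def simulate(trials=1_000_000):
    # Tally how often each result comes up over many trials.
    counts = Counter(roll_once() for _ in range(trials))
    return {result: counts[result] / trials for result in sorted(counts)}

if __name__ == "__main__":
    for result, probability in simulate().items():
        print(f"{result}: {probability:.4f}")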

Gordon C. Landis

Quote
There's a math technique that's pretty useful called Probabilities (I'm not being sarcastic, it's just obvious you aren't too familiarized with it).

No offense taken - applying probability theory IS a better way to go, most of the time.  I am familiar with it, but not especially sophisticated in its use - and the particular problem the huge SQL tables were put together to solve does not fit easily into any probability equation I'm familiar with.

There's also, as someone else here mentioned, the Monte Carlo route - roll a million times and track what happens.  That's often a "good enough" answer, and how (if I remember right) Jared got what he needed on the doubles-triples-etc. question.
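
For the doubles-triples-etc. sort of question, the same Monte Carlo idea might look like this (a sketch only, assuming a hypothetical pool of five d10s; the real pool size and die type would depend on the game):

Code
import random
from collections import Counter

def largest_match(pool_size=5, sides=10):
    # Roll a hypothetical pool of d10s and return the size of the largest
    # set of matching faces (1 = no match, 2 = a double, 3 = a triple, ...).
    rolls = [random.randint(1, sides) for _ in range(pool_size)]
    return max(Counter(rolls).values())

def match_distribution(trials=1_000_000):
    # Estimate how often each "largest match" size shows up.
    counts = Counter(largest_match() for _ in range(trials))
    return {size: counts[size] / trials for size in sorted(counts)}

if __name__ == "__main__":
    for size, probability in match_distribution().items():
        print(f"largest match of {size}: {probability:.4f}")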

I went with the big SQL tables because 1) I could, 2) It actually fit as part of a work project, 3) It would become a resource I could then continue to ask other questions of (OK, now let's count triples and quads as "the same" level of success - what's that look like?), and 4) It would produce a complete, absolute answer.
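
The complete-answer route doesn't have to be SQL; as a rough illustration of the same exhaustive-enumeration idea (not the actual tables described above), a small pool can simply be enumerated outright - again assuming a hypothetical five-d10 pool - and the resulting table re-queried for questions like the triples-and-quads one:

Code
from collections import Counter
from itertools import product

def exact_match_distribution(pool_size=5, sides=10):
    # Enumerate every possible outcome of the pool (sides ** pool_size of them,
    # the same complete table the SQL approach builds) and record the size of
    # the largest matching set in each outcome.
    counts = Counter(
        max(Counter(outcome).values())
        for outcome in product(range(1, sides + 1), repeat=pool_size)
    )
    total = sides ** pool_size
    return {size: counts[size] / total for size in sorted(counts)}

if __name__ == "__main__":
    dist = exact_match_distribution()
    # Re-ask a question of the same data: count triples and anything
    # bigger as "the same" level of success.
    print(dist)
    print(f"triple or better: {sum(p for s, p in dist.items() if s >= 3):.4f}")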

Gordon C. Landis

www.snap-game.com (under construction)