Algorithms, complexity and what any of this has to do with R

Started by Don Lag, June 12, 2001, 11:30:00 PM

Don Lag

First off, I must admit to being a computer engineering & physics major in progress. So all this is predictably spawned from way too much contact with these disciplines.

Many decisions about the system I'm currently working on have to do with minimizing the complexity of determining results. For example, I despise AD&D's add-a-few-hundred-modifiers approach to just about everything (although D&D 3rd ed has addressed this issue by reducing the number of modifiers).

In trying to achieve maximum simplicity (and therefore playing speed), I have been tempted to apply algorithm analysis techniques, such as considering that ordering N dice would be an O(N log N) task on average.

I'm aware the example isn't very bright, but I was just wondering if anyone has seriously attempted to do this kind of analysis.

[ This Message was edited by: Don Lag on 2001-06-12 23:31 ]
Sebastian Acuña

greyorm

Quote
In trying to achieve maximum simplicity (and therefore playing speed), I have been tempted to apply algorithm analysis techniques, such as considering that ordering N dice would be an O(N log N) task on average.

I'm aware the example isn't very bright, but I was just wondering if anyone has seriously attempted to do this kind of analysis.
I'm utterly lost...come again?
(I know nothing about algorithm analysis techniques or anything mathematically similar...so that gives you my response: I haven't! [grin])
Rev. Ravenscrye Grey Daegmorgan
Wild Hunt Studio

Don Lag

heh

umm... I'm not sure how well my explaining abilities measure up, but....

In designing algorithms (basically a structured set of steps to take in order to solve any of a family of problems, for example: sorting a list), one analyses (is that the word?) the time it takes to execute said algorithm.

This "time" is measured in steps, depending on the size of the input. For example, to seek a value in a list (i.e. to find the number '5' in th list { 2, 8, 5, 10, 1 } ) will take an average of N/2 steps. If the list is ordered from least to greatest a better algorithm (Binary search) can be applied. This takes an average of logarithm base-2 of N steps to complete, less than N/2.

My idea comes from the observation that a game that requires adding N dice to resolve a combat result is more complex than one that requires searching for the largest die. This is obvious; what I really wonder is whether, in general, it is of any use to put common algorithm analysis to work on slimming down RPG rules.
Sebastian Acuña

james_west

The phenomenon is one we've addressed here before, although we've called the issue search time (finding relevant rules) and handling time (applying them after you've found them).

It is a very relevant issue, and if you can come up with something clever based on the algorithms you're familiar with, I'm sure it would be well received.

        - James

Ron Edwards

Hi there,

Bit of background: I co-opted the terms search time and handling time from ecology, in which they refer to food-foraging strategies. Both are discussed at the end of the "System Does Matter" essay, which I am convinced is usually read only halfway-through.

To clarify a bit: search time refers to any and all attention spent right up to the moment of the resolution mechanic being employed (e.g. dice hitting the table, for most RPGs); handling time refers to all such attention spent from that moment on until that action is considered "done." Damage rolls, for instance, would be included as part of handling time.

Reducing search time has received a lot of attention in commercial RPG design, but reducing handling time is still in development. I modestly claim that Sorcerer and Elfs do a nice job in this department, as do The Dying Earth, Story Engine, and Ghost Light. Zero, Orkworld, Unknown Armies, Swashbuckler, and Hero Wars come in a very close second, I think. One of the few flaws of Extreme Vengeance is the grunt-work of its handling time.

Not surprisingly, Drama-mechanic games like The Window, some options in Everway, and Puppetland have very low to absent search and handling times, and I think this is generally true as long as there are pretty strict rules about how to employ the mechanic. The bid-based Drama-mechanic games like Pantheon and Soap add some handling time, but as we know, well within acceptable limits.

A related issue has to do with one of the fundamental mechanics differences between Simulationist and Narrativist design; the former tends to go with step-by-step, chronological resolution, reducing non-constructive negotiation as much as possible; and the latter tends to go with Fortune-in-the-middle (when Fortune is employed), retroactive descriptions, and encouraging constructive negotiation as much as possible.

Just how one goes about reducing search and handling time within either of these approaches is going to differ a lot between them. In The Window and Fudge, for instance, I think that the Simulationist mechanic is stripped down about as far as it can go. [Side note: I also think the philosophy of resolution in The Window, no matter how light it is, is contradictory to its Three Precepts, which are Narrativist.] Whereas Hero Wars, for instance, has a lot more point-keeping and dice rolls than The Window, but the mechanics are focused precisely on Narrativist concerns, especially insofar as everyone at the table is engaged, and thus do not generate a "wait" feeling during play.

Best,
Ron

Don Lag

Funny, I wasn't referring to that aspect of complexity (which seems important now that I think of it), but rather the complexity that arises from the actual rules once applied.

For example, an impossibly complicated game could state that one must roll 3 dice, and the value of the roll is the highest common divisor amongst them. On the other hand, a system that asks for a roll with a single die, where whatever pops up is the value, is much simpler.

Adding modifiers would raise the complexity too, as would any algebraic operation between the dice (adding, subtracting, multiplying...). Searching for the maximum amongst a number of dice seems to be a simple enough operation (even for 10+ dice), but comparing two different rolls of many dice is a pretty heavy operation.
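Just to pin that down with something runnable, here's a little Python sketch of the two extremes (the six-sided dice and the mechanics are only stand-ins for the example above):

Code:
import math
import random

def gcd_of_three_dice():
    # the "impossibly complicated" mechanic: roll 3 dice, the result is
    # their highest common divisor
    a, b, c = (random.randint(1, 6) for _ in range(3))
    return math.gcd(math.gcd(a, b), c)

def single_die():
    # the simple mechanic: roll one die, whatever pops up is the value
    return random.randint(1, 6)

print("gcd mechanic:", gcd_of_three_dice())
print("single die:", single_die())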

This all seems rather subjective and qualitative, and I feel motivated to apply computation techniques in analysing it deeper (even if it's just out of curiosity).

If I *DO* come around to actually analysing this stuff, first I'll let you all know, and I'll try to see how the Searching & Handling idea fits in.
Sebastian Acuña

Zak Arntson

(Confessing to also being a programmer)

I'd love to see what you come up with.  Complexity & algorithms always interest me (more so with JavaScript, since I have to balance the interpreted language AND the code that interprets it) ...

Anyhow, I think that the algorithms should apply to the search & handle time.  It makes sense that a die roll and a comparison is easier than a die roll +/- any modifiers, look up on chart, etc. etc.

Also remember that humans are doing the computations.  So comparison will be fastest, addition comes next.  Then probably subtraction & doubling.  Division seems to me to be the end of the computing spectrum (unless you have players doing square roots or logs or something!)

You'll also have to figure in any reference lookups.  Takes up lots of time if you have to reference charts.

And special rules.  I just got done playing D&D and boy do we consult the rulebook a lot just for things like subdual damage.
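If you wanted to start quantifying all that, one way (just a sketch, with numbers I made up purely to encode the ordering above) would be to assign each mental operation a rough cost and total them per mechanic:

Code:
# Made-up cost weights: comparison cheapest, then addition, subtraction,
# division, and chart lookups the most expensive. None of these are measured.
COST = {
    "compare": 1,
    "add": 2,
    "subtract": 3,
    "divide": 6,
    "chart_lookup": 10,
}

def mechanic_cost(operations):
    # total the assumed cost of everything a mechanic asks the player to do
    return sum(COST[op] for op in operations)

# roll, add two modifiers, compare to a target number:
print(mechanic_cost(["add", "add", "compare"]))   # 5
# roll, then look the result up on a chart:
print(mechanic_cost(["chart_lookup"]))            # 10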

Anyhow, you've got at least one person excited about complexity ...

Don Lag

yay! Another computer freak.

I actually had a test today for my Algorithms Design & Analysis course, though I swear I had this idea a long time ago and it isn't some weird falling in love with the course thingie :smile:

Your comment on humans doing the computations brings up another idea I've had for some time now... but it's a really dry mathematical discussion (or at least I haven't found the proper way to explain it in commonspeak).

Anyway, I haven't done any real study on the subject, but for starters I think a good starting point would be to consider:

Adding N dice: O(N) //obvious

Searching for highest die amongst N dice:
 Humans are far better at analyzing a group of elements and determining global properties than computers are. So while a computer would take O(N) to find the maximum die, a human (I think) would take so little time that it would be fair enough to consider it O(1) for most reasonable values of N (up to 10, for example).
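A tiny sketch of what I mean, in Python (the cut-off of 10 dice and the step counts are guesses, not data):

Code:
def cost_of_summing(n_dice):
    # adding N dice takes one addition per extra die, so it grows with N
    return n_dice - 1

def cost_of_picking_highest(n_dice, human=True):
    if human and n_dice <= 10:
        return 1           # treat "eyeball the biggest die" as a single glance
    return n_dice - 1      # a computer (or a huge pool) still scans every die

for n in (3, 6, 10):
    print(n, cost_of_summing(n), cost_of_picking_highest(n))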

I'm stopping myself right away here. I'm starting to see that there are perhaps some considerations missing in formulating this problem.

I'll give it a thought then post later.
Sebastian Acuña

Zak Arntson

Wow.  This is cool.  We get to assign complexities based on human behavior.  You know, my workmate has got this great book on applying psychology to user interfaces.  Things like how many simultaneous tasks a human can reasonably perform, or how accurately someone can remember multiple pieces of information (I think if you go beyond 3 it's bad).

And since people are pattern-matching fiends, and not so good at quick bit-flipping ... the complexities are WAY different than a computer.

So yeah, I would say that:

Picking Highest/Lowest Die among X dice : O(n)
Comparing Roll to Number : O(n)
Comparing Roll to Number & Determining Higher/Lower : O(n)

For more complicated things, I think some research would have to be done.

And reducing complexity wouldn't work the same.  Say adding numbers is O(x * n); well, that doesn't reduce to O(n), since a human can't add 50 numbers as fast as 1.  So maybe it needs to be O(n^(x-1)) or something?
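Something like this, maybe (a sketch with invented numbers, just to show why the constant shouldn't be thrown away for human-scale dice pools):

Code:
def asymptotic_class(cost_per_die):
    # the usual computer-science move: 5n and 20n are both just "O(n)"
    return "O(n)"

def human_effort(n_dice, cost_per_die):
    # but a player really does feel 20 units per die vs 5 units per die
    return n_dice * cost_per_die

print(asymptotic_class(5), asymptotic_class(20))   # O(n) O(n)
print(human_effort(5, 5), human_effort(5, 20))     # 25 vs 100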

Just some thoughts.

jburneko

I just wanted to jump in here and say that there is yet another person who understands this conversation.  I have a degree in computer science and a minor in mathematics.  I also still own all my text books and they are in an accessible place so if you need anything looked up just let me know.

This is very cool.  I'd never really thought about this before but when you think about it you realize that it's a very important design consideration.

Jesse

John Wick

I just wanted to jump in and say...

I DON'T understand this conversation - and I think it's very interesting that "I want a simple system" and "algorithms" are being used in the same sentence. :wink:

Take care,
John

[ This Message was edited by: John Wick on 2001-06-14 18:39 ]
Carpe Deum,
John

Mytholder

For John and any other non-techies...

Big-Oh notation is a way of describing how long a particular algorithm (method of doing something) will take to complete. It's expressed in terms of the number of things the algorithm has to process. So if a computer has to sort N things, an O(N) algorithm will take an amount of time that's directly proportional to the number of things. An O(N-squared) algorithm will take longer, proportional to the square of the number of things. In contrast, an O(1) algorithm will always take the same amount of time.

(It's a year since I graduated, and longer since I actually looked at books on computing theory, so I might be getting some of this wrong.)
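If a picture in code helps, here's a tiny Python sketch of those three behaviours; the functions just return how many steps each approach would take on a list of a given size (illustration only, not from any textbook):

Code:
def constant_steps(items):
    return 1                    # O(1): same work no matter how many things

def linear_steps(items):
    return len(items)           # O(N): one step per thing

def quadratic_steps(items):
    return len(items) ** 2      # O(N-squared): every thing against every thing

for size in (10, 100, 1000):
    items = list(range(size))
    print(size, constant_steps(items), linear_steps(items), quadratic_steps(items))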


John Wick

Sorry Myth, but your explanation didn't help me much.

How about someone using English - preferably small words for us Philosophy types.
Carpe Deum,
John

Zak Arntson

I'll try.

When you write an algorithm, it will take a certain amount of time to complete.  This can depend on the amount of data your algorithm must munch before it's done.  Programmers get raises (at least they used to; I bemoan the current Microsoft way of "faster computers = we can push bloated code") with quicker programs, so making a program run fast is key.

If an algorithm is O(n), this means that it gets as complex as what you throw in.  There's a one-to-one relationship.

If the algorithm is O(n^2), it's not so quick.  I give you three numbers, you do nine (n = 3, n^2 = 9) calculations.

And so on.  There's all sorts of complexities, like O(log n) and O(2^n), but don't worry about that for now.

What's neat about this method (for programming, anyway) is that you can figure out your algorithm's complexity, say n^2 - 3n + 2, and crunch that down to a complexity of O(n^2).  Then you know where to concentrate retooling your code to make it faster.
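Here's the "crunching down" in a quick Python sketch (my numbers, just to show the square term swamping the rest as n grows):

Code:
def exact_steps(n):
    return n * n - 3 * n + 2    # the full count: n^2 - 3n + 2

def dominant_term(n):
    return n * n                # what O(n^2) keeps

for n in (5, 50, 500):
    # the ratio creeps toward 1.0, which is why the smaller terms get dropped
    print(n, exact_steps(n), dominant_term(n), exact_steps(n) / dominant_term(n))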




So how does this apply to gaming?  Here's my take.  Say you have a system where you roll x dice, add them up, and compare that to a difficulty.  The complexity of this would be something like the following (I'm just throwing complexities around; I don't know any tested complexity of rolling, comparison, etc.).  This is way different from computer theory, since people are good at different things:

(roll 4 dice) + (total them up) + (compare to difficulty) ->
(roll 4 dice) + (sum 4 items) + (comparison) ->
(4n) + (n^(4-1)) + (1) ->
n^3 + 4n + 1

Now, when determining complexity, you take the POTENTIALLY BIGGEST piece, so here we drop the 4n and the 1 to get a complexity:

O(n^3)

At a glance, we can tell that totaling the rolls is the most time-consuming part.  So if we want to quicken up the mechanic, we should start there.  Say we pick the highest, which is easy for humans.

(roll 4 dice) + (pick highest roll) + (compare to difficulty) ->
(roll 4 dice) + (seek highest of 4) + (comparison) ->
(4n) + (4n) + 1 ->
8n + 1

Complexity: O(8n + 1) = O(8n).

SIDE NOTE: With computers, you can just drop the 8 to get a complexity of O(n).  But with people, I think we should leave it, since a complexity of O(20n) would take a lot more time for a human.
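To see the two mechanics side by side, here's a throwaway Python sketch that just plugs a nominal effort unit n into the two expressions above (the expressions themselves are my guesses from earlier, not measurements):

Code:
def sum_mechanic(n):
    return 4 * n + n ** 3 + 1   # roll 4 dice + total them + compare

def pick_highest_mechanic(n):
    return 8 * n + 1            # roll 4 dice + eyeball the highest + compare

for n in (2, 4, 8):
    # the n**3 term takes over quickly as n grows
    print(n, sum_mechanic(n), pick_highest_mechanic(n))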

Wow ... we're on the forefront of gaming theory (well, probably not, but I bet it's a first for rpgs anyway).




Real Life example (though I didn't do it in theory; I did it intuitively at the time).  In my Chthonian game, combat went Attacker vs. Defender.  Then the defender had a chance to attack, prompting a second Attacker vs. Defender roll.  Rolls in this game are d12 + modifiers, and then you figure the difference.

I thought this wasn't rules-light enough to foster the kind of game I wanted, so my revision (not posted to the web yet) makes a single combatant vs. combatant roll.  Using complexity, in the first situation a single "round" of combat would be (with an average number of modifiers of 2):

(1 roll + 2 modifiers) + (1 roll + 2 modifiers) + (figure difference) + (record difference) + (1 roll + 2 modifiers) + (1 roll + 2 modifiers) + (figure difference) + (record difference) ->
4 * (1 roll + 2 modifiers) + 2 * (subtract 2 items) + 2 * (record difference) ->
4 * (1n + n^(mod-1)) + 2 * (n^2) + 2 * (1n) ->
4n + 4n^1 + 2n^2 + 2n ->
8n + 2n + 2n^2 ->
10n + 2n^2

Complexity: O(2n^2 + 10n).  I'm not going to simplify the complexity, since we're dealing with VERY SLOW COMPUTERS (as in, people computing).

Once I get rid of the second bit of combat, and just have two people roll, highest wins and applies difference to loser:

(1 roll + 2 modifiers) + (1 roll + 2 modifiers) + (figure difference) + (record difference) ->
(1n + n^(2-1)) + (1n + n^(2-1)) + (n^2) + (1n) ->
[hand-waving] ->
5n + n^2.

Complexity: O(n^2 + 5n).  I've halved my complexity (which I guess is intuitively obvious, since the original was just two "rolls").
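In Python-sketch form (with the same guessed expressions as above), the old and new combat procedures look like this; the last column shows the rough halving:

Code:
def old_combat(n):
    return 2 * n ** 2 + 10 * n   # two full attacker-vs-defender exchanges

def new_combat(n):
    return n ** 2 + 5 * n        # one combatant-vs-combatant roll

for n in (1, 2, 3):
    print(n, old_combat(n), new_combat(n), old_combat(n) / new_combat(n))  # ratio is 2.0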

Yikes.  Time for me to stop writing.

The key here is that we come up with some good complexities for things.  How complex is adding two numbers?  Subtracting them?  Adding 5 numbers?  Comparing one number against another?

And how to reduce the complexity, too.  The typical Computer Science way of O(n^2 + 5n + 3) = O(n^2) isn't going to work.  O(10n) is appreciably slower than O(5n) when it comes to human beans.


_________________
Zak
zak@mimir.net
Harlekin Maus Games (http://zaknet.tripod.com/hmouse)

[ This Message was edited by: Zak Arntson on 2001-06-14 20:30 ]

[ This Message was edited by: Zak Arntson on 2001-06-14 20:31 ]

Don Lag

Thanks guys. I was aiming for people with a computer science background, mostly because I didn't have time to post a big Introduction to Algorithms message (like Zak's!).

I can't help but mention something that has come slightly into the discussion: humans as computing machines. I'm using the term "machine" in a very general fashion, basically considering that the human brain provides a mechanism for solving problems (I'm not about to delve into the philosopher's interpretation of the question).

Does the human brain provide a better mechanism than the Turing Machine? Could one conduct experiments to try to measure the humanly optimal algorithm for sorting (putting lots of people to sort many items, in different numbers, and trying to observe any type of general behaviour: logN, N, ...)?
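If anyone ever wanted to run that experiment, even a crude harness would do; something like this Python sketch (the pile sizes are arbitrary and nothing here is a result, it just records how long each sort takes):

Code:
import time

def record_sorting_time(n_items):
    input("Lay out %d items, then press Enter and start sorting..." % n_items)
    start = time.time()
    input("Press Enter again the moment the pile is sorted.")
    return time.time() - start

# times = [(n, record_sorting_time(n)) for n in (4, 8, 16, 32)]
# print(times)   # then eyeball whether the growth looks like log N, N, N log N, ...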

Random thoughts mostly...
Sebastian Acuña