What is your preferred math for D&D Next?

Instead of starting with the content, creating some numerical modifiers that "feel" right, and just letting the chips fall where they may, a smart game designer starts with the success rates they want and works backward.

So D&D Next designers and we as a community need to make the following decisions:


  1. In determining the success of a challenging task, what do you want the relative importance of luck, ability scores, class level, and optimization/game mastery (everything else, such as Feats, Skills, Race, Traits, magic items, spells, etc.) to be?


  2. Assuming you've maxed everything out, what should your chance of success be?


  3. Assuming that you're a 1st level character with no investment, what should your chance of success be?


Once you've answered those questions, create the modifiers which will lead to the results you want. 

For example, I prefer a system where there is roughly equal weight on the die roll (luck), Ability Scores (natural ability), class levels (experience), and everything else (character optimization).  I think that a fully maxed out character should have an 80% chance of success (and a DM can hand wave tasks that are not challenging to that character, like a master thief opening a basic lock outside of combat), and that a new character should have a 20% chance of success (so that new players can still try stuff and have some chance of success).  So the math in my theoretical game would be:


  • Ability Scores capped at 18 (max +4 bonus).


  • Add 1/5 of your class level (rounded down) to any task you are trained/proficient in (max +4 bonus).


  • Bonuses from all other sources capped at +4 total.



A Challenging task is then set at DC 17.  On a 1d20 + 12 roll I have an 80% chance of success (roll 5 or higher), on a 1d20 + 0 roll I have a 20% chance of success (roll 17 or higher).
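If you want to sanity-check those numbers, here's a quick Python sketch (my own throwaway code, obviously not anything official):

```python
def success_chance(bonus, dc):
    """Chance that 1d20 + bonus meets or beats dc (ignoring auto-success/failure rules)."""
    needed = dc - bonus  # minimum die face that succeeds
    return max(0, min(20, 21 - needed)) / 20

# Challenging task at DC 17:
print(success_chance(12, 17))  # maxed-out character (+12): 0.8
print(success_chance(0, 17))   # fresh character (+0): 0.2
```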

Modify any of the above based on your personal biases however you like - it's only provided as an example, not a definitive statement of what D&D Next math should be.  But this is the basic process they should follow, not "oh, a save vs. the low level ghoul sorta sucks now, so let's screw with all the math to come up with a solution."

So, based on the above decision making process, what do you think D&D math should look like?
I was thinking about making a thread about the same thing. 

How high do you want the numbers to go, and what's the least you need to have to make it?

The math is complex, because you have a lot of elements in the game.
It's the reason balancing combat is difficult: there are so many ways to change it.

My idea of good math is that when I build a character for something like tanking, DPS, or support, the
math supports it. I should be able to focus on something I want to do, and the math should favor me for it.
There are actually some games out there where attacking is better than defending, and others where defending is
better than attacking. It ranges from losing hit points whenever you attack something to killing everything
before it can hit you. I don't want to make a build for a job and suffer for it because the other jobs
are superior.
Instead of starting with the content, creating some numerical modifiers that "feel" right, and just letting the chips fall where they may, a smart game designer starts with the success rates they want and works backward.


Yes and no.

I believe they have some rough math in place (success rates, combat rounds, etc.). But you don't play math. Games with great math can still be boring. And the math alone won't feel like D&D or be representative of the game as a whole.
4e showed this very effectively: the numbers came first and were unforgiving and unmodified, leading to some pretty long combats. The Player's Strategy Guide highlights this nicely, where combats are meant to take 12 rounds IIRC.

I'm okay with the approach they're taking, where they establish some base math, build classes, and then tweak both based on how it plays. The math is subservient to the playstyle. The game being playable trumps the math,
rather than continually modifying the classes and powers to conform to some arbitrary numbers.

5 Minute Workday - My Webcomic, Updated Tue & Thur

The compilation of my Worldbuilding blog series is now available: 

Jester David's How-To Guide to Fantasy Worldbuilding.

PCs are 60% class, 20% race, and 20% other to me.

DC 10 (walking on ice)
Joe Normal 50% chance (1 in 2)
Super Acrobat 100% chance

Pick a simple lock
Joe Normal 25% chance (1 in 4)
Master Halfling Thief about 90% (9 in 10)

DC25 Backflip over someone with no practice
Joe Normal 0% chance
Super Acrobat 50% chance (1 in 2)

Orzel, Halfelven son of Zel, Mystic Ranger, Bane to Dragons, Death to Undeath, Killer of Abyssals, King of the Wilds. Constitution Based Class for Next!

As player, GM, and designer I prefer success being generally more frequent than failure. This is important for:

  1. psychological reasons - higher chances of success encourage more player participation and involvement.

  2. narrative reasons - we generally attribute the plot and direction of the story to the protagonists' decisions and actions. (Of course, grim/comedic genres and good GM planning can make failure do this as well.)

  3. logistical reasons - it's easier for most GMs to plan their stories and scenes when they can count on characters performing reliably.

  4. balance reasons - The relative differences in success rates are much smaller at higher probabilities. For instance: if a rogue succeeds at bluffing 90% of the time and the warrior succeeds 60% of the time, then the rogue is really only succeeding 50% more often than the warrior. If you shift the DCs up so that the rogue is succeeding 50% of the time and the warrior only 20%, then the rogue is now contributing 150% more than the warrior.
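To put point 4 in code (a quick sketch of my own; the rogue and warrior rates are just the hypothetical numbers from above):

```python
def relative_edge(p_better, p_worse):
    """How much more often one character succeeds than another, as a percentage."""
    return (p_better / p_worse - 1) * 100

print(relative_edge(0.9, 0.6))  # high-success regime: ~50% more often
print(relative_edge(0.5, 0.2))  # low-success regime: ~150% more often
```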


Of course, much of this is based on how I use skills as a player and GM. If you're a GM who requires only 1 out of 5 players to succeed at any given challenge, and you know how to make occasional failure interesting to your players, then you might be better adapted to higher failure rates.
If a Fighter leaves town walking 3 MPH at 7:00 AM and the Orc leaves his cave running 7 MPH at 9:00 AM how many bottles of goat urine will the Wizard need to cast his spells for the day?

What species of goat?

"The Apollo moon landing is off topic for this thread and this forum. Let's get back on topic." Crazy Monkey

When rolling a d20, the modifier should also be in the 1-20 range. By preference, I would divide that evenly between ability scores, class level, and other bonuses. A level 1 wizard with minimal Dexterity and a cheap crossbow should be at +0 or +1 to hit, while a level 20 ranger with maximal Dexterity and an artifact bow would be at +20. That necessarily means that there should be some targets which the young wizard will never be able to hit, and some targets which the epic ranger should never fail to hit (unless you're using rules for automatic success/failure on a 20/1).

Those are really corner-case scenarios, though. For practical purposes, the wizard would have a Dexterity modifier of +2 or +3 and the ranger +4 or +5, and the difference in class level bonus would likewise be down to 2-3 points for characters of the same level.

The metagame is not the game.

If a Fighter leaves town walking 3 MPH at 7:00 AM and the Orc leaves his cave running 7 MPH at 9:00 AM how many bottles of goat urine will the Wizard need to cast his spells for the day?




3?  I am bad at these types of problems, darn it! 

On a positive note, I do know that it takes two bottles of rat feces and one eye of a warthog to cast "Smite Metagame Dissonance". 

Preferred math for DDN:

"Minimal."

Followed by "invisible."


I want the math to represent levels of distinction, or tiers of difficulty, where the highest tiers may be near impossible for low-level characters to reach. As soon as a low-level character has a good chance to affect the highest tier, it starts to tear down the culture and expectations of D&D that I have held for countless years. AC would be a good example I am not happy with: sure, you can have a very large creature with a low AC and tons of hit points, but the same would not apply to a devil or similar creature.
To expand a bit on my original post:

Any new edition of D&D is going to have LOTS of supplements and subsystems.  The game developers should remember this.  It's meaningless if the numbers all "feel" right in the playtest or core materials, because 1 month after the core materials are published there will be supplements with stuff that skews the math.  So if you want the math to work, you need a strictly regulated system.  Every bonus needs to be strictly capped and/or non-stacking and/or nonexistent.  Otherwise, players can break the math through optimization.

New low level players should have some meaningful chance of success at any reasonable action that they might want to take.  A Wizard needs a meaningful chance to be able to succeed on a melee attack.  A Fighter should be able to Trip an enemy even if he hasn't invested in the Trip related crunch.  And so on. 

Higher level characters can and should gain things other than numerical bonuses to set them apart from low level characters.  For example, a Fighter 20 with Damage Reduction 20/magic who can Whirlwind Attack every space he threatens and inspires Fear every time he kills an enemy with fewer hit dice would rarely need to worry about being defeated by low level mooks, even if he has no scaled bonuses.  If a low level ghoul has some chance to one-shot him, give the Fighter immunity to paralysis, or give the ghoul's paralysis a maximum hit point threshold.

The math should be built to make a fun and balanced D&D game.  D&D is not a mass combat game where PCs fight 1,000 peasants.  It is not a simulation game where the probability of hitting mirrors how often a "real" Wizard could hit a "real" Orc with his quarterstaff.  D&D is a game of roleplaying, exploration, and tactical combat.  Don't worry about hypothetical examples, which can be solved with a module or DM fiat.  Worry about what real games will probably look like.
I prefer a Core Math that is consistent (Bounded Accuracy) and that leaves room for options based on style (+1/3 levels rounded up -- the +1 to +6 system, +1/2 levels rounded down -- 4e character math, +1/level -- 4e monster math, or other).


I prefer lots and lots of charts.  One system to rule them all can never be as well tuned as one ad hoc chart for every situation.  I also want these charts to let me use all of the dice in my bag.
The math that I'm really interested in is all on the world building end. I'd plotz if the DMG had a guide of NPC demographics by level and how to build a dungeon. Economies and ecologies!  Actuarial tables!  YEEEESSSS!



I prefer lots and lots of charts.  One system to rule them all can never be as well tuned as one ad hoc chart for every situation.  I also want these charts to let me use all of the dice in my bag.



Really?  Seriously?  I've been playing D&D since 1st edition, when I was 12 years old, and looking up the various charts was always my least favorite part of the game.  And new players really struggle with it, or with having to roll a wide variety of different dice for different things.

I'm pretty flexible on what the final math should be, but they need some sort of unified mechanic for resolving challenging actions.  Xd20 (1d20 by default, 2d20 or more for Advantages/Disadvantages) + Ability Score + Trained/Proficient bonuses + Whatever (with Whatever being strictly capped somehow) makes the most sense to me.  I'd be willing to listen to math based arguments for why it should be something different.  But "roll something different for each type of check" does not have much of a rationale. 

Instead of starting with the content, creating some numerical modifiers that "feel" right, and just letting the chips fall where they may, a smart game designer starts with the success rates they want and works backward.



You lost me right here.   I say get the feel right.   Then tweak the math.  Obviously you have to have some mathematical structures.   d20 roll high to beat a DC.  They know that.   What that DC or even AC is doesn't matter early on.   Just figure out if the classes are acceptable in feel and the overall game flows right.   Then once you have general acceptance of all of that you sit down and iron out the math.

It is what they are doing.  I believe they will be at least somewhat successful.  Will it be perfect?  No.  Will it satisfy the hardcore math fanatics?  No.   Will it be fun for most people?  Yes. 

My Blog which includes my Hobby Award Winning articles.

Preferred math for DDN:

"Minimal."

Followed by "invisible."



Almost agree.
"Invisible"
followed by "extensive"

The underlying math needs to be as meticulous as possible, but the players (and maybe even the DM) should never need to see any of it beyond die-modifiers.
65% is my golden success rate for slightly above average PCs.  It's just shy of 66, which (in addition to ordering the extermination of all the Jedi in the universe) is roughly the number denoting a 2 out of 3 ratio.  That means, round by round, any PC doing something he is reasonably proficient in will accomplish that task twice for every three rounds of combat or interaction or what have you.  That's just enough to make him confident, but not so much that the challenge is trivial, and not so little that he wastes a whole turn half of the time.

If the d20 roll high model were adjusted for the assumption that something happens even on a failure, that target number would have to change.

So that's an 8 on a d20.  Another benefit: if a player rolls a 10 or higher, he should reasonably expect success.  That breeds player expectations (important for DMs to maintain immersion and mess with on occasion) and is a really easy rule of thumb to follow.  Then, when a monster in heavy armor comes clanking along and 12s and 14s DON'T hit him, the tension is appreciable.

Incidentally, with an ideal attack bonus of +4 at 1st level, an 8 and a 4 would make 12.  Go look in the Bestiary and make a mental note of every AC in there.  Which numbers pop up the most often?  12, 13, 11, 12, 13, 13...  This seems to be the assumption shared by the developers as well.

Then, for ease of comprehension and well-informed player expectations and a heuristic simulation of a fantasy world and a host of other good reasons, you could reasonably expect the math for skill checks and saving throws to follow the same assumptions, right?

Wrong.  For the life of me, I cannot understand what they are doing mucking about with half of the core resolution mechanics in the game.  Attack rolls and ACs work fine.  Ability checks and DCs don't quite match up.  Instead of doing something painfully obvious like lowering the DCs to match AC math that has already been proven to work, they want to bring back level bonus, or make players deal with two sets of modifiers and thus twice as much at-table calculation, or some other strange voodoo solution.

I shouldn't complain.  The last couple of articles have been major eye-rollers for me though.

Anyway, whether ability checks and saving throws get the proficiency die or the class bonus, it's really neither here nor there.  The point is:

-d20 + ability mod + one other bonus that hovers around 2-4 but can't get higher than 5-6

versus

-DC that ranges from 7 to 10 to 13 to 16 to 19 to 22 (y'know, like AC does)

and it would add up to give you that golden ratio of 65% more often than not.
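For anyone who wants to verify the 65% claim, a tiny Python sketch (mine, and it assumes no automatic hit/miss on a 20/1):

```python
def hit_chance(bonus, target):
    """Chance that 1d20 + bonus meets or beats target."""
    needed = target - bonus
    return max(0, min(20, 21 - needed)) / 20

# Needing an 8 on the die: faces 8-20 succeed, 13 out of 20.
print(hit_chance(0, 8))   # 0.65
# A 1st-level +4 attack bonus against the common AC 12:
print(hit_chance(4, 12))  # 0.65 again
```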
Instead of starting with the content, creating some numerical modifiers that "feel" right, and just letting the chips fall where they may, a smart game designer starts with the success rates they want and works backward.



You lost me right here.   I say get the feel right.   Then tweak the math.  Obviously you have to have some mathematical structures.   d20 roll high to beat a DC.  They know that.   What that DC or even AC is doesn't matter early on.   Just figure out if the classes are acceptable in feel and the overall game flows right.   Then once you have general acceptance of all of that you sit down and iron out the math.

It is what they are doing.  I believe they will be at least somewhat successful.  Will it be perfect?  No.  Will it satisfy the hardcore math fanatics?  No.   Will it be fun for most people?  Yes. 




D&D is a game written by many, many different people. 

If each writer is doing what "feels right" to them, then after the first couple of supplements you're going to break the math, which will require DMs and players to constantly modify things, homebrew, include/exclude certain game-breaking options, or otherwise not use the rules as written.

If they establish some unified mechanic for resolving all challenging actions, then every writer will work within the same framework.  Writers can still write what feels right to them, but they can't break the game. 

In addition to making the game more intuitive and easier to learn for new players, it also keeps all of the many, many game developers/writers honest.  I do not need or want 1000 mostly useless Feats that provide slightly different bonuses, plus 10 Feats buried within all the trash (that only optimizers know about) that provide the best bonuses.  Just write the 10 best Feats, and have all of the other Feats provide new and different options.  They can still express their creativity and feelings.  But not by breaking the math.
Just write the 10 best Feats, and have all of the other Feats provide new and different options.

But, we need a feat called "Boot to the Head" which has the exact same game-effect as "Sword in the Crotch" and "Arrow to the Knee" and "Throat Punch", because they're obviously completely different!

Instead of starting with the content, creating some numerical modifiers that "feel" right, and just letting the chips fall where they may, a smart game designer starts with the success rates they want and works backward.



You lost me right here.   I say get the feel right.   Then tweak the math.  Obviously you have to have some mathematical structures.   d20 roll high to beat a DC.  They know that.   What that DC or even AC is doesn't matter early on.   Just figure out if the classes are acceptable in feel and the overall game flows right.   Then once you have general acceptance of all of that you sit down and iron out the math.

It is what they are doing.  I believe they will be at least somewhat successful.  Will it be perfect?  No.  Will it satisfy the hardcore math fanatics?  No.   Will it be fun for most people?  Yes. 




D&D is a game written by many, many different people. 

If each writer is doing what "feels right" to them, then after the first couple of supplements you're going to break the math, which will require DMs and players to constantly modify things, homebrew, include/exclude certain game-breaking options, or otherwise not use the rules as written.

If they establish some unified mechanic for resolving all challenging actions, then every writer will work within the same framework.  Writers can still write what feels right to them, but they can't break the game. 

In addition to making the game more intuitive and easier to learn for new players, it also keeps all of the many, many game developers/writers honest.  I do not need or want 1000 mostly useless Feats that provide slightly different bonuses, plus 10 Feats buried within all the trash (that only optimizers know about) that provide the best bonuses.  Just write the 10 best Feats, and have all of the other Feats provide new and different options.  They can still express their creativity and feelings.  But not by breaking the math.



We are talking about a playtest here.  The feel of the game will iterate dramatically.  

We aren't talking about a splat book two years into the game.   Once they figure out the initial game and do the math, then they will have a model for future development that writers have to follow.   Right now though they are trying to figure out the feel.   Perfect math and lousy feel will be a loser.   Once they nail down the feel they will then iron out the math.

 


My preferred math:

Ability checks are the only means of task resolution.  Saving throws, attack rolls, and skill checks are all simply ability checks, without their own separate rule systems.

Ability: This bonus comes from natural talent.  Ability scores are what modify this, and it is generally in the range of -1 to +5 for PCs.  An ability mod of +5 should be exceptionally rare, with most PCs capping out at +4.  The ability bonus should not increase with level (i.e., no level-based attribute increases).

Competence Bonus: A universal bonus based on character level, from +0 to +5 (+1 per 4 character levels).  This represents the broad-based general expertise an adventurer gains from experience.  It applies to all ability checks as well as AC.

Proficiency: This is the bonus gained for training in specific areas of expertise. It includes combat skills such as weapon and implement proficiencies. This bonus goes from +2 for basic training to +6 for mastery (or d4 to d12).

So at most a level 20 PC can have +16 to their rolls. (+5 attribute, +6 proficiency, +5 level)
At minimum a level 20 PC has a +4 to their rolls. (-1 attribute, +0 proficiency, +5 level)

The DC for tasks follows this simple table.

DC 5 = very easy
DC 10 = easy
DC 15 = average
DC 20 = hard
DC 25 = very hard
DC 30 = legendary
DC 35 = godlike

ACs would follow a similar trend to DCs. For level 20 monsters, AC should probably be in the 20 to 25 range (with some outliers, of course).

So what does this mean in general?

A level 20 warrior with 8 charisma and no training in persuasion (+4 total bonus) could still perform an average task 50% of the time. He can perform a hard task 25% of the time. He could not perform a very hard task.

At level 1 the warrior could perform an easy task 50% of the time, average task 25% of the time, and could not perform a hard or very hard task.

A level 20 rogue with 18 charisma and master training in persuasion (+15 total bonus) could perform an average task 100% of the time. He could perform a hard task 80% of the time, a very hard task 55% of the time, a legendary task 30% of the time, and a godlike task 5% of the time.

At level 1 with only basic training in persuasion (+6 total bonus), the rogue could perform an easy task 85% of the time, an average task 60% of the time, a hard task 35% of the time, and a very hard task 10% of the time.

At high levels experience and proficiency are as important as natural ability (or more). At low levels natural ability is the most important factor in determining success or failure.
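The worked percentages above can be double-checked with a short Python sketch of the proposed system, in case anyone wants to try other bonuses (the function and names are mine, obviously):

```python
def check(bonus, dc):
    """Chance that 1d20 + bonus meets or beats dc, clamped to [0, 1]."""
    return max(0, min(20, 21 - (dc - bonus))) / 20

DCS = {"easy": 10, "average": 15, "hard": 20,
       "very hard": 25, "legendary": 30, "godlike": 35}

# Level 20 warrior, 8 Cha, untrained: -1 ability + 0 proficiency + 5 competence = +4
# Level 20 rogue, 18 Cha, mastery:    +4 ability + 6 proficiency + 5 competence = +15
for name, dc in DCS.items():
    print(f"{name:>9}: warrior {check(4, dc):>4.0%}, rogue {check(15, dc):>4.0%}")
```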
I prefer lots and lots of charts.  One system to rule them all can never be as well tuned as one ad hoc chart for every situation.  I also want these charts to let me use all of the dice in my bag.



Really?  Seriously?  I've been playing D&D since 1st edition, when I was 12 years old, and looking up the various charts was always my least favorite part of the game.  And new players really struggle with it, or with having to roll a wide variety of different dice for different things.



A good character sheet has most required charts on it.  This type of play is much easier for new players than a unified d20 system with variable modifiers.
Balanced so that it means something.  
Visible, so that I as a DM can break it down and build it up again to suit my needs; I want monster building to be easy, quick, and accurate.  
Unified: I want the math for an attack roll to be about the same as for a saving throw or skill check. Damage is the only thing that doesn't use a d20.

Success rates should be the following, based on what you need to roll on a d20 (after adding all the modifiers you should have at your level) versus the DC or AC.

Easy: roll a 4 or higher.
Average: roll an 8 or higher.
Difficult: roll a 12 or higher.
Very hard: roll a 16 or higher.
Next to impossible: roll a 20.
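Translated into success rates, that ladder looks like this (a quick Python sketch of my own):

```python
def chance(needed):
    """Success rate when you need `needed` or higher on a d20."""
    return (21 - needed) / 20

targets = {"easy": 4, "average": 8, "difficult": 12,
           "very hard": 16, "next to impossible": 20}

for name, needed in targets.items():
    print(f"{name}: roll {needed}+ -> {chance(needed):.0%}")
# easy 85%, average 65%, difficult 45%, very hard 25%, next to impossible 5%
```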

Now those will obviously change: if you are AMAZING at something they should be a little lower, and if you have no skill/talent whatsoever then they should be a lot higher.

I agree with the person who said that the highest possible modifier should be +20; that sounds about right to me.  That keeps DCs and ACs around 35 as a capstone.
+5 of that +20 should come from ability scores.
+9 of that +20 should come from class-based modifiers.
+6 of that +20 should come from magic items, spells, feats, etc.

Break that +9 from class stuff down to +6 if it is something your class is alright at but not focused on, like Cleric attack rolls, and +3 if it is something your class is pretty bad at, like Wizard melee attack rolls.


Remember, this is a public forum where people express their opinions. Assume there is an "In my humble opinion" in front of every post, especially mine.

 

Things you should check out because they are cool, like bow-ties and fezzes.

https://app.roll20.net/home  Roll20, a great free virtual tabletop so you can play with old friends who are far away.

http://donjon.bin.sh/  Donjon has random treasure, maps, pickpocket results, etc. for every edition of D&D.
