So there's been a lot of discussion about the concept of bounded accuracy -- the notion that there is no assumed improvement in attack rolls, skills, or defenses unless the character devotes character building resources to such improvement. This was not true in any prior edition. I will call this phenomenon "assumed progression", as the term "bounded accuracy" tends to confuse.
In OD&D, BECMI, AD&D, and 3e, every class got assumed progression in attack rolls and saving throws. In 3e, you also got regular improvement in some ability scores. In 4e, this assumed progression was extended to all ability checks (including skill checks) in the form of the half-level bonus. Next assumes much less progression than any previous edition. However, it is not non-existent. Even in Next, most classes (not wizards) automatically improve their weapon attacks as they reach certain levels, and all characters automatically improve some ability scores.
These are all varying levels of "bounded accuracy". One way to measure this is to see the level at which a PC is good at something that is "contrary" to their class' type.
For instance, at what level can a Str-10 wizard fight with her quarterstaff as well as a Str-16 1st-level fighter?
In 2e, it is 4th level
In OD&D, it is 5th level
In 1e, it is 6th level
In 3e and 4e, it is 8th level
In the current playtest, it never happens for the wizard. (For comparison, a Str-10 rogue achieves parity at 14th level -- at that point she can swing a longsword, which is not a finesse weapon, as well as a Str-16 1st-level fighter, at +4 to hit.)
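The 8th-level figure for 3e can be checked with a quick calculation, assuming the standard 3e progressions: a fighter gets full base attack bonus (+1 per level), a wizard gets half (+1 per two levels, rounded down), and the ability modifier is (score - 10) / 2, rounded down. This is a sketch of that arithmetic only; the older-edition numbers above come from their respective attack tables and aren't modeled here.

```python
def ability_mod(score):
    """3e ability modifier: (score - 10) // 2."""
    return (score - 10) // 2

def fighter_attack(level, str_score):
    # Fighters have full base attack bonus: +1 per level.
    return level + ability_mod(str_score)

def wizard_attack(level, str_score):
    # Wizards have half base attack bonus, rounded down.
    return level // 2 + ability_mod(str_score)

# Str-16 1st-level fighter: +1 BAB + 3 Str = +4 to hit.
target = fighter_attack(1, 16)

# First level at which the Str-10 wizard's bonus catches up.
parity = next(lvl for lvl in range(1, 21)
              if wizard_attack(lvl, 10) >= target)
print(parity)  # 8 -- matching the 8th-level figure above
```

At 7th level the wizard sits at +3; at 8th her half-BAB reaches +4 and she matches the fighter despite her +0 Strength modifier.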
This occurs even if the wizard never once uses her quarterstaff in combat and makes absolutely no effort to improve her quarterstaff skills. She just gets better through what I call the "osmosis of adventuring". She picks up competency simply by being in adventures where people are presumably using weapons (either as allies or enemies).
Now, everybody appears to have a different level of tolerance. Some prefer the rapid progression of 3e and 4e. Some prefer the overall flatness of bonuses combined with assumed progression as found in OD&D and AD&D. And others prefer the even slower progression of the playtest.
Some also prefer to limit assumed progression to attack rolls, or attack rolls and saves, or to all d20 rolls.
So my question to you is... how much assumed progression, if any, do you want in your game? To determine this, I am going to ask a few questions, and I hope you answer truthfully. These polls will be open for one month.
Poll One: To-Hit Bonuses
At what level should a human wizard with a Strength of 10 have the same bonus to-hit with a quarterstaff (assuming proficiency) as a 1st level human fighter with Strength of 16? Assume the wizard is not spending any character building resources (including spells, feats, etc.) to improve in fighting with the quarterstaff.
Poll Two: Saving Throws
At what level should a human rogue with a Wisdom score of 10 have the same bonus to save against a charm effect as a 1st-level human cleric with a Wisdom score of 16? Assume the rogue expends no character building resources (including skill tricks, feats, etc.) to improve the ability to resist charms.
Poll Three: Ability and Skill Checks
At what level should a human fighter with an Intelligence score of 10 have amassed as much knowledge (for purposes of ability or skill checks) of arcane lore as a 1st-level human wizard with an Intelligence score of 16 who is trained in the skill knowledge (arcane)? Assume the fighter expends no character building resources to improve in knowledge of arcane lore.
I look forward to the results!