I spent about 12 hours today messing around with the game, mostly trying to learn more about MLEs and PCMs. Other people had mentioned seeing a pattern in leagues with low MLEs, namely that the run environment is massively depressed. This appears to be largely due to the Batting Average MLE, which disproportionately affects batters (which makes some sense given what we know about DIPS). However, it seems unlikely that the intention was for leagues with uniformly low MLEs to suffer from a depressed run environment. The run inflation from lowering the strikeout MLE is quite noticeable as well.
It seems like the MLE concept needs some adjusting so that it affects batters and pitchers in a league equally; otherwise, setting up a league to get the results you want becomes mighty complicated.
I posted the results of the tests I ran in this thread:
http://www.ootpdevelopments.com/boar...d.php?t=122241
The testing was far from rigorous, but I'd imagine the results would more or less hold up with a larger sample size, given that others have observed the same problem.
Test 1:
I simmed up to the All-Star Break with the following leagues:
FBL - all MLEs set to 1.0
FBLBA - all MLEs 1.0 except BA = 0.2
FBLEBH - all MLEs 1.0 except EBH = 0.2
FBLHR - all MLEs 1.0 except HR = 0.2
FBLBB - all MLEs 1.0 except BB = 0.2
FBLK - all MLEs 1.0 except K = 0.2
Each league has 2 subleagues; below are the two LgERAs and LgBAs for each:
FBL - 4.25, 4.68 / .254, .273
FBLBA - 2.28, 2.53 / .168, .190
FBLEBH - 3.91, 4.02 / .259, .262
FBLHR - 4.23, 4.25 / .273, .276
FBLBB - 4.47, 4.13 / .265, .264
FBLK - 8.05, 8.90 / .377, .388
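To make the Test 1 comparison concrete, here's a quick Python sketch (not part of the original tests, just arithmetic on the numbers above) that averages the two sub-league ERAs for each league and shows the percent change versus the FBL baseline:

```python
# Test 1 results copied from above: league -> ((ERA1, ERA2), (BA1, BA2))
results = {
    "FBL":    ((4.25, 4.68), (.254, .273)),
    "FBLBA":  ((2.28, 2.53), (.168, .190)),
    "FBLEBH": ((3.91, 4.02), (.259, .262)),
    "FBLHR":  ((4.23, 4.25), (.273, .276)),
    "FBLBB":  ((4.47, 4.13), (.265, .264)),
    "FBLK":   ((8.05, 8.90), (.377, .388)),
}

def mean(pair):
    return sum(pair) / len(pair)

# FBL (all MLEs at 1.0) is the baseline run environment
base_era = mean(results["FBL"][0])
for league, (eras, avgs) in results.items():
    pct = 100 * (mean(eras) - base_era) / base_era
    print(f"{league:7s} ERA {mean(eras):.2f} ({pct:+.0f}% vs FBL), BA {mean(avgs):.3f}")
```

The BA league comes out roughly 46% below the baseline run environment, while the K league sits about 90% above it, which is the lopsided effect described above.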
Test 2:
I simmed a full year with 7 leagues, with all MLEs in each league set to 2.0, 1.5, 1.0, 0.8, 0.6, 0.4, or 0.2. LgERAs and LgBAs for the leagues:
2.0 - 4.68, 4.45 / .284, .277
1.5 - 4.89, 5.05 / .277, .286
1.0 - 5.84, 5.09 / .313, .288
0.8 - 4.90, 4.96 / .290, .283
0.6 - 4.03, 3.92 / .252, .249
0.4 - 2.12, 1.92 / .177, .169
0.2 - 4.68, 3.80 / .277, .258