Thanks guys. injury log's interpretation was closer, though the reason I asked is that it isn't as cut and dried as you make it sound. Deja Vu's response was still very good food for thought, though.
In one game I've played, the error rating displayed in-game is basically the number of errors the player will make per 100 full games.
Example: the league-average 2B makes 15 errors per 100 games. A fielder with a 100 error rating at 2B will show "15" as his error rating while you're playing, even though it's stored as 100 when you look at the actual ratings in the engine. A 200 rating would be displayed as "30", and a 33 would show as "5".
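Just to make the arithmetic explicit, here's a quick sketch of how I think that display conversion works. The function name and the linear scaling are my own guess, not anything pulled from the game's files:

# Rough sketch of the display conversion (my guess at the math, not the game's actual code)
LEAGUE_AVG_ERRORS_PER_100 = 15  # league-average 2B errors per 100 full games

def displayed_error_rating(engine_rating):
    # engine rating of 100 = league average, scaled linearly
    return round(LEAGUE_AVG_ERRORS_PER_100 * engine_rating / 100)

print(displayed_error_rating(100))  # 15
print(displayed_error_rating(200))  # 30
print(displayed_error_rating(33))   # 5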
So my guess is that the engine determines whether a player will make an error separately from whether or not he actually gets a chance.
Whereas in another game, the engine might first determine whether a ball is hit at a fielder, and if so, turn a certain percentage of those chances into errors based on the fielder's error rating.
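Here's a toy comparison of the two approaches, just to show why the difference matters. All of the numbers and function names are made up for illustration; I have no idea which (if either) OOTP actually uses:

import random

def season_errors_model_a(games, errors_per_100_games):
    # Model A: roll for an error each game, independent of how many chances the fielder gets
    return sum(random.random() < errors_per_100_games / 100 for _ in range(games))

def season_errors_model_b(games, chances_per_game, error_rate_per_chance):
    # Model B: first roll the balls hit at the fielder, then a fraction of those become errors
    errors = 0
    for _ in range(games):
        for _ in range(chances_per_game):
            if random.random() < error_rate_per_chance:
                errors += 1
    return errors

# Same fielder, same rough "15 per 100 games" talent, but Model B's total moves
# with how many chances he actually sees (ground-ball staff, innings played, etc.)
print(season_errors_model_a(162, 15))        # expect roughly 24 errors over 162 games
print(season_errors_model_b(162, 5, 0.03))   # roughly 24 at 5 chances per game
print(season_errors_model_b(162, 4, 0.03))   # drops to roughly 19 with fewer chances

(With Model B, a rating derived from raw error totals would also bake in how many chances the fielder happened to get.)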
How the game does this is important when you are trying to turn stats into ratings. I was hoping someone would just 'know' how OOTP handles this, so I could make my custom ratings more accurate.