04-17-2025, 11:24 PM   #34
jaa36
Hall Of Famer
 
 
Join Date: May 2011
Posts: 3,106
Quote:
Originally Posted by drzaius
I am trying to use this (and the OP's work) to optimize a lineup, and I was hoping you could explain a bit further. Specifically, how do I translate the above into defensive runs or WAR? For example, let's say I have 4 different players:

1B #1: 50 skill
1B #2: 60 skill
CF #1: 45 skill
CF #2: 65 skill

How would the above translate to runs / WAR?

EDIT:
Not sure if my math is right here, but assuming that an 80/80 catcher would be worth +53 runs (and 5.3 wins) and a 20/80 catcher would be worth -53 runs (and -5.3 wins), I come up with the following:

1B, 50 skill: 0 WAR (Replacement Level)
1B, 60 skill: 0.4 WAR
CF, 45 skill: -0.9 WAR
CF, 65 skill: 2.7 WAR

EDIT 2: Unless my math is wrong. It looks like if I take the 53 runs and divide by 1,350 innings, I get 0.039 runs per inning, which would be the full range of the difference between a 20 and an 80 catcher. Since this is a bell curve, does that mean you can have +26.5 runs at the top (+2.6 wins) and -26.5 runs at the bottom (-2.6 wins)?
You're closer with edit 2: you'd see a difference of 53 runs over the course of 1,350 innings between a 20-defensive-rating catcher and an 80-defensive-rating catcher, but the relationship isn't linear, and it isn't a bell curve either. Often there is little if any difference as you approach the upper or lower limits. You can look at the exact numbers in the fifth sheet of the spreadsheet I attached to the fourth post of this thread ("Projection Constants"), and I'm attaching a screenshot here as well. The values in bold are imputed rather than directly calculated, since the sample sizes were too small to provide any value even with 20 seasons' worth of data. Again, just to be super clear: this was all done in OOTP 25, and it's very possible things are different in 26.
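To make the nonlinearity concrete, here's a minimal Python sketch of the kind of lookup the "Projection Constants" sheet holds. Only the total 20-to-80 catcher spread (53 runs over 1,350 innings, about 0.039 runs per inning) comes from this post; the intermediate values and the exact shape of the curve are invented to illustrate the flattening near the extremes.

```
# Hypothetical cumulative runs-per-inning by catcher defensive rating,
# relative to a 20 rating. Only the total 20->80 spread (~0.039 runs/inning,
# i.e. ~53 runs over 1350 innings) comes from the post; the intermediate
# points are invented to show the curve flattening near the extremes.
CATCHER_RUNS_PER_INNING = {
    20: 0.000, 30: 0.002, 40: 0.008, 50: 0.018,
    60: 0.030, 70: 0.037, 80: 0.039,
}

def runs_saved(table, low, high, innings=1350):
    """Run difference between two ratings over a season's worth of innings."""
    return (table[high] - table[low]) * innings

print(runs_saved(CATCHER_RUNS_PER_INNING, 20, 80))  # ~53 runs: total spread
print(runs_saved(CATCHER_RUNS_PER_INNING, 70, 80))  # ~3 runs: flat near the top
print(runs_saved(CATCHER_RUNS_PER_INNING, 40, 60))  # ~30 runs: steep in the middle
```

The point is that the values are read off a table, not computed from a straight line, so a 10-point jump in the middle of the scale is worth far more than the same jump near 20 or 80.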

The other piece, which is confusing and very difficult to untangle, is the positional adjustments and replacement-level calculations; you really need to look at https://library.fangraphs.com/misc/war/ or a similar source to understand them in detail. I used the Baseball Reference values for positional adjustments (slightly different from the ones you get on Fangraphs), and the upshot is that an average-fielding center fielder is more valuable than an average-fielding first baseman. That's intuitively obvious, but you need a standard for it. So a league-average hitter who plays an average-fielding first base will end up a less-than-average player (around 1.0 WAR over a full season), while a league-average hitter who plays an average-fielding shortstop will end up a better-than-average player (around 2.7 WAR).
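For a rough sense of how those pieces combine, here's a hedged sketch of the standard WAR bookkeeping. The ~10 runs per win and ~20 runs between average and replacement level are standard ballpark figures; the -10/+7 positional values are back-solved from the 1.0 and 2.7 WAR numbers above, not quoted from Baseball Reference.

```
# Rough full-season WAR bookkeeping, assuming ~10 runs per win and a
# replacement level ~20 runs below average. The positional adjustments
# (-10 for 1B, +7 for SS) are back-solved from the 1.0 and 2.7 WAR figures
# in this post, not quoted from Baseball Reference.
RUNS_PER_WIN = 10.0
REPLACEMENT_RUNS = 20.0                    # average-vs-replacement gap, runs/season
POSITIONAL_ADJ = {"1B": -10.0, "SS": 7.0}  # runs per full season

def war(batting_runs, fielding_runs, position):
    """Runs above average + positional adjustment + replacement gap, in wins."""
    runs = batting_runs + fielding_runs + POSITIONAL_ADJ[position] + REPLACEMENT_RUNS
    return runs / RUNS_PER_WIN

print(war(0, 0, "1B"))  # ~1.0 WAR: league-average hitter, average-fielding 1B
print(war(0, 0, "SS"))  # ~2.7 WAR: league-average hitter, average-fielding SS
```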

Based on these numbers, I'd say the difference between a 50 and a 60 defensive first baseman is about (0 + .003) * 1350 = 4 runs per year (0.4 WAR), and the difference between a 45 and a 65 defensive center fielder is about (.014 + .008) * 1350 = 30 runs per year (3.0 WAR).
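As a quick sanity check on that arithmetic (assuming the parenthesized values are the per-inning run steps read off the attached table):

```
# Sanity-checking the arithmetic above: per-inning run steps times innings.
innings = 1350
print((0.000 + 0.003) * innings)  # ~4 runs/year  -> ~0.4 WAR (1B, 50 vs 60)
print((0.014 + 0.008) * innings)  # ~30 runs/year -> ~3.0 WAR (CF, 45 vs 65)
```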

Hope that helps and doesn't further confuse things...
Attached Images: [screenshot of the "Projection Constants" sheet]

Last edited by jaa36; 04-17-2025 at 11:27 PM.