Originally Posted by 1978
What am I missing about the example below?
Field A - Average Rating 1000 average score = 50
Field B - Average rating 950 average score = 52
Field A score 50 = 1000 rated
Field B score 52 = 950 rated; the 50 would only be like 980 rated.
The missing component there is the relationship between a player's (initial) rating and rating-points-per-throw. The PDGA uses two linear equations to match SSA up with an (arbitrary) associated rating-points-per-throw value. For a 950-rated player shooting an 'average' 52 and a 1000-rated player shooting an 'average' 50 to both hold, the rating-points-per-throw would have to be 25 (a 50 rating-point gap over a 2-throw gap).. much higher than either the PDGA linear compression formula or the observed slope of initial rating vs. scoring spread would give. Realistically, the value is typically in the 5-15 points-per-throw range, whether you use the PDGA compression formulas or the slope of initial rating vs. scoring spread. Does that help at all?
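To make the arithmetic concrete, here is a small sketch of the implied points-per-throw calculation from the example above. The function name and signature are mine for illustration only, not PDGA terminology or any official formula:

```python
def implied_points_per_throw(rating_a, avg_score_a, rating_b, avg_score_b):
    """Rating points implied per throw, given two players' ratings and
    their 'average' scores on the same course: the rating gap divided
    by the scoring gap."""
    return (rating_a - rating_b) / (avg_score_b - avg_score_a)

# 1000-rated player averages 50; 950-rated player averages 52:
ppt = implied_points_per_throw(1000, 50, 950, 52)
print(ppt)  # 25.0 -- well above the realistic 5-15 points-per-throw range
```

That 25 points-per-throw result is exactly why the example in the question doesn't work out: both "average" scores can't be consistent with a realistic ratings slope at the same time.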