jeverett | 04-30-2013, 06:00 PM
Quote: Originally Posted by 1978
What am I missing about the example below?
Field A: average rating 1000, average score = 50
Field B: average rating 950, average score = 52

Field A: a score of 50 = 1000 rated
Field B: a score of 52 = 950 rated, but then a 50 on Field B would only be like 980 rated.
Hi 1978,

The missing component there is the relationship between a player's (initial) rating and rating-points-per-throw. The PDGA uses two linear equations to match an SSA up with an (arbitrary) associated rating-points-per-throw. For a 950-rated player shooting an 'average' 52 and a 1000-rated player shooting an 'average' 50 to both hold, the rating-points-per-throw would have to be 25 (a 50-point rating gap spread over just 2 throws).. much higher than either the PDGA linear compression formula or the observed slope of initial rating vs. scoring spread would give. Realistically, the value is typically in the 5-15 points-per-throw range, whether you use the PDGA compression formulas or the slope of initial rating vs. scoring spread. Does that help at all?
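
To see where the 25 comes from, here's a quick back-of-the-envelope sketch in Python. This is just the arithmetic above, not the actual PDGA formula; the function names are mine, and the 10 and 15 points-per-throw used at the end are simply values from the typical 5-15 range.

Code:
# Points-per-throw implied by treating both field averages as exact:
# a 50-point rating gap spread across only 2 throws of scoring difference.
def implied_points_per_throw(rating_a, score_a, rating_b, score_b):
    return (rating_a - rating_b) / (score_b - score_a)

# Simple linear rating model: start from the rating of an 'average' (SSA)
# round and credit/charge points_per_throw per stroke under/over it.
def rating_for_score(score, ssa, ssa_rating, points_per_throw):
    return ssa_rating + (ssa - score) * points_per_throw

print(implied_points_per_throw(1000, 50, 950, 52))  # 25.0 -- too high
print(rating_for_score(50, 52, 950, 10))            # 970.0
print(rating_for_score(50, 52, 950, 15))            # 980.0

With a realistic 10-15 points-per-throw, a 50 on Field B lands around 970-980, which matches the ~980 in 1978's example.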