
How Normal are Diamond TRs?

Shadrach3 · Par Member · Gold level trusted reviewer · Joined May 13, 2020 · Messages: 181
I was bored today :] So of course I nerded out on disc golf review statistics and created charts to help visualize how closely Diamond TRs' ratings match a normal curve. The results are attached in 3 files.

Hypothetically, someone who has reviewed a large number of randomly selected courses and given them "accurate" ratings should line up pretty closely with the normal curve. Someone who has not played a random sample of courses (for example, choosing to only play very highly rated courses), or has reviewed them "inaccurately", will not line up nearly as well.

Of course, this is predicated on the assumption that, like many things in life, the quality of disc golf courses as a whole generally falls onto a normal curve. There should be many courses that are generally of Typical quality (2.5), while very few that are Abysmal (0.0) or Best of the Best (5.0).

This is also a very imprecise system that any statistician would censure me for using. (Every statistics student knows that you shouldn't apply normal distributions to categorical data :/). There's lots of room for my own choices to sneak in. Chief among these is that I had to pick one point of the actual review curve at which to scale the normal curve to match - I stuck with 2.5 (the middle of the ratings spectrum) as much as possible, but sometimes it was visually clearer to scale it at 2.0 or 3.0.
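For anyone curious how that scaling step works, here's a minimal sketch in Python of the idea described above: overlay a normal curve on a reviewer's rating counts, scaled to match the actual count at one anchor rating. All numbers here (the counts, the mean, the spread) are made up for illustration, not taken from real DGCR data.

```python
import math

# Hypothetical rating counts for one reviewer, keyed by DGCR's
# half-step ratings (illustrative numbers, not real data).
counts = {1.5: 4, 2.0: 9, 2.5: 14, 3.0: 11, 3.5: 6, 4.0: 3}

mu, sigma = 2.5, 0.75  # assumed center and spread of course quality

def normal_pdf(x: float, mu: float, sigma: float) -> float:
    """Probability density of a normal distribution at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Scale the normal curve so it matches the actual count at one anchor
# rating (2.5 here, as in the post; sometimes 2.0 or 3.0 reads better).
anchor = 2.5
scale = counts[anchor] / normal_pdf(anchor, mu, sigma)

expected = {r: scale * normal_pdf(r, mu, sigma) for r in counts}
for r in sorted(counts):
    print(f"{r:.1f}: actual {counts[r]:2d}, scaled normal {expected[r]:5.1f}")
```

Plotting `counts` against `expected` (e.g. with matplotlib) gives the kind of actual-vs-normal comparison shown in the attached grids.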

Enjoy! I'd love to discuss or answer questions here or in my PMs.
 

Attachments

  • Grid 1.jpg (134.2 KB)
  • Grid 2.jpg (132 KB)
  • Grid 3.jpg (134.2 KB)
Not very.

I suspect that many are like me and really are on the Obsessive Compulsive spectrum. Or to put it more positively, my Motivated Abilities Pattern is "Master and Perfect".
 


Very nice. My graph has a spike at 2, which represents the large number of "Chicagoland Crappy Niner Baggin' Tours" Notapro, Threeputt, and I have done over the years. Crappy niners tend to cluster up hard at 2.0, thus creating my "higher than normal" 2.0 spike. Neat to see it graphed. :thmbup:
 
Of course this is all just for fun, but your assumptions do have some flaws.

It's fun to see people's curves, though. I think mine has the weirdest shape of any of them; cefire's may be the 2nd most unusual. Here are the main reasons for the shape of my curve--
1) I play every course in the state, regardless of rating. Thus in Feb 2021 I completed playing every course in VA, and am now back to working on playing every NC course. In each state I don't think the majority of courses will be 2.5. I think that may vary by state, though. IL will be really skewed by the bazillions of Chicagoland niners.
2) BUT I don't review every course I have played. I have only reviewed ~120 of the 620ish courses I have played, so my reviews are limited and selective.
3) My reviewing goals have changed over time-- In my early days on DGCR I reviewed as many courses as possible so that I could get to Diamond TR as quickly as possible, so I reviewed everything I played until I got the diamond; then my output decreased.
4) In later years my life priorities have shifted, so now I focus more on other things for the courses I make the time to review. (I am not normal, so after I play a course and post a DGCR review, my whole process takes about 3 hours per review.) I now review courses that have few or no reviews; I review courses at the high end as a way to show appreciation; I review courses on the low end to warn others about what they are getting into; I don't do much with the average courses. Thus 1.5 has the most reviews, then 4.5, with 2.5 having the least.

It's quite interesting to see how some people like TVK and discgolfcraig have curves that are much closer to the bell shape of the brown line.
 
Very nice. My graph has a spike at 2, which represents the large number of "Chicagoland Crappy Niner Baggin' Tours" Notapro, Threeputt, and I have done over the years.

Ah, that makes sense. That's also why I'm not claiming anything about these graphs, because the reality is that nobody plays a truly random sample of courses, and everyone has a different story.

A good counter-example to having a cluster on the low end is having a lot on the high end. I think Upshawt1979 is a great example, because he specifically says on his profile page that he targets courses rated 3.5 and higher, and you can see that that's the range where a huge majority of his ratings are.
 
Of course this is all just for fun, but your assumptions do have some flaws. [...] Here are the main reasons for the shape of my curve--

Okay, I like that explanation! I think another explanation for curves with a dip in the middle would be reviewers who use a different scale for 9ers vs. 18ers. If someone uses 0.0-3.0 for nine-hole courses, and 2.5-5.0 for 18-hole courses, their overall graph would hypothetically have humps around 1.5 and 3.5 with a slight dip in the middle, as a combination of two normal-ish curves.
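That two-scale idea is easy to sanity-check with a quick simulation. This is purely illustrative, with made-up means and spreads for the two course types:

```python
import random

random.seed(1)  # reproducible illustration

def to_half_step(x: float, lo: float, hi: float) -> float:
    """Round to the nearest 0.5 and clamp into a rating range."""
    return min(hi, max(lo, round(x * 2) / 2))

# Hypothetical reviewer: 9-holers rated on a 0.0-3.0 scale,
# 18-holers on a 2.5-5.0 scale.
niners = [to_half_step(random.gauss(1.5, 0.5), 0.0, 3.0) for _ in range(300)]
eighteens = [to_half_step(random.gauss(3.5, 0.5), 2.5, 5.0) for _ in range(300)]

ratings = niners + eighteens
hist = {k / 2: ratings.count(k / 2) for k in range(11)}  # 0.0 .. 5.0

for r in sorted(hist):
    print(f"{r:.1f}: {'#' * (hist[r] // 5)}")
```

The combined histogram shows humps near 1.5 and 3.5 with a dip around 2.5 - the shape described above, produced by mixing two roughly normal curves.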
 
It's quite interesting to see how some people like TVK and discgolfcraig have curves that are much closer to the bell shape of the brown line.

I think it makes the most sense. Folks like TVK, Wellsbranch, and DiscGolfCraig have written more reviews than most of the TRs, so they're less prone to selecting only certain courses. Plus, they're the people that review every single course they play, and they tend to blanket entire areas of the country. The similarity of their graphs just helps confirm that they are using the full ratings continuum and aren't generally askew in their perceptions of how good (or bad) courses are.
 
Interesting to look at. Thanks for doing/sharing, Shad!
I'll say mine are skewed a bit for a few reasons.

1) I've done a poor job reviewing all the courses I've played. :eek:. I suspect if I rated all the courses I've played, my curve would look different.

2) I'm more likely to review better courses than meh, more-average ones, because better courses just make me want to talk about them.

3) As I've said many times before, I make an active effort to hit better courses. When it comes to anything farther than an hour or so away, I specifically intend to play a lot of courses I'd hope are 4.0 or higher, and to avoid courses that are below 3.0.
 
Shadrach,

Thanks for using your skills and inclinations to do this! I love analyzing data, so this is fun. Some observations...

TVK may be your best case since he has played the most courses, and he has reviewed almost 100% of the courses he has played.

DiscGolfCraig has completed playing every course in South Carolina and he is close to 100% on his reviews too. It's interesting that his curve is bell shaped.

Given the data you started from, I interpret the TRs whose peak is greater than 2.5 to mean that they prefer to focus on playing and reviewing higher quality courses.

A related corollary is that reviewing below-average courses in out-of-the-way places earns you very few Helpful votes, so this site has an inherent reward for reviewing popular great courses and a disincentive to spend time reviewing remote below-average courses.

And one more corollary to that last statement-- it gives me extra appreciation for those who have traveled all over their state to play every course, no matter how crummy, and then write up reviews, even though they won't get many Helpful votes. The folks who have played CA, IL, MN, ND, SD, TX, WI come to mind.
 
I think it makes the most sense. Folks like TVK, Wellsbranch, and DiscGolfCraig have written more reviews than most of the TRs, so they're less prone to selecting only certain courses.

It's funny you presented this today. I was just observing this morning how my course rating average has slowly increased from really low (2.37) to just below average (2.45). There are two main reasons for this. The first is that I've readjusted some courses' ratings over time, either due to follow-up rounds or simple preference changes. The other reason is that I've stopped (or slowed down) reviewing the mediocre 9-holers. All those 1.0 and 1.5 ratings were weighing down my average rating, which I felt was giving an overly critical perception. I reviewed another course today (SOAC), which makes my last four reviews' ratings 4.0, 4.0, 3.5, and 4.0.
 
Very nice. My graph has a spike at 2, which represents the large number of "Chicagoland Crappy Niner Baggin' Tours" Notapro, Threeputt, and I have done over the years.
Damn. The spike at 2 is a thing. :| Just another thing I get to blame on Chicago...
 
I think another explanation for curves with a dip in the middle would be reviewers who use a different scale for 9ers vs. 18ers.
I think that most of us have a different scale for 9 hole courses; my highest rating for a 9 holer is 3.7. Your explanation makes some sense, yet that also assumes that people play and review the same number of 9 holers as 18 holers. I think that there are more 18 hole courses, people prefer to play 18 hole courses most of the time, and people get more votes for reviews of 18 hole courses.

That then leads me to think there are 2 separate categories-- the courses played and the courses reviewed. The two sometimes stem from different motivations. Obviously, these graphs can only show what has been reviewed.
 
I don't have much to add to this, let me just say I really enjoyed looking at the graphs and reading through the thread! :) Thanks for posting Shadrach!
 
I was just observing this morning how my course rating average has slowly increased from really low (2.37) to just below average (2.45)
Congratulations on the 0.08 increase! haha

But in all fairness, with the hundreds of reviews that you have written it takes a great deal just to change your average by even 0.01
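A quick back-of-the-envelope check of that claim. The 400-review count and 2.45 average below are made-up stand-ins, not anyone's actual numbers:

```python
# Hypothetical long-running reviewer: 400 prior reviews averaging 2.45.
n, avg = 400, 2.45

# One new 4.0 review barely moves the overall average.
new_rating = 4.0
new_avg = (n * avg + new_rating) / (n + 1)
print(f"{avg:.4f} -> {new_avg:.4f}")
```

Even a rating 1.55 above the current average shifts it by well under 0.01 - hence the slow crawl from 2.37 to 2.45.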
 
Congratulations on the 0.08 increase! haha

It just means I'm inching closer to being 'average'. Think I've been aiming for that goal my whole life.
 
I spent 8+ years traveling across the US in an RV, and during those travels definitely sought out the higher rated courses to play. When you are only spending a day or two in an area, it's pretty tough to pass up the top courses!

So it comes as no surprise to me that my average is skewed toward the higher end. When we settled in one place or another for several months I generally played every course in the area and picked up a wider distribution of courses, but I have definitely played and reviewed an atypical group of courses.

I can look at my "courses played" map and see exactly where we travelled, and especially can see the clusters representing where we spent our summers.
 
