
I don't understand Udisc course ratings

I honestly don't understand why Udisc ratings are so inflated, and why there is such a predominance of 5.0 ratings.

OK, it's obvious that homeboy bias is rampant on Udisc. Also, since any review can be deleted, there are many cases where the homies just "report" reviews under 5.0 to get them deleted. In addition, there are no guidelines at all about what a 5.0 rating means, and there is a limit to the number of characters that can be typed into a Udisc review. Much of this is driven by newer players reviewing on their phones, so they don't type much. Another factor is the Amazon/Yelp review culture that has become so prevalent. And since people record their scores on Udisc and are asked to rate the course at the end of the round, they just do it quickly and nonchalantly while walking back to their car.

(...Well, maybe I partly answered my own question.)

Yet still, are the "feel good" ratings unique to Udisc, or are they part of a larger cultural zeitgeist? On Udisc I see 5.0 ratings given to crummy little 3-hole practice areas at elementary schools, and I see some people who give a 5.0 to every single course they play. Is this part of the "everybody gets a trophy" and "let's not hurt anybody's precious little feelings" mentality? A Udisc 5.0 seems to mean "I was out with my friends throwing discs here." On Udisc, a 5.0 seems to be the equivalent of a like on Facebook or Instagram.

I am baffled. What do you think?
This is the PDGA's fault for not assessing the courses and giving a standard rating. Without that, the rating is all anecdotal. I can't tell you how many times I went to play at an out of town course and found an unplayable course with a 4.0-5.0 Udisc rating. The PDGA needs to get out there and standardize course ratings.
 
First....when traveling and playing out of town, use DGCR to determine what courses you want to play.

Second....having the PDGA catalog, rate, and review all the world's courses is not feasible, let alone practical.

Third....reviews > ratings.
 
Udisc just sent out an email to course ambassadors about changes to their course ratings. You can now rate a course in six categories, which is much better than a single rating for the entire course.


We've heard your feedback and have some exciting news to share.
Detailed course ratings are here! What does that mean?
  • Users can now review six categories about features on your course (in addition to the overall rating you're used to):
    • Upkeep
    • Shot variety/design
    • Tee areas
    • Signage/wayfinding
    • Amenities
    • Scenery/views
  • Players can give ratings a thumbs up to echo shared sentiments or mark them as helpful.
  • Each reviewer's count of total courses played is visible to you and all other players.
Not a bad idea, they must read a lot of Wellsbranch's reviews.

But I have a question ... since there are separate ratings for Tee areas and Signage, does that mean baskets just fall under Upkeep? I would think the quality of the baskets is the most important thing on a course; if you don't have those, you're just throwing frisbees at trees.
 
What percentage of courses have "quality" baskets? I would guess it's over 95%.
I would define a "quality basket" as any recognized brand of basket that is less than 20(?) years old and undamaged.
I've played 100+ courses, and I can remember only 3 or 4 having terrible baskets.

UDisc probably figures that if they had a separate basket rating, people might give a course's baskets a low rating simply because they don't like the brand, even though it's a high-quality basket. For example: "I hate Discatchers!" = a 0 rating. "Chainstars suck!!!" = a 0 rating. "Gateway baskets are garbage!!" = a 0 rating.
There are probably people out there that would give the entire course a "zero" rating because of certain baskets.
 

Wait - does anybody on uDisc give anything other than a 5.0 (DGCR reviewers excluded, of course)???
 
This is the PDGA's fault for not assessing the courses and giving a standard rating. Without that, the rating is all anecdotal. I can't tell you how many times I went to play at an out of town course and found an unplayable course with a 4.0-5.0 Udisc rating. The PDGA needs to get out there and standardize course ratings.

The PDGA tried that years ago, or at least a user-managed version of it. It had an incredibly long checklist of criteria for users to complete, assigned a point value to each, and gave the course a rating.

If you looked at it, you would at least have information on every feature of a course.

The attempt was to be objective, but it merely pushed the subjectivity down the ladder a rung or two. How much value should be placed on, say, bathrooms vs. scenery? No flexibility there. And how does a user rate the quality of teepads? That's subjective. How much extra credit should a really unique course get? There was no way to account for it, yet in the real world that's often the most attractive feature of the highest-regarded courses.

The end result was that nobody used it.

It would be impossible for the PDGA to "officially" rate courses -- too many courses, too many new ones. But even if they could, it would result in the opinions of a few people, by a standard set by a few people.

DGCR's system is pretty good, and probably as good as we can expect. It has the foundation of having started with a reasonable range of ratings (i.e., not a flood of 5.0s), and an average of opinions is a pretty good measure. You can filter for the ratings by people who have played a lot of courses, which is even better, though in reality it doesn't change the result very much.
 