DGCR v UDisc Rating

FWIW, I wasn't trying to imply there was a problem.

But from his earlier post, it seemed to me that Rich was, and he seems to have had a few gripes with DGCR over the years.

So I was simply asking him what he wants to see happen.
 
The top 10 does matter. If your course gets there, I guarantee you would see an uptick in visitors. A course in this area touts 'come play one of the top 10 courses in the world' because it made the list. I would like to see the top 10 change more here, especially for courses on the list that have changed or are seasonal (only excellent/playable certain parts of the year). You know where the top 10 does change... UDisc.
Of course it matters. I asked the owner of Harmon Hills if he noticed a change in traffic after making the top ten. He said it made a huge impact...not just an uptick.
 
The top 10 does matter. If your course gets there, I guarantee you would see an uptick in visitors. A course in this area touts 'come play one of the top 10 courses in the world' because it made the list. I would like to see the top 10 change more here, especially for courses on the list that have changed or are seasonal (only excellent/playable certain parts of the year). You know where the top 10 does change... UDisc.

Conceded, it makes that difference to a PTP course owner wanting higher attendance. That's a half-dozen or so people.

I'm not sure why it makes so much difference to users, though. And if people are flocking to Top 10 courses, they're going to go somewhere; change the formula and it's just different courses they'll bucket list.
 
If anyone would like to assist with adding more data, here is a link to the Google Sheet where I did the data analysis. Just scroll to the end and you can input course name, DGCR rating, and UDisc rating.

https://docs.google.com/spreadsheets/d/1v0e7ie9dc38zMDO_WP_ww-wT9PhzhIGNTkmW9DSHTFI/edit?usp=sharing

Just added around 30 courses, mostly in the Cincinnati area. All the debating aside, I really do like the basic concept. I think finding the standard deviation of the rating differences could add a bit of interesting info as well. I may mess around with that.
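
For anyone curious, here's roughly what that calculation looks like in Python. This is just a sketch assuming the sheet gets exported as a CSV named ratings.csv with columns course, dgcr, and udisc (those names are mine, not from the actual sheet):

import csv
import statistics

# hypothetical export of the shared Google Sheet: course,dgcr,udisc
with open("ratings.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# per-course gap, positive when UDisc rates the course higher
gaps = [float(r["udisc"]) - float(r["dgcr"]) for r in rows]

print(f"courses compared: {len(gaps)}")
print(f"mean gap (UDisc - DGCR): {statistics.mean(gaps):.2f}")
print(f"std dev of the gap: {statistics.stdev(gaps):.2f}")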
 
Before you guys spend too much time, you should know that the UDisc rating system, when tabulated at the end of the year for the various best-courses lists, depends on the number of ratings as well as the actual rating.
 
Before you guys spend too much time, you should know that the UDisc rating system, when tabulated at the end of the year for the various best-courses lists, depends on the number of ratings as well as the actual rating.

DGCR does as well. Regardless, that shouldn't matter for a comparison of ratings, right? I'm not wondering about their top-10 list (or however many) vs. the front page of DGCR or anything.
 
DGCR does as well. Regardless, that shouldn't matter for a comparison of ratings, right? I'm not wondering about their top-10 list (or however many) vs. the front page of DGCR or anything.
Personally, I would be most interested in how these comparisons stack up at the top... UDisc's raw rating data is not the same as how it ends up. DGCR's rating, once you get 21 reviews, is not influenced by the number of reviews. With UDisc, it very much is affected by the number of ratings/reviews, at least until you reach 300. What you are seeing is just the first tier of how they rate courses.
 
Don't know if it's been mentioned or not, but when I logged on to UDisc yesterday there was a survey available asking about the topic of rating courses. They are very aware that ratings are bloated and inaccurate. My solution was to suggest having 'pro reviewers' of courses. They would need to have written a minimum (arbitrary) number of reviews first. The second requirement would be that users can give a thumbs up or thumbs down on reviews, so that a 'pro reviewer' would need to maintain a high percentage of well-received reviews. Thus, the top reviews shown on UDisc for each course or each hole layout would come from the highest-rated pro reviewers on the platform.

Hopefully, if they implement something like this, courses like the Sheetz Gold Course would have plenty of reviews with thumbs down indicating the ratings are inaccurate. I've personally started writing the best reviews I can within the limited character count. I try to hit on facilities, signs, tee pads, baskets, and layout, with some quick pros and cons.
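
To make the proposal concrete, here's a rough sketch of the gate being suggested. The thresholds are placeholders (the post itself calls the minimum arbitrary):

from dataclasses import dataclass

@dataclass
class Reviewer:
    reviews_written: int
    thumbs_up: int
    thumbs_down: int

def is_pro(r, min_reviews=50, min_approval=0.80):
    # must clear a minimum review count AND keep a high approval ratio
    votes = r.thumbs_up + r.thumbs_down
    if r.reviews_written < min_reviews or votes == 0:
        return False
    return r.thumbs_up / votes >= min_approval

print(is_pro(Reviewer(120, 400, 50)))  # True: ~0.89 approval over 120 reviews
print(is_pro(Reviewer(10, 90, 5)))     # False: too few reviews written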
 
DGCR is a forum to me; I honestly couldn't care less about the ratings portion of this website. I would assume I am not in the minority.
UDisc asks me to rate a course after I play; however, I usually don't.

If I want to see if a course is any good, I just go play it.
Pretty sure that last line is true for damned near most people on this site, for courses that are relatively close to them.

Personally, I don't want to drive 5-8 hours and spend nights on the road just to play courses that only rate as decent to very good on my personal scale. When I put time and money into a DG trip, I'm specifically planning to play courses I think are at least a 4.0, with maybe a few 3.5's to round things out and fill the trip, and hopefully some 4.5's.

I think that's true for a great many traveling casual players. Quality reviews, ratings, and pics are very useful when deciding where we should spend our time.
 
I think that's true for a great many traveling casual players. Quality reviews, ratings, and pics are very useful when deciding where we should spend our time.

Case in point, we are taking a weekend trip in a few days to play Northwood Black, Eagle's Crossing, and Harmony Bends. Now EC has only one review, but it was thorough, and I have seen a round played there on video, so I have a great feel for what it is. I expect my arm to be numb by MO morning...
 
When I put time and money into a DG trip, I'm specifically planning to play courses I think are at least a 4.0, with maybe a few 3.5's to round things out and fill the trip, and hopefully, some 4.5's.

Those highly-rated courses are certainly the highlights of the trip. But lower-rated courses can be a ton of fun, too. Old-school woodsy par-3 courses are often rated around 3.0, and I have a great deal of affection for that style of disc golf.

Playing a sprinkling of lousy-to-mediocre courses helps me appreciate really well-designed courses even more. But you have played more courses than me, so perhaps you have already developed that appreciation. :D

I think that's true for a great many traveling casual players. Quality reviews, ratings, and pics are very useful when deciding where we should spend our time.

Agreed!
 
Those highly-rated courses are certainly the highlights of the trip. But lower-rated courses can be a ton of fun, too. Old-school woodsy par-3 courses are often rated around 3.0, and I have a great deal of affection for that style of disc golf.

Playing a sprinkling of lousy-to-mediocre courses helps me appreciate really well-designed courses even more. But you have played more courses than me, so perhaps you have already developed that appreciation. :D

That's the beautiful thing. Everyone can figure out what works for them. If hitting a few less than stellar courses helps someone enjoy a trip more, great!

Good info is still required to give you some idea which courses are likely to be:
More fun/less fun
Easier/@ss-kickers
So hilly you might only hit 1-2 courses that day
Navigational nightmares

...or whatever else is meaningful to you when mapping out a trip.

If someone's happy just ambling up to any ol' course and bagging everything they can, that's cool, too. I certainly can't speak for TVK, but having spent some time with him, that seems to be how he rolls. He wants to hit everything he can when he visits an area, regardless of the ratings. What anyone else happens to think about a course probs won't affect his decision to play it.

But I'm thankful he reviews damned near all of them so that cherry pickers like me have good info to plan with, rather than hoping to randomly stumble upon a few special courses.
 
DGCR vs UDisc rating vs Google Maps rating...
I checked 10 courses in Upper Michigan (that's the Upper Peninsula, not the Traverse City/Petoskey area).
UDisc ratings were higher every time, varying from .2 higher (Powder Mill & the Tailings) to 1.1 higher (North Bluff).

Furthermore, when you look at Google Maps, the course ratings go up another few tenths across the board. (I have no idea why, but Powder Mill had no Google Maps ratings.)

I'm sure it's because almost all DGCR ratings have a lengthy explanation of the rating. On UDisc or Google Maps it's easy to give a course a "4" or "5" if you don't have to say why it's worthy of that rating.
 
I'm sure it's because almost all DGCR ratings have a lengthy explanation of the rating. On UDisc or Google Maps it's easy to give a course a "4" or "5" if you don't have to say why it's worthy of that rating.

Again, nothing really prevents people from writing short reviews on DGCR. They just usually get panned for lack of content, and not really saying much about how the course actually plays.

Nothing wrong with a short but decent review that explains the variety and disc play.

...and doesn't rate some pitch'n'putt niner a 4.0 :\
 
When I put time and money into a DG trip, I'm specifically planning to play courses I think are at least a 4.0, with maybe a few 3.5's to round things out and fill the trip, and hopefully, some 4.5's.

I think that's true for a great many traveling casual players. Quality reviews, ratings, and pics are very useful when deciding where we should spend our time.

I wholeheartedly agree!

This was my main (and almost only) use for this site for a dozen years or so. I only started paying somewhat regular attention to the forums last year.
 
The concept people are sorta sniffing around in this thread is "barrier to entry." It's a term from economics that sums up the costs of being a new producer of some good or service. It tends to get reused in UI/UX and engagement design. The lower the barrier, the more people will do whatever it is you are asking.

Entering the UDisc rating market for a course has an extremely low cost: one tap in an app you are already using at that moment, maybe a single typed sentence fragment. Entering the DGCR rating market for a course is, in comparison, extremely high-cost. Go to the website (later). Find the course again. Create a login. Read a lot of text. Enter something in all the required fields, etc. The interface for reviews even subtly encourages you to actually read other reviews.

But that barrier to entry exists because the good being produced is higher quality. The reviews on DGCR are intended to be higher-cost, because "we" (the user base) want more thorough reviews. No matter what you do to make reviewing easier, you are still going to fall way behind UDisc on total reviews for that reason, and that's by design.

It's also important to note that (I believe) timg is not doing this as a profit-maximizing venture. A UDisc level of engagement here might be more of a PITA than it's worth for him.

Also: overall forum use and review generation are completely different things, and you can't compare the two in any meaningful way, especially not based on current logged-in post/review activity. I write posts on a daily basis. I write reviews on a yearly basis. The relevant metric would probably be something like "total number of unique IPs that see a forum post in a time period" vs. "total number of unique IPs that have viewed reviews in a much longer time period".
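
Only timg would have the real server logs, but the comparison I mean would look something like this sketch, with a made-up access log of (ip, timestamp, section) tuples:

from datetime import datetime

# hypothetical access log: (ip, timestamp, section)
log = [
    ("10.0.0.1", datetime(2024, 6, 1), "forum"),
    ("10.0.0.1", datetime(2024, 6, 2), "forum"),
    ("10.0.0.2", datetime(2024, 1, 15), "reviews"),
]

def unique_ips(log, section, since):
    # count distinct IPs that hit a section on or after the cutoff
    return len({ip for ip, ts, sec in log if sec == section and ts >= since})

# short window for forum activity, much longer window for reviews
print(unique_ips(log, "forum", datetime(2024, 5, 25)))   # 1
print(unique_ips(log, "reviews", datetime(2024, 1, 1)))  # 1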
 
Again, nothing really prevents people from writing short reviews on DGCR. They just usually get panned for lack of content, and not really saying much about how the course actually plays.

Nothing wrong with a short but decent review that explains the variety and disc play.

...and doesn't rate some pitch'n'putt niner a 4.0 :\

At least DGCR labels a rating of 5 "Best of the Best" and a 4.5 "Phenomenal", which I think makes people with less experience go at least a little lower.
 
I'm sure it's because almost all DGCR ratings have a lengthy explanation of the rating. On UDisc or Google Maps it's easy to give a course a "4" or "5" if you don't have to say why it's worthy of that rating.

Based on the survey available from UDisc, they are possibly trying to get away from this. One part of the survey presented an interesting idea: instead of rating stars directly, the user would answer some quick questions (I am assuming things like tee pads, baskets, course condition, etc.), and an algorithm would generate the star rating from the answers. Could be interesting if developed right.
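
The survey didn't spell out the algorithm, but the simplest version would just be a weighted average of the sub-scores, something like this (the question names and equal weights are my guesses, not anything UDisc has confirmed):

def star_rating(answers, weights=None):
    # answers: 1-5 sub-scores per question; equal weights by default
    weights = weights or {k: 1.0 for k in answers}
    total = sum(weights[k] * score for k, score in answers.items())
    avg = total / sum(weights.values())
    return round(avg * 2) / 2  # snap to the nearest half star

print(star_rating({"teepads": 5, "baskets": 4, "condition": 3}))  # 4.0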
 