
DGCR v UDisc Rating

I rate a course as it stands when I played it. I'll mention things such as future improvements being in the works, if I've heard of such things, but the problem is that those aren't a certainty. I'd prefer a review that states how the course looked/played at the time of the review, not "when" they get signs, tees, etc. Sometimes those things don't come to fruition. Worth mentioning, I think, but I don't usually try to factor it into my actual rating. It's a slippery slope, though.
 
I rate a course as it stands when I played it. I'll mention things such as future improvements being in the works, if I've heard of such things, but the problem is that those aren't a certainty. I'd prefer a review that states how the course looked/played at the time of the review, not "when" they get signs, tees, etc. Sometimes those things don't come to fruition. Worth mentioning, I think, but I don't usually try to factor it into my actual rating. It's a slippery slope, though.

That's my take, too. I've added statements such as "will be a 4.0 when broken in" in the comments, while keeping the 3.0 rating. With a few other considerations:

* If the course is too unfinished, and I don't think the review will be valid in 6 months, I won't review it at all.

* Unless it's a 1-time play for me, I can always revise my review or rating in the future, when the signage and teepads and brush clearing and whatever have been done.

Eventually, there'll be enough newer reviews that my old rating won't put much of a dent in the course's overall rating.

Footnote: I've never asked to remove the oldest reviews/ratings from our private course, even though it's doubled in size and changed considerably since then. They tell an interesting story, at least to anyone who may read that far back, which is probably just a group of one.
 
If the review is more important than the rating, perhaps the DGCR policy should be to only allow reviewing the course when its status is officially shown as "under construction/revision" and disallow entering a numerical rating other than in the text of your review? That way, there wouldn't be any transition ratings in the rating average. Those who reviewed it during transition could go back in and update their review and enter a rating when the "under construction/revision" flag was removed after course revisions were completed.
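Just to sketch what that gating might look like (purely hypothetical names, nothing that exists on the site today): review text would always be accepted, but a numeric rating would be rejected while the under-construction flag is set.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Course:
    name: str
    under_revision: bool  # hypothetical "under construction/revision" flag

def submit_review(course: Course, text: str, rating: Optional[float] = None) -> dict:
    """Always accept review text; accept a numeric rating only when the
    course is not flagged as under construction/revision."""
    if course.under_revision and rating is not None:
        raise ValueError("Course is under revision: text-only review for now; "
                         "add a rating after the flag is cleared.")
    return {"course": course.name, "text": text, "rating": rating}
```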
 
Ah, but what's the right number of recent reviews to weight, and how much more should they be weighted, to give the right rating for a course?

(Caution: Whatever you answer, someone else will answer differently).

Would the formula be the same for a course with 13 reviews, and one with 104?

In my opinion, any formula -- weighting recent reviews, out of town reviews, trusted reviewers, whatever -- just replaces a straightforward average with a subjective and debatable formula. And that's assuming that it's an easy calculation to code into the website (which it may be, for all I know).

Weight = 1 divided by the square root of n.

Where n = 1 for the most recent review, 2 for the next most recent, etc.

For a course that has 13 reviews, the newest one would count for 17% of the rating.

For a course that has 104 reviews, the newest one would count for 5.3% of the rating.
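A quick sketch of that math, just illustrative Python (the printed shares match the percentages above):

```python
import math

def newest_review_share(review_count: int) -> float:
    """Fraction of the overall rating carried by the single newest review,
    when the n-th most recent review gets weight 1/sqrt(n)."""
    weights = [1 / math.sqrt(n) for n in range(1, review_count + 1)]
    return weights[0] / sum(weights)

def recency_weighted_rating(ratings_newest_first: list[float]) -> float:
    """Weighted average of the ratings, newest first, using weights 1/sqrt(n)."""
    weights = [1 / math.sqrt(n) for n in range(1, len(ratings_newest_first) + 1)]
    return sum(w * r for w, r in zip(weights, ratings_newest_first)) / sum(weights)

print(f"{newest_review_share(13):.1%}")   # -> 17.0%
print(f"{newest_review_share(104):.1%}")  # -> 5.3%
```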

There. [Dusting off hands.]

You're welcome.

Next question.
 
Clever. But what makes it right? That is, what makes it the holy grail of a rating that unfailingly reflects exactly what a particular course deserves, which seems to be what those who call for weighting the ratings dream of?
 
Weight = 1 divided by the square root of n.

Where n = 1 for the most recent review, 2 for the next most recent, etc.

For a course that has 13 reviews, the newest one would count for 17% of the rating.

For a course that has 104 reviews, the newest one would count for 5.3% of the rating.

There. [Dusting off hands.]

You're welcome.

Next question.

So a review that came 5 minutes ago is not worth as much as one that comes right now? No thanks, next solution.
 
If Visionquest has been discussed, sorry, I didn't read the entire thread.

That course was on my wish list but apparently has fallen far in the last couple years. Its current DGCR rating is 4.65. It is listed as extinct, even though it is NOT extinct. The only rating in the last 3 years was 2.5.

Udisc has it rated as a 3.9, but does actually show it as an active course, with reviews in the last couple of weeks. Even with its "got baskets, must be a 4.5" mentality, Udisc's overall rating does at least partially show the demise of the course.

So just one data point, but I am certainly in favor of weighted reviews, dropping the impact (but not the content) of older reviews, etc. Of course, with only 1 review in the last 3 years, I guess the course rating would not be affected that much anyway in this instance. Which means...

I think it still goes back (I know I am a broken record) to needing more actual current reviews/ratings, however that can be accomplished without sacrificing quality. Udisc has 1061 ratings of VQ (94 reviews in the last 3 years, not sure if ratings with no reviews are counted in that 1061 or not). DGCR has 65 reviews (1 in the last 3 years).

However much better DGCR reviews/content are than Udisc's (and they are), if Udisc is getting 100 times the reviews of DGCR, then that is not a good trend.
 
Just for the sake of continuing on topic, is anyone aware of a course where the DGCR and Udisc ratings are very close?

One of my local courses gets a 3.78 on here (which seems to be a bit overinflated in my opinion) and Udisc has it at a 4.3. Fairly close if you ask me but are there courses where the ratings between the two platforms actually line up? Better yet, are there courses out there that are rated higher here than on Udisc?
 
Just for the sake of continuing on topic, is anyone aware of a course where the DGCR and Udisc ratings are very close?

One of my local courses gets a 3.78 on here (which seems to be a bit overinflated in my opinion) and Udisc has it at a 4.3. Fairly close if you ask me but are there courses where the ratings between the two platforms actually line up? Better yet, are there courses out there that are rated higher here than on Udisc?

The Lake Marshall courses are as follows:
Lions- 4.9 Udisc, 4.93 here
Lair- 4.7 Udisc, 4.67 here
Lambs- 4.3 Udisc, 3.83 here

Hawk Hollow- 4.5 Udisc, 4.74 here

I would guess most of the top courses on this site are rated higher here than on Udisc even if marginally so.
 
The Lake Marshall courses are as follows:
Lions- 4.9 Udisc, 4.93 here
Lair- 4.7 Udisc, 4.67 here
Lambs- 4.3 Udisc, 3.83 here

Hawk Hollow- 4.5 Udisc, 4.74 here

I would guess most of the top courses on this site are rated higher here than on Udisc even if marginally so.


I guess it kind of makes sense that a really good course would have a similar rating on both since courses generally get high ratings on Udisc across the board. Thanks.
 
If the review is more important than the rating, perhaps the DGCR policy should be to only allow reviewing the course when its status is officially shown as "under construction/revision" and disallow entering a numerical rating other than in the text of your review? That way, there wouldn't be any transition ratings in the rating average. Those who reviewed it during transition could go back in and update their review and enter a rating when the "under construction/revision" flag was removed after course revisions were completed.

this is probably one of those solutions that sounds simple but might be a coding nightmare.
and i would guess that the majority of people would either never go back and enter a number (which doesn't bother me) or that they would just be waiting to add their presumably critical rating as soon as it opens without having played the course again.


However much better DGCR reviews/content are than Udisc's (and they are), if Udisc is getting 100 times the reviews of DGCR, then that is not a good trend.

why? a lot of people keep saying this but it is not self-evident at all

sure DGCR needs more data. it will take time, as has always been the case. but i don't want that data to look like the data that is on udisc because that data is near worthless.

being the #2 site in course reviews is not necessarily bad for DGCR. i think it's actually a boon. all those trash reviews and reviewers have a home that is not here.
 
here's a good thought experiment

take all the reviews for your home course from both DGCR and udisc, add them all together, and get the average score. do you think that is a reasonable rating for your home course? if all those reviews were here then how many pages of reviews would you have to scroll through to get to one that is longer than 2 sentences and actually describes the course? is that what you want DGCR to look like? sure you could limit it to TRs but then why are we even having this conversation if we're just going to throw out the majority of reviews when we want good information?
 
this is probably one of those solutions that sounds simple but might be a coding nightmare.
and i would guess that the majority of people would either never go back and enter a number (which doesn't bother me) or that they would just be waiting to add their presumably critical rating as soon as it opens without having played the course again.

The proposal Cgkdisc makes, if I understand it correctly, is pretty simple to code, but it would only work on Udisc because it relies on people making updates to status. When I look at course conditions here, even popular courses aren't updated for months and others for years. I imagine a WIP flag would do the same.
 
The proposal Cgkdisc makes, if I understand it correctly, is pretty simple to code, but it would only work on Udisc because it relies on people making updates to status. When I look at course conditions here, even popular courses aren't updated for months and others for years. I imagine a WIP flag would do the same.
The person involved with an existing course upgrade or redesign would be responsible for flipping the flag on DGCR to the "Review only" status. Then, when the updates/revisions are completed, they would flip the flag back to "Review & Rating" status.

To show how fast things happen on UDisc, I got word in the afternoon that the three new holes had gone in early that morning at a nearby course where I'm involved with the upgrade. Someone playing in late morning gave the course a 2.0 on UDisc for poor directional signage before I was able to update the routing map on UDisc to reflect the new routing. The only option provided by UDisc was to enter a completely new course listing to wipe out prior reviews/ratings.
 
The person involved with an existing course upgrade or redesign would be responsible for flipping the flag on DGCR to the "Review only" status. Then, when the updates/revisions are completed, they would flip the flag back to "Review & Rating" status.

To show how fast things happen on UDisc, I got word in the afternoon that the three new holes had gone in early that morning at a nearby course where I'm involved with the upgrade. Someone playing in late morning gave the course a 2.0 on UDisc for poor directional signage before I was able to update the routing map on UDisc to reflect the new routing. The only option provided by UDisc was to enter a completely new course listing to wipe out prior reviews/ratings.

haha i understand the references here

and unfortunately sounds about right
 
* If the course is too unfinished, and I don't think the review will be valid in 6 months, I won't review it at all.

I played an under-construction course nearly 100 miles from my house years ago. I knew it would be a long time before I could/would play the course again, but I felt I had information worth adding. So, after consulting with TimG, I did probably the longest course conditions update in history. A few years later, I finally got back to play the course again and do an actual review. And to follow up on the point others have made, some planned features never materialized. I'm glad I didn't write a review based on what wasn't there and never came to fruition.
 
So a review that came 5 minutes ago is not worth as much as one that comes right now? No thanks, next solution.
Yep.

To show how fast things happen on UDisc, I got word in the afternoon that the three new holes had gone in early that morning at a nearby course where I'm involved with the upgrade. Someone playing in late morning gave the course a 2.0 on UDisc for poor directional signage before I was able to update the routing map on UDisc to reflect the new routing. ...
 
we just had a new course open for snow season

19 so far posted scores on udisc

dgcr only has myself and another hardcore course bagger we call "langerr" wish-listing it, otherwise no action

little uneven
 
predictable... we know where the traffic is

it only has 1 review over there. when you do get a chance to play it, let us know if you agree with the 5.0 rating.
 