
A couple of thoughts on 5 star reviews

I think this is a great idea. Currently, you can check the "View Only Trusted Reviewers" box, but that doesn't change the rating. I think being able to view the TR-only rating would be a great feature that doesn't really have any downside, other than the few minutes of programming Tim would have to put in to make it happen. :)

hmmmmm... very cool idea.

I'm playing a "5-disc" rated course tomorrow in Colorado, so I'm totally jazzed to see what the hype is all about on a 5-star course to begin with... This topic has become as popular as par. GREAT STUFF EVERYONE!:cool:
 
Our new course opened today and it is a solid 4-4.5 star. I scored it as an 86.5 which puts it in between and went with a 4 because it did have a few issues...though mostly trivial.

It's not perfect though... nothing is, but I am damn glad to have it 5 miles from the house and 5 miles from work.
 
Tim, would it be possible to just make the ratings based on a 10-disc scale? Thus a 2.5-disc would be a 5 and a 4.5 would be a 9. I tend to agree with many who feel that, for some inane reason, people often view 2.5 as a low score when it's really supposed to mean a solid, playable course. Thanks in advance, and I'm sure this has been brought up before.
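The conversion being floated here is just a doubling of the current half-point scale; a minimal sketch (the function name is mine for illustration, not anything the site actually has):

```python
def discs_to_ten(rating):
    """Map the 0.5-step, 5-disc scale onto a whole-number 10-point scale."""
    return int(rating * 2)

# 2.5 discs becomes a 5, 4.5 discs becomes a 9, a perfect 5.0 becomes a 10
```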
 
Yeah, it would give more flexibility too... some in-between room, and if your badass course is a 9/10 it sounds way more awesome than a 4/5.
 
I don't think it would make any difference. You'll just get a bunch of 10's. The people that blindly rate 5/5 will just pick whatever is at the top of the scale.
 
Yeah I guess the halves do the same...you could have 1000 discs and some places would still get em.
 
I prefer to do it like this:
[image: dt_dogbert-bah.gif]

that one was on our refrigerator for years :)
 
Then lower your standards a little! I have heard plenty of 5-disc-worthy comments about your course... even if you have to count cow pies as amenities! :D Eventually I need to get up there for VTI... even as a spectator.

i will put the collection of holes at HH up against pretty much any course anywhere, however cow pies, some less than stellar baskets on the letter holes (slowly being improved), a couple of tees which need improvement (slowly being improved as well), and occasional high grass when i haven't had time to mow would leave it as a 4.5 for me.
 
Similar to movies.yahoo.com, there are two ratings, one by "critics" and one by "Yahoo users". The two grades are often slightly off, but the overall rating system is effective, for me anyway. I've come to expect that the users' grade will be slightly higher than the critics' grade - but not always. Either way I am able to make a semi-educated decision as to whether to invest my $8.50 or wait until it comes out on DVD.

I could see something like that: DGCR trusted reviewers (or DGCR reviewers - of which there is a limited number who follow rigid ranking criteria and put their reputation as reviewer on the line) and DGCR users - everyday players/users.

Having two ratings would allow the user to make their own judgement. This would show the hometown bias effect, emotional votes, etc.

Of course, there would be some work on the DGCR side to come up with a system of naming/managing trusted reviewers, number of reviewers, etc. With 50 states and a whole lotta courses to cover, it could be a challenge, but not an insurmountable one.

I think we might be on to something here

This may work. And I'll keep going with the movies analogy in my explanation. When I look at the critics' reviews and the people's reviews, I'll generally base my interest on the people's review. This is because I go to movies to have fun, not to complain about bad acting, direction, or writing. These all factor into a critic's review, but a viewer's review is usually just an indication of how much they enjoyed the movie.

I do the same thing with DG courses. If I enjoyed playing, then it's a good course. There's only been 1 course I didn't enjoy, and it's also the only course I reviewed, if only to warn people about it. The other courses I played I'd have to go back and play again, by myself, and take notes on how to rate them. For me, this would take away from enjoying the round.

Dual ratings may not be perfect, but in this case they may be our best option as far as viewing ratings goes. We still need to read the reviews, though, and read between the lines on some of them.
 
Apart from the guys who just sign on to hype their home course, the rest of us evolve in our reviewing and hopefully mature... I went back and read some of my earlier reviews and smile... I was so stoked about finding this site and having the chance to review courses... at the time Milo McIver was the best course I had ever played, and to me it was a five - in my review I said it was what DG in heaven will be like... :)

I still stand by the MM rating (IMHO) but I've played 3 more courses that I rate a five and 1 a 4.5, 4 that are a 4 and a few 3.5s and 3s that I hope are ALL in heaven :)
 
I think that in any statistical analysis it is quite common to have abnormally high and low readings (or ratings in our case). Maybe a reasonable way to eliminate the "Jackhole" quotient is to eliminate the highest and lowest ratings for each course that has say 10 or more reviews.


Exactly what I was going to say. Drop the highest and lowest. And if a course has only 1 or 2 people rating it, then average rating should say, "course has too few ratings to be considered accurate".
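Dropping the single highest and lowest rating once a course clears a minimum review count is just a trimmed mean; a minimal sketch, assuming the 10-review threshold suggested above (none of this reflects DGCR's actual code):

```python
def trimmed_average(ratings, min_reviews=10):
    """Average a course's ratings; once there are at least `min_reviews`,
    drop the single highest and single lowest rating first."""
    if not ratings:
        return None  # "course has too few ratings to be considered accurate"
    pool = sorted(ratings)
    if len(pool) >= min_reviews:
        pool = pool[1:-1]  # trim one outlier from each end
    return round(sum(pool) / len(pool), 2)
```

A fancier version could trim a fixed percentage from each end instead of a single rating, which scales better for courses with hundreds of reviews.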

We have a 5-point rating system at work to review our team leader candidates, and I've seen first-hand how one bad or good rating can skew the data. But the sword has two edges, and while people complain about the high ratings, keep in mind that there is a whole range of negative spectrum to play with, and while I might rate something 5-discs, there is nothing stopping you from rating it 1 or 0-discs.

My Lemon Lake Blue is a perfect example. I rated this course "5-discs" when I first played it. It was in perfect condition, and the weather was perfect. But after playing it this weekend, after heavy rains, I've gotten the chance to see it under less-than-ideal conditions, and I feel justified in knocking it down a notch or two. In spite of that, the course is still a great course. But my rating is only part of the whole picture - you need to consider all of the reviews, and also play it yourself to make the final judgement.
 
It's only another person's opinion. For some, a three might be really bad because they grade on a stricter scale. I think people should use five if they want to. If it fits with their opinion, use it - it's there.

Personally, I've accepted about 3/4 to 1 1/4 discs of rating inflation. Meaning if I play a new course and it is listed as a 3 1/2, then I'm sure it's likely to be what I would term a 2 1/2, in my opinion.
 
I don't really like the idea of ratings so much. To be honest I wish they weren't there.

The ratings on this site are one of the most valuable features for players to determine which courses to play in unfamiliar areas. I hope they never go away. Just continue to be backed by more and more data and become more accurate.


When I look at the reviews of a course I just click the button that says "view only trusted reviewers" and that way you get a good idea of the course.

The best remedy for the grade inflation epidemic on this site has been to click on the "view only trusted reviewers" button. From what I've noticed, TRs' reviews are typically about .5 - 1.0 points/stars lower than the non-TRs'. It seems the biggest reason is the typical home-course bias.

As the site has amassed a now-reasonable number of TRs, this has become a viable strategy for many courses.


I think this is a great idea. Currently, you can check the "View Only Trusted Reviewers" box, but that doesn't change the rating. I think being able to view the TR-only rating would be a great feature that doesn't really have any downside, other than the few minutes of programming Tim would have to put in to make it happen. :)

Ahhhh... Adam makes an excellent point that I was thinking while reading through the last two days worth of postings.

Tim, this is a great idea. At a minimum, I think when checking the "TR only" option, the Rating and Rating Detail should be recalculated using only the TRs' ratings.


I could see something like that: DGCR trusted reviewers (or DGCR reviewers - of which there is a limited number who follow rigid ranking criteria and put their reputation as reviewer on the line) and DGCR users - everyday players/users.

Having two ratings would allow the user to make their own judgement. This would show the hometown bias effect, emotional votes, etc.

Tim, again this is an option above and beyond the minimum mentioned above for what I think DGCR should do. I think the overall rating shown on all the search pages, etc. should be calculated from everyone's ratings, but on the review-specific tab I would actually like to see the TR-only rating displayed alongside the overall rating (without having to click "TR only").

These are options that were never viable before as the number of TRs and number of courses reviewed by TRs didn't warrant it. But as the site has grown more and more recently I think this would be a good evolution.


Exactly what I was going to say. Drop the highest and lowest. And if a course has only 1 or 2 people rating it, then average rating should say, "course has too few ratings to be considered accurate".

I disagree with this one. Even if a course only has one vote and it's a 1.0-star rating, I'd rather see that and make my own decision as to whether or not I want to play that course.
 
I think the overall rating shown on all the search pages, etc. should be calculated from everyone's ratings, but on the review-specific tab I would actually like to see the TR-only rating displayed alongside the overall rating (without having to click "TR only") ............. as the site has grown more and more recently I think this would be a good evolution.

Sounds Perfect! :D
 
To do a TR only rating I'd have to do some tweaking of the system since there isn't a way to tell who is a TR without running things through a formula. Another challenge is it's always changing with new people making the cut so the rating would be changing even if no new reviews were present. I'd have to figure out when someone becomes a TR and then go through all the courses they reviewed, etc. Not too difficult, just kind of time consuming. It's a good idea and I'll try to get to it when I can.
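What Tim describes - filtering a course's reviews down to the TRs and re-averaging - might look something like this sketch; the `(user, rating, is_trusted)` layout is a stand-in for illustration, not DGCR's actual data model:

```python
def course_rating(reviews, trusted_only=False):
    """Average a course's ratings, optionally restricted to trusted reviewers.

    `reviews` is a list of (user, rating, is_trusted) tuples. Returns None
    when no reviews qualify (e.g. no TR has played the course yet).
    """
    pool = [rating for _, rating, is_trusted in reviews
            if is_trusted or not trusted_only]
    return round(sum(pool) / len(pool), 2) if pool else None
```

The moving-target problem Tim mentions is real: because TR status itself is recomputed over time, a cached TR-only average would have to be refreshed whenever a reviewer crosses the threshold, not just when a new review comes in.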
 
How about: everyone can review and rate a course, but only trusted reviewers' ratings count toward the course's overall rating? You know someone with TR status has been on here a while and is probably legit. All non-trusted reviewers will hate this idea, just like I would if I wasn't a trusted reviewer, but at the same time it would be the answer to better accuracy.

Not to throw a wrench of dissent into this discussion, but I think there's a problem with the "Trusted Reviewer" status that some people might not be considering. Much like the Top Ten list, it's geographically discriminatory in favor of areas that have more DGCR members. If you want to earn TR status, you're going to get it quicker reviewing courses whose pages get hit more often than those that do not. By omitting data from people who simply haven't hit the criteria for TR status, you're also putting the data for these lesser-reviewed courses into the hands of the one TR member who may have played them, or quite possibly nobody.
 
if a course has only 1 or 2 people rating it, then average rating should say, "course has too few ratings to be considered accurate".
Maybe you could change the color of the discs on the page, so a course with 10+ reviews shows its discs in one color and a course with fewer than 10 reviews in another? That would make it really easy to see the status of the reviews from that one search screen.
 
This problem is easy to solve. Just get rid of the list of "top rated courses" on the front page. Voila: no reason to grade-inflate.
 
Not to throw a wrench of dissent into this discussion, but I think there's a problem with the "Trusted Reviewer" status that some people might not be considering. Much like the Top Ten list, it's geographically discriminatory in favor of areas that have more DGCR members. If you want to earn TR status, you're going to get it quicker reviewing courses whose pages get hit more often than those that do not. By omitting data from people who simply haven't hit the criteria for TR status, you're also putting the data for these lesser-reviewed courses into the hands of the one TR member who may have played them, or quite possibly nobody.

good wrench to throw in....... BUT I think the way Tim currently has things set up, it is pretty easy to see what a course is really going to play like. Case in point from a recent personal experience.

I just played Beaver Ranch in Conifer, CO last weekend - the course averages out as a 5-disc rated course, so I was obviously pretty excited to play. While playing the course I was really trying to figure out the hype - I was comparing it to my local courses and other good courses I have played and really could only muster a 4.5. Don't get me wrong, it was a truly great course, but it certainly lacked some essentials to be a 5-rated course IMO. When I got home I checked the course page again and filtered to trusted reviewers only... to my relief, ALL the TRs actually had it rated at a 4.5.

I think the local home bias is fine! I think the non-trusted reviewers are fine! We all started "un-trusted" at the beginning and eventually earned the respect of our peers to tally up the positive votes! For some of us it took a while (must be you guys don't get to the Midwest often....) - for others, playing all the high-rated courses, things may have come a little easier. In the end it doesn't really matter, because we love to play the game AND we have a resource that allows us to share our fun with others, which includes a review that is substandard from time to time.

Give the site another year and Tim will have continued to implement even better ways of averaging out ratings, and the trusted reviewer stats may even be different. It's all good!!! You can look up a course and decide for yourself whether you are going to include it in your disc trip or not, and that, my friends, was not available all that long ago. This is your one-stop shop for disc golf across the U.S. - live it & love it... and then review it :)
 
I really don't see this as much of an issue. There are, of course, those who will shamelessly rate a course higher than it deserves, but in the aggregate it all gets averaged out. I suppose the argument could be made that if a course had only a few ratings, then a 5-disc rating would throw off the average. But don't you think a course worthy of a 5-disc rating would get more than a few reviews? If you really wanted to get technical, you could make a program to calculate the standard deviation of each course's ratings, which would give you a better idea of where that average comes from; about 95% of a normal population falls within two standard deviations of the mean (one standard deviation covers roughly 68%).
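The standard-deviation idea is easy to sketch with Python's statistics module (within two standard deviations of the mean lies about 95% of a normal population); a small spread means the reviewers mostly agree, a large one means the average hides real disagreement:

```python
from statistics import mean, pstdev

def rating_spread(ratings):
    """Return (mean, population standard deviation) for a course's ratings."""
    return round(mean(ratings), 2), round(pstdev(ratings), 2)

# Everyone agrees: spread of 0.0, so the average is trustworthy.
# A 3.0 and a 5.0: same-ish average as two 4.0s, but a spread of 1.0
# warns you that the reviewers saw very different courses.
```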
 
