
A couple of thoughts on 5-star reviews

I've noticed an increase in 5-star ratings too. If I'm traveling somewhere, I like to use this site to decide which area courses to play. When people inflate the ratings of their favourite course, it becomes a crapshoot. I think the rating system has become a source of competition among some DG'ers: rate your home course higher than everyone else's home course.

I suggest that if you travel somewhere, pick a course off this website that's been rated 5 stars, and find from your experience that it doesn't measure up, you review it and note that the rating appears to have been inflated by locals. At least the next person traveling to the area can then get a more unbiased view of that particular course.
 
Relax and enjoy the reviews for what they are--subjective descriptions of played courses. Today I played Bryant Lake in Minneapolis and loved that the upkeep was sensational--except that some of the lines to the basket needed more imagination and variety. So I would give the course a 4.0, maybe a 4.5. I have played over 100 courses, and I have to agree that I have not yet seen one that rates a 5.0.
 
I think Eagle hit it right on the head. I know that when we travel, we try to hit the best courses along our route. Some have been given 5s by people, and that is their opinion. I feel like I am harder on these courses in my reviews because of the "letdown" factor. I fully feel that Idlewild and Winthrop are the only 5s I have played; to me, they lived up to the hype. My other example is my review of Shillito Park in Lexington, KY. It had been getting between 3 and 3.5 from most everyone. To me it was a letdown, and I gave it either a 2 or a 2.5; I can't remember right now. I got a few NH votes, but that's no big deal. I think that anyone going out of their way to play there might be let down as I was.

The problem created by multiple "shady" 5s from drive-by reviewers is that they cause experienced players to go out of their way to play those courses. In areas with a high concentration of courses, this could be a real problem.

Take the Minneapolis area. There look to be some awesome courses there. If you take a trip to the area and happen to be disappointed with a course, not only did you get disappointed, but you may have missed out on playing another quality course that you might have enjoyed more.

Anywho, everyone has their opinions, and they all count the same. I usually look for TRs, but we all start somewhere, and new users often write reviews just as good as any TR's. Disappointment has been an issue for me on a few courses, though. But play all you can and review all you can: as the site grows, the statistical law of large numbers will take effect, and the ratings for a given course will naturally gravitate to where they belong.
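The law-of-large-numbers point can be illustrated with a quick simulation (a sketch in Python; the "true" rating and the reviewer-noise model here are invented for illustration, not site data):

```python
import random

random.seed(42)

def simulated_average(true_rating, n_reviews):
    """Average n_reviews noisy votes around a course's 'true' rating,
    snapping each vote to the site's half-star increments."""
    votes = []
    for _ in range(n_reviews):
        raw = true_rating + random.gauss(0, 1.0)   # reviewer-to-reviewer noise
        snapped = round(raw * 2) / 2               # nearest half star
        votes.append(min(5.0, max(0.0, snapped)))  # clamp to the 0-5 scale
    return sum(votes) / len(votes)

# With more reviews, the average drifts toward the true rating.
for n in (3, 30, 300):
    print(n, round(simulated_average(3.5, n), 2))
```

With only a handful of reviews the average can land almost anywhere; with hundreds it settles near the underlying value, which is the "gravitate to where they belong" effect.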
 
EricJ: I don't know if this would help, but what if there were a way to create a "list" of criteria for rating a course, i.e. a template with some of the following:

1. Course design implements risk/reward.
2. Course requires a variety of shots.
3. Course has OB, water, and/or street.
4. Course is tight and technical OR long and open.
5. Course has concrete tees.
6. Baskets are in good condition.
7. Course has drinking water.
8. Course has restrooms.
9. Course has an on-site pro shop.
10. Course is well maintained.
11. Course has good flow.
12. Course isn't too crowded.

This is just a quick list I made up. I know a lot of you guys can come up with other things you go by when rating a course, so feel free to add your two cents. If we can come up with a standard to rate courses by and create a template in the course ratings section, it might help with more accurate ratings.
 

Eagle,

Don't mean to be a Debbie Downer... this is a good idea, but it's been tried before, and it doesn't work. Forcing a checklist down people's throats doesn't work, and there's still subjectivity involved in deciding whether a course is "tight and technical." The best you can do, when you decide to start reviewing, is to study other folks' reviews (Olorin's, for example), decide what you like and don't like, and then develop your own style.

But like I said before, I don't want this thread to turn into trying to objectify something that's subjective. There's a specific problem that needs to be addressed.
 
The best remedy for the grade-inflation epidemic on this site has been to click the "view only trusted reviewers" button. From what I've noticed, TRs' reviews typically run about 0.5 - 1.0 stars lower than non-TRs'. The biggest reason seems to be the typical home-course bias.
 

NP, I haven't been around here that long, so I didn't know it had already been tried. I figured it would level the playing field for players new to the sport, or those who haven't played more than one or two different courses, by giving them a reference tool.

On another topic, MAN... I would really like to play Mesker Park in Evansville, IN right now. I'm having a bad disc golf jones, and that course looks SICK!
 
If you want a template, there is already one here. When you write a review, there is a link you can click for general ideas on what to write and what to base your review on. I figure it is also a good way to work out my rating. It works for me.

I just think some common sense when reading ratings and reviews is in order. The ability to view TR-only reviews is good. But is there some way to drop a person's review and rating if, say, they haven't logged in for six months? It seems to me that is the person doing the damage.
 
This isn't a 5-star review, but it still sums up the grade inflation problem.

http://www.dgcoursereview.com/reviews.php?id=3082&mode=rev#11128
Pros: Best course ever for beginners and Pros alike.

And this comes from an expert who's played a grand total of 3 courses. He rates the course a 4.5, which means it's probably around a 3. Yes, grade inflation is, has been, and always will be, a problem.
 
Maybe the answer is a "Cream of the Crop" tab (à la Rotten Tomatoes) that shows you the average from the trusted reviewers when three or more of them have reviewed the course? That way you could have both ratings side by side, but the overall rating would always trump the Cream rating.
I think this is a great idea. Currently, you can check the "View Only Trusted Reviewers" box, but that doesn't change the rating. I think being able to view the TR-only rating would be a great feature that doesn't really have any downside, other than the few minutes of programming Tim would have to put in to make it happen. :)
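The "Cream of the Crop" average is simple to compute. A sketch in Python, assuming each review is just a (rating, is_trusted) pair (these names are made up for illustration, not DGCR's actual data model):

```python
def cream_rating(reviews, min_trusted=3):
    """Return the average rating from trusted reviewers only, or None
    when fewer than min_trusted of them have reviewed the course
    (the site would fall back to the overall rating in that case)."""
    trusted = [rating for (rating, is_tr) in reviews if is_tr]
    if len(trusted) < min_trusted:
        return None
    return sum(trusted) / len(trusted)

reviews = [(5.0, False), (4.5, True), (4.0, True), (5.0, False), (3.5, True)]
print(cream_rating(reviews))  # 4.0 -- the TR-only average, vs. 4.4 overall
```

The three-reviewer threshold is the one proposed above; below it the function returns None so the display can show only the overall rating.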
 
Rating inflation is running rampant; nearly every time I log on I see two or three 5-star reviews on the front page. I'm sure most of these courses are great, but 5 stars should be pretty damn special.



Also, I'm sick and tired of reading three-sentence 5-star reviews. If you are passionate enough about a course to give it 5 stars, you should be able to write more than three sentences about it. And don't give me that "everyone else has already said all that can be said" crap. Forget their reviews and describe your experience in your own words. I know not everyone here is an English major, but if you can find your way to this website, you should be capable of writing coherent sentences.

Here's what I propose to fix the problem:

#1 - If you rate a course 5 stars, there needs to be some sort of character minimum on your review. Maybe 1,000 characters, or a couple of hundred words, something along those lines.

#2 - If this is your first review, you cannot rate a course 5 stars.

#3 - If you haven't marked any courses as played (other than the course you are reviewing), you cannot rate a course 5 stars.

I suppose there are ways around all three of those rules, but they'd help deter drive-by homeboyism.
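The three rules above amount to a submission-time check. A rough sketch in Python (the function name, parameters, and threshold defaults are invented for illustration, not DGCR's actual code):

```python
def can_rate_five_stars(review_text, prior_review_count, other_courses_played,
                        min_chars=1000):
    """Apply the three proposed checks before accepting a 5-star rating.
    Returns (allowed, reason)."""
    if len(review_text) < min_chars:       # rule 1: character minimum
        return False, "review too short for a 5-star rating"
    if prior_review_count == 0:            # rule 2: not your first review
        return False, "first-time reviewers cannot rate 5 stars"
    if other_courses_played == 0:          # rule 3: need a frame of reference
        return False, "mark some other courses as played first"
    return True, "ok"

# A one-line drive-by review from a brand-new account fails the first check.
print(can_rate_five_stars("Great course!", 0, 0))
# (False, 'review too short for a 5-star rating')
```

Returning a reason string alongside the boolean would let the site tell the reviewer exactly which rule blocked the rating.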

As a sidenote, Timg, is there any way to see the overall average rating of all courses as of 6/12/08, 12/12/08, and 6/12/09? I'd bet money it's gone up at least a tenth or two of a point in that timeframe. Some of that is people naturally seeking out better courses to play and review, but a good chunk is also people over-rating their home course/area/state with little to no frame of reference.

***To be clear, this is in no way a condemnation of the 5-star reviews currently on the front page... although a couple of them are questionable. I'm looking at the overall picture here.***

/rant


Wow! Maybe it is just me, but this is a bit extreme. Who cares if people overrate a course by your standards? There are a lot of people on here, which is obviously going to create many different perspectives on what they like or dislike. Rate the courses you play the way you want to. To me it does not matter what a course is rated; I want to play every course I have the chance to. Just lovin' the game!

As quoted by a DGCR member I have seen post:

There is no crying in disc golf!!
 


You're missing the basic point of having a rating system in the first place. The rating system exists to give people, usually non-locals, a true sense of how courses stack up in a given region. If everyone rated every course a 5, which people have every right to do, it would muddy the waters for anyone trying to tell the good courses from the bad ones.

I couldn't care less how the courses in the Charlotte area stack up against courses in Texas, California, etc. in a subjective rating system. I think most people probably agree with that, unless they're one of the Flip City zealots. It's more important for people coming to the Charlotte area, or any city, to be able to see a clear difference in how the courses stack up against each other. People need to know that courses A, B, and C (Winthrop, Renny, and Hornet's Nest) are the top 3 here; they're the must-plays. It would hurt all the Charlotte-area courses if every single one were rated a 5 and people played lesser-quality courses instead of the true elites.

And that's why having a rating system is important.
 
And also, if you think a course is the ultimate disc golf experience, there should be more than one or two sentences describing your amazing experience at said course. I have not played a 5 yet, but I am only at 32 courses played. I am willing to drive a maximum of 650 total miles on a one-day golf trip, so I would like to be able to trust the overall score of a course. But the best thing about this site, in my opinion, is the pictures of the tee shots on each hole. That is better than any review, because you can see the meat of the course, so to speak.
 
I think that in any statistical analysis it is quite common to have abnormally high and low readings (or ratings, in our case). Maybe a reasonable way to eliminate the "Jackhole" quotient is to drop the highest and lowest rating for each course that has, say, 10 or more reviews.
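Dropping the extremes is essentially a trimmed mean. A minimal sketch of the suggestion in Python (the 10-review threshold follows the post; the sample ratings are illustrative):

```python
def trimmed_rating(ratings, min_reviews=10):
    """Average the ratings, dropping the single highest and lowest
    once there are at least min_reviews of them."""
    if len(ratings) >= min_reviews:
        ratings = sorted(ratings)[1:-1]   # shed one outlier from each end
    return sum(ratings) / len(ratings)

# One drive-by 5.0 and one grudge 0.5 among otherwise moderate ratings:
ratings = [5.0, 3.5, 3.5, 4.0, 3.0, 3.5, 4.0, 3.5, 0.5, 4.0]
print(trimmed_rating(ratings))  # 3.625 (the plain mean is 3.45)
```

A fancier version could trim a fixed percentage from each tail instead of a single rating, which scales better as the review count grows.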
 
Another thing to watch out for is emotional reviews: people who give a heavily wooded course a bad review because of poison ivy, for instance. Almost every heavily wooded course I have played had some. Or people who are still mad when they write their review because a course was hard to navigate. That, above all, can ruin a round for impatient players who demand pristine, well-kept courses with good signage and arrows pointing you in the right direction. That is why, when I play a hard-to-navigate course, I try to remember as many tips as I can to put in my review, to help the next golfer not waste a lot of time and energy.

I love heavily wooded courses the most and get into them so much that some ivy will not ruin a round. But I totally understand having a high standard and being picky. If a person's first round is on a hard-to-navigate course with poison ivy everywhere, they might think all courses are like that and never play again. More people playing means more courses to play: supply and demand.
 
Similar to movies.yahoo.com: there are two ratings, one by "critics" and one by "Yahoo users." The two grades are often slightly off, but the overall rating system is effective, for me anyway. I've come to expect that the users' grade will be slightly higher than the critics' grade, but not always. Either way, I am able to make a semi-educated decision about whether to invest my $8.50 or wait until it comes out on DVD.

I could see something like that here: DGCR trusted reviewers (a limited number of reviewers who follow rigid rating criteria and put their reputations on the line) and DGCR users, i.e. everyday players.

Having two ratings would allow the user to make their own judgement. This would show the hometown bias effect, emotional votes, etc.

Of course, there would be some work on the DGCR side to come up with a system for naming and managing trusted reviewers, the number of reviewers, etc. With 50 states and a whole lotta courses to cover, it could be a challenge, but not an insurmountable one.

I think we might be on to something here
 
What we have is 10 different rankings, so why not drop the halves and just rate 1 through 10? Maybe someone will think harder about giving a course a 10 than about giving it a 5. Maybe not. But we are used to top-10 lists and to rating things on a scale of 1 through 10. Also, when we give a course a 3 we are saying it belongs in the upper half quality-wise, but when we see a 3 it almost feels barely above average, when we are really saying it is a good course.
 
But when we see a 3 it almost feels barely above average, when we are really saying it is a good course.

This is one of the biggest reasons why I am a proponent of using letter grades. A "C" is middle of the pack - does not feel as bad.

5.0 = A+
4.5 = A
4.0 = A-
3.5 = B+
3.0 = B
2.5 = B-
2.0 = C+
1.5 = C
1.0 = C-
0.5 = D+
0.0 = D

Since the two scales do not match up perfectly, there is no D- or F (and IMO there are so few real F courses that it is not worth having one).

To me, a 2-disc course is a C+ course... still slightly above average, but it "feels" or looks bad in the disc display. Likewise, a 3.0-rated course is a very solid course (like a B student is a solid student), even though it looks just a smidgen above average with only 3 discs.

This approach lends itself to more low ratings, thereby helping with the grade inflation many have noticed and complained about.
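The half-star-to-letter-grade table above is a straight lookup: each half-star step is one grade step. A sketch in Python:

```python
# The poster's proposed mapping, indexed by (rating * 2):
# 0.0 = D up through 5.0 = A+, with no D- or F.
GRADES = ["D", "D+", "C-", "C", "C+", "B-", "B", "B+", "A-", "A", "A+"]

def letter_grade(rating):
    """Convert a 0.0-5.0 half-star rating to its letter grade."""
    return GRADES[int(rating * 2)]

print(letter_grade(3.0), letter_grade(2.0), letter_grade(5.0))  # B C+ A+
```

So the "solid but unflashy" 3.0 course renders as a B and the 2.0 course as a C+, which is exactly the reframing the post argues for.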
 
I just think some common sense when reading ratings and reviews is in order. The ability to view TR-only reviews is good. But is there some way to drop a person's review and rating if, say, they haven't logged in for six months? It seems to me that is the person doing the damage.

Yeah, to me, that is the only way you could possibly prevent inflation. At least by forcing people to log in more, they might pick up some ideas about objectivity, or come to appreciate courses other than their home course.

Either way, I've come to view DGCR as a popularity contest rather than a scientific project.
 
