
Movement in top 10

Would there be enough solid reviews from eligible reviewers to allow Selah to remain eligible for consideration for the top 10 list? If not, you've cheated a course.

I believe that putting the limitation in place would eventually wind up hurting more courses, because the drive-by writers would have to write more terrible reviews to make sure their reviews count. Do you really think the writers of the well-written reviews are going to write more so their voice is heard? Someone who writes good reviews, like Jay Dub, wouldn't. Now you have more reviews from the people you're trying to eliminate and fewer from the ones you want. I know I'd rather have Jay Dub's four and jackhole's one than jackhole's five and Jay Dub's none.

I think I misunderstood your point. I don't like the suggestion of getting rid of reviews from "drive-by" reviewers, and I agree with your point that it would make it tougher for out-of-the-way courses to get the recognition of the more popular and accessible courses. I agree with you that it's not necessarily true that someone who has played many courses but only written a few reviews should be discounted.
 
I think that more often than actually being verbally pressured/encouraged to write reviews with high ratings, players do so as a sign/token of appreciation. The compulsion is internal rather than external.

That I could see, though I personally try really hard not to let that affect my ratings. The experience of a cool private course does seem to raise ratings somewhat. I have seen posters claim several times that reviewers are compelled or convinced by a private course owner to give a good review, and I was saying that I haven't experienced that.
 
Fixed revision placing Phantom Falls in its correct place.

Rank. Course: new rating (old rating)

1. Flip City 4.87 (4.85)
2. Idlewild 4.84 (4.76)
3. BRP 4.74 (4.77)
4. HB Gold 4.70 (4.72)
5. HB Granite 4.69 (4.70)
6. HB Blueberry 4.68 (4.71)
7. Maple Hill 4.65 (4.69)
8. Milo McIver 4.64 (4.56)
9. Tyler 4.61 (4.70)
9. Hornings Meadow Ridge 4.61 (4.66)
11. Beaver Ranch 4.60 (4.63)
12. Holler n the Hills 4.59 (4.63)
13. Moraine 4.56 (4.60)
13. Phantom Falls 4.56 (4.69)
15. Ashe County 4.53 (4.54)
16. Winter Park 4.52 (4.59)
17. Water Works 4.50 (4.53)
17. Whistlers Bend 4.50 (4.53)
19. Brakewell Steel 4.48 (4.54)
20. Deer Lakes 4.45 (4.66)
21. Sky High 4.39 (4.52)
22. Foundation Park 4.31 (4.51)

Courses that no longer qualify because they dropped below 20 reviews.
Course: new rating (old rating), new number of reviews

Selah Lakeside 5.00 (4.98) 17
Selah Creekside 4.80 (4.88) 15
Rollin Ridge 4.67 (4.67) 15
Magic Meadows 4.40 (4.66) 4
Shawshank 4.57 (4.60) 14

... and the elephant in the top ten room

Flyboy 4.90 (4.90)

I do like to see Moraine that high, and past Deer Lakes. DL is awesome, but some of its hype is reputation in my eyes. Moraine is the real deal.

Interesting points about the ratings system. Perhaps this point has already been raised, but I feel like less-frequented courses would suffer most from the proposed change. There are a lot of smaller courses out there that already aren't getting reviewed much, so their ratings could be skewed if they're left with only one or two reviewers who meet the criteria. Just an off-the-cuff thought that might not be applicable, or might have already been addressed.
 
That I could see, though I personally try really hard not to let that affect my ratings. The experience of a cool private course does seem to raise ratings somewhat. I have seen posters claim several times that reviewers are compelled or convinced by a private course owner to give a good review, and I was saying that I haven't experienced that.

I haven't played very many private courses (7 by my count), but I haven't experienced it either. I just remember the owner meltdown from that thread.
 
I haven't played very many private courses (7 by my count), but I haven't experienced it either. I just remember the owner meltdown from that thread.

Ha, yeah, I managed to avoid that particular owner when I played that course. :p
 
Moraine is the real deal.

I think it is an amazingly beautiful and peaceful course. Well designed, with multiple tee pads for different skill levels.

My opinion is that one design aspect is way over-used. There are too many holes that require a long drive placed into a very small landing zone to give an optimal approach angle to the basket through a very tight, short fairway segment. This is a cool feature to use on a few holes per course, but Moraine has too many of them.
 
How about, for new DGCR reviewers, ALL of their reviews are viewable only by other DGCR members UNTIL they attain Bronze TR status? That way there's no incentive for the general public to latch on to bogus reviews (whether inflated or deflated) and create an account for the sole purpose of continuing such behavior.

My two bits.

^ AND this method encourages greater site participation in the form of more reviews. Let the course ratings stand as they do -- all reviews figure into the rating -- but ONLY logged-in members can even see the non-TR reviews and determine whether they are worthy of up-voting (again removing the variable of a non-member seeing a non-TR's inflated/deflated review and deciding, based on that review, to join the site and promote/demote said review).
 
From your perspective of playing over 700 courses, it's not top ten quality. He has played 130 and it's not in his top 20...not quite the same comparison.

I'm not at 700, just 220, but I agree that while BRP was manicured, it is one of the few courses (from top 10s down to 1.5s) I have played where I actually got a little bored before my round was over. Phantom Falls is another one. The common factor in both is the repetition/similarity of several holes.

If I had a machine that could transport me instantly to any course I have played before, there are so many courses I would choose to go back to before BRP or PF. I have no burning desire to return to those two courses, whereas I would love to have another shot at Highbridge or Beaver Ranch or IDGC or...

However, understand I am not saying those two are bad courses, just not my absolute favorites. Remember, this thread is focused on the top 10. But if everybody else thinks they are top-10 worthy, then they are (if the top 10 is formulated from what everyone thinks). Then we can all get on here and debate/disagree rather than working or sleeping. If we all agreed, we would be bored. :D

____________

Also, I challenge all the people who have not played Phantom Falls but insist it is a pitch and putt to get out there with only a putter (or take a Buzzz too if you want) and then post your score. If you deuce 18 holes and par the other 18, then call it what you want. (But you won't, because after hole 9 you will go to the car to retrieve your best long-range understable driver.)
 
From a MN guy... BRP is boring.

Great course, love the work they do, can't find a better place to throw in the winter, hell of a challenge, but it's a sod farm with some woods and a couple of drainage ponds.

Probably MN's most over-rated course. Don't get me wrong, it's a great place to play disc golf, but throw it more than 10 times and it's whatever. A few of the recent changes actually make me like the course less; for some reason the flow is off once you get through the first nine or so.
 
First of all, you're wrong that none of the top ten moved by a tenth of a point; Phantom Falls did. A couple more moved right at a tenth. And the bottom three on the list include one that moved over a tenth and two over two tenths. Statistically, these are pretty big variances on such a small scale.

Your point about whether somebody will be more or less inclined to go play a course if its rating changes by a tenth doesn't really play into the conversation we're having. It's a thread about ranking courses on a website that ranks courses. How people interpret the data is their own business; however, the site should aim to give the most accurate representation of the rankings if it's going to put them out there.

Ah, a fundamental difference in view. I don't know Timg's intent. But I don't see this as primarily a website that ranks courses, but a website that rates courses, with the ranking being a minor and relatively unimportant side issue.

That's why my perspective is that he shouldn't complicate the rating formula just to refine the Top 10 rankings.

(And on Mando's list that I cited, Phantom Falls was #13, not Top 10, which is what I was commenting on.)
 
I've noticed this conversation veering off course, and it's starting to become comical. Unless your home course is in the Top 10, or just missing the cut, nobody cares that much about the top 10 list. It really doesn't make a difference to anyone whether your course is ranked 5th, 10th, 11th, or 25th. The fact is, any course even in the discussion has to be pretty good. And yes, we'll debate whether pretty good means above average, great, or elite.

The point of the site has always been to help a fellow disc golfer out, letting others know about all courses, good and bad. I've played some great courses thanks to this site. I've also played courses that weren't that great, due in part to people, often locals, over-inflating a course's rating. For the most part, though, I think most people are in the same ballpark when it comes to ratings.

I've had people complain that I'm too tough in my ratings of courses. In the end, what's really the difference between someone rating a course 3.0 vs 3.5, or 4.0 vs 4.5? A 3.0-rated course and a 3.5-rated course are going to be in the same ballpark in terms of overall quality, so what's the big deal?

If this site gets too exclusive, it's going to ruin the fun and drive many people away. Just because certain people discovered this site several years earlier than others shouldn't amount to much. Let people review; let their voices be heard. Now, if you're going to bash courses or course owners, your voice may only be heard for 20 minutes before your review is pulled. Besides, I thought people in Colorado were much more mellow these days, since January 1st to be exact.
 
Such a stupid statement. That would leave like 10 members who could review. You would have no top ten.

It was intended as hyperbole, designed to demonstrate that no amount of elitism is OK. The only correct system is one allowing all opinions to count, regardless of whether you or anyone else agrees. The secondary point was my contention that experience and a large courses-played dossier are the counterargument to the frequent-reviewer argument. Though frequent reviewers often fit into the above categories, outliers will still exist. No ideal system exists, except to allow everyone to have a voice.
 
When calculating the overall score, I suggest multiplying each individual rating by the number of reviews that reviewer has written, then dividing by the combined review count of all the raters. (In effect, your ratings get more weight the more reviews you've written.)

Example:
- Course A has 2 reviews, a 4.5 by a guy with 87 reviews, and a 2.0 by a guy with 3 reviews.
- (4.5*87) + (2.0*3) = 397.5
- 397.5 divided by 90 total reviews = 4.42 overall rating
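
In code, the idea is just a review-count-weighted average. A minimal sketch (the function and data shapes here are illustrative only, not DGCR's actual implementation):

    # Weighted-average sketch: each rating counts in proportion to how
    # many total reviews its author has written. Hypothetical helper.
    def weighted_course_rating(reviews):
        # reviews: list of (rating, reviewer_review_count) pairs
        total_weight = sum(count for _, count in reviews)
        if total_weight == 0:
            return 0.0
        return sum(rating * count for rating, count in reviews) / total_weight

    # Course A from the example above:
    print(round(weighted_course_rating([(4.5, 87), (2.0, 3)]), 2))  # 4.42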

Note that when a person posts a new review, the overall rating changes for ALL courses that reviewer has reviewed.

If you want to get really fancy, also take each reviewer's ThumbsUp-to-ThumbsDown ratio and figure that in as well. I can do an example if you're really interested and/or don't understand.
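
For instance (a sketch only; the smoothing constants are my own assumption, not part of the proposal), you could scale each reviewer's weight by their helpfulness ratio, so a reviewer whose reviews are mostly thumbed down counts less even with a large review count:

    # Hypothetical extension: damp a reviewer's weight by thumbs ratio.
    # The +1 and +2 smoothing terms are assumed, to avoid division by zero.
    def thumbs_adjusted_weight(review_count, thumbs_up, thumbs_down):
        helpfulness = (thumbs_up + 1) / (thumbs_up + thumbs_down + 2)
        return review_count * helpfulness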

Notes:
*I agree with those who say this is totally unnecessary.
 
I like that idea for the sole reason that it encourages more reviews to be written. At this point, everything that can be said already has been said for most courses ... and said over and over for the popular ones.
 
It was intended as hyperbole, designed to demonstrate that no amount of elitism is OK. The only correct system is one allowing all opinions to count, regardless of whether you or anyone else agrees. The secondary point was my contention that experience and a large courses-played dossier are the counterargument to the frequent-reviewer argument. Though frequent reviewers often fit into the above categories, outliers will still exist. No ideal system exists, except to allow everyone to have a voice.

I got it. Still stupid. We are trying to have a discussion with "real" solutions, not your exaggerated nonsense.

On a side note, this may be the first time ever: I actually like Grodney's idea above.
 
I got it. Still stupid. We are trying to have a discussion with "real" solutions, not your exaggerated nonsense.

On a side note, this may be the first time ever: I actually like Grodney's idea above.

No, you don't get it. I am simply defending everyone's right to review, rate, and have an opinion. But the solutions discussed insist on discounting many reviews based on some belief that they do not meet your standards. You continue to simply dismiss those opinions because you do not agree with them, much like you are attempting to do with mine. The "real solution" is to a problem you perceive; I don't see a problem.
 
ZMan review of Blue Ribbon Pines:
"This is the nicest dgc I have ever played, without a doubt."

Would be hard to quote me on something I haven't written yet ... Not sure where this came from, but "ZMan44" did not write it. And the only username for a "ZMAN" has no reviews ...hmmm...

If BRP didn't have the boring filler holes and had better tee signs, and fewer drunk folks wading through the water hazards, it would be pretty close to a Top 10.
 
Would be hard to quote me on something I haven't written yet ... Not sure where this came from, but "ZMan44" did not write it. And the only username for a "ZMAN" has no reviews ...hmmm...

If BRP didn't have the boring filler holes and had better tee signs, and fewer drunk folks wading through the water hazards, it would be pretty close to a Top 10.

Sorry about that ... I was looking at your Bryant Lake review and thought I was on BRP.
 
I don't think there's a magic formula to get a definitive Top 10. The system "is what it is". Back in the early days of the site, the Top 10 was a cool novelty. Now, it has become pretty contentious. It's never going to align with an individual's opinion, and that's probably a good thing. Personally, I use the site to identify courses to play when I'm on the road. I travel a lot for work and keep my discs in tow. I'll be in Dallas in a week or so for the first time in many years. So I'll identify some courses to hit while I'm there. My goal is to review a course so that someone in my shoes would find it useful. The Top 10 is simply a by-product of different reviewers with unique preferences. Some may review a course based on tourney experience at that course. That's fine. Some people prefer aesthetics and tranquility over design. That's fine too.

While I'm definitely no Mashnut, I have bagged 20 states at present and over 175 courses (and no, I have not diligently recorded my "played" courses on the site in a few years). I've been blown away by courses where I had low-to-moderate expectations based on information here. I've been thoroughly disappointed by some courses that didn't meet the expectations I had drawn from the site. But 90-95% of the time, my experience at a course has correlated closely with my expectations derived from DGCR. To me, that's a mark of success, and Timg and the others who have worked so hard to make the site great should be proud.

In short, the Top 10 really isn't that important in my book, and I inadvertently derailed the thread by identifying examples of courses that I disagree with being in the Top 10. This was only an attempt to make the simple point that nobody's personal views will align perfectly with the overall views of a population of people, and that this premise should not invalidate either 1) the opinion of the individual or 2) the output statistic of the population. Sorry for the tangent... back to the discussion.
 