Bad/Homer Course Reviews

In the 3rd one, under cons, he says "keep your disc on the fairway." Like on some courses he really enjoys the rough?

On a good course you would enjoy the rough.

Some of the most-fun throws are the off-fairway trick shots you are forced to make to hit a specific unusual flight path.

Especially when they work.

Disc-eating long grass, or impenetrable thorn bushes, are not enjoyable rough.
 
Yeah. There are some courses where I throw a bad shot off the tee and I get a bogey, or maybe even a double bogey. There are other courses where I throw a bad shot off the tee and I get a bogey or double bogey, but I also lose my disc or I bleed. The latter is definitely a con.
 
In fact, multi-layout courses are likely to be a bit more confusing to navigate, making the experience even less enjoyable for the visitor, especially when time is a factor.

True, and I've tried to acknowledge that in my more recent reviews - usually with something like "this wouldn't be a problem after your first time through."

Even when it's a bit confusing for a bagger like me, multiple tees and baskets are a plus and they tend to make me consider a higher score.

That said, I've played lots of courses where the two baskets are so similar that I wouldn't change how I play, and I wonder what's going through course designers' minds when two poured concrete tees are within 30 feet of one another.
 
Even when it's a bit confusing for a bagger like me, multiple tees and baskets are a plus... I wonder what's going through course designers' minds when two poured concrete tees are within 30 feet of one another.

I'm with you on these points. Add'l tees and baskets can make me think about whether the course should be rated higher than the layout I've played, but the additional tees and baskets have to provide variety and make sense.

Just going through the motions of adding a 2nd set of tees and/or pin placements doesn't necessarily improve a course if it doesn't change how you'd play the hole, or if one of the placements is so awkward that it becomes an option no one plays.
 
This isn't a bad review per se; it's more about the voting process. The newest review for Wofford College is a simple, says-nothing, 7-sentence review. It already has 5 positive votes in 3 hours. The review immediately below it has the same amount of content, provides as much insight, and is roughly the same in terms of quality. Yet that one has only 1 positive vote and 5 negative votes. Additionally, there are 3 high-quality reviews for the course from 2011. They have 1, 2, and 3 positive votes respectively. It's interesting that those reviews have been virtually ignored (like any review from PizzaGod or Upshaw1979 for remote courses), while this basic one is getting lots of attention.
 
This isn't a bad review per se; it's more about the voting process. The newest review for Wofford College is a simple, says-nothing, 7-sentence review. It already has 5 positive votes in 3 hours...

Let me make it clear: the review written today (12/27) is perfectly fine. It's the inconsistency in voting that I'm highlighting. Maybe this is being addressed in the wrong forum.
 
I have seen a major uptick in votes cast per review since the beginning of COVID, and especially in positive votes. I recall some discussions about folks doing more reviews, and generally being more encouraging and positive on the site. There was also a post from someone who published ratios of thumbs up to thumbs down for trusted reviewers, and I think that was a bit of a wake-up call for some.

There was a similar discussion on the "Up and Coming Good Reviewers" thread back in July, and the consensus seemed to be that reviewers were moving up through the Trusted Reviewer ranks much more quickly than before.

Looking at my own reviews, I averaged 4-6 votes per review up through 2019. But the six course reviews that I've done in the past 15 months have gotten from 11 to 20 thumbs up. The newer reviews are no different than the older ones, but the voting numbers have changed dramatically. Good? Bad? Who knows...
 
Let me make it clear: the review written today (12/27) is perfectly fine. It's the inconsistency in voting that I'm highlighting...

Reviews of similar content/quality for a given course may get quite different tallies based on several factors, not the least of which is who's voting and their threshold for quality.
Might also have to do with when the review is posted and how long it stays on the front page, and who notices it.

Also, fewer "regulars" may be spending time on here during the holidays.

Plus I'm not sure you can expect the sort of consistency you're looking for.
 
Reviews of similar content/quality for a given course may get quite different tallies based on several factors... Might also have to do with when the review is posted and how long it stays on the front page, and who notices it.

You're right that how long a review stays on the front page can correlate with votes. Obviously, bigger-name courses are going to garner more attention and eyeballs. I'd imagine a lot fewer people are going to randomly check out Wofford College's course compared to Flip City, Idlewild, Harmon Hills, etc. Take a look at the dozens of reviews Pizza God has written that have garnered 0, 1, or 2 votes, mainly because they're for out-of-the-way, mediocre courses.
 
This isn't a bad review per se; it's more about the voting process. The newest review for Wofford College is a simple, says-nothing, 7-sentence review. It already has 5 positive votes in 3 hours...
From the comments I've seen, it looks like we are ignoring the obvious fact that the UDisc rating mentality is creeping into DGCR and polluting the main area where DGCR still stands out, i.e., DGCR ratings are the most reliable ratings available on the planet.

I attribute this in large part to the massive influx of new players, and to the fact that their first exposure to course ratings is on UDisc.

BUT are the UDisc "feel good" ratings unique to UDisc, or are they part of a larger cultural zeitgeist? On UDisc I see 5.0 ratings given to crummy little 3-hole practice areas at elementary schools, and I see some people who give a 5.0 to every single course they play. Is this part of the "everybody gets a trophy" and "let's not hurt anybody's precious little feelings" mentality? A UDisc 5.0 seems to mean "I had fun with my friends throwing discs here." On UDisc a 5.0 seems to be equivalent to a like on Facebook or a heart on Instagram.
 
... cultural zeitgeist? ...

Whoa, zeitgeist? I see we're throwing $10 words around now.

I agree that it's remarkable that a terse review can get 5+ upvotes super quickly, whereas a few years ago you were lucky to get that many even if your review was on the front page for days. I do agree that the pandemic might be one reason for the seeming abundance of upvotes. I can't speak about the UDisc factor; I've never used it and probably never will - I just enjoy writing my score down w/ pencil and paper.

That said, to echo what markmcc said, I struggled early on to get upvotes - from 2012 to 2018 I reviewed something like 60 courses. A total of 5 of those reviews have garnered 10+ upvotes, and most of those only hit double digits months (if not years) after writing them. Compare that to 4 out of 7 reviews I've written since 2019 that have 10+ upvotes. None of them are high-caliber courses, but the votes are there.

For me, after I read a review I ask myself the same question - did I find the review HELPFUL? If there was even just one sentence that I found helpful, I'll give it a thumbs up. If the question was "Did you find this review well-written, lucid, cogent, and humorous?" my upvote to downvote ratio would probably be reversed.
 
From the comments I've seen, it looks like we are ignoring the obvious fact that the UDisc rating mentality is creeping into DGCR... A UDisc 5.0 seems to mean "I had fun with my friends throwing discs here." On UDisc a 5.0 seems to be equivalent to a like on Facebook or a heart on Instagram.

And a lost disc seemingly means the course sucked and a 1.0 rating.
 
From the comments I've seen, it looks like we are ignoring the obvious fact that the UDisc rating mentality is creeping into DGCR and polluting the main area where DGCR still stands out...

Part of the issue is that UDisc almost forces you to review a course. When you finish a round or want to look at an old scorecard, it constantly pops up asking you to review the course. So I think most people just click 5.0 if they liked the course, and something really low if they didn't, just to avoid having to put any thought into a forced review. Then they don't even make you explain why you're giving your rating.

DGCR reviews are typically made by people who actually want to write them.
 
To shed some light on the UDisc ratings: you have to assume the people rating the courses really don't play much disc golf.

I consider myself a pretty casual player. While I probably play two or three rounds a week and a handful of tournaments each year, I don't always score my practice/casual rounds on UDisc. I was looking at my 2021 stats the other day, and I scored 61 rounds in 2021, which works out to just over one round per week.

According to UDisc, I have played more rounds than 80% of all UDisc users. UDisc also says that I have played more courses than 95% of users, and apparently my 61 scored rounds were only at 13 different courses.

If I'm in the top 5% of users in the number of courses played and the top 20% in the number of rounds played, then there are probably a lot of reviews being left by people who really don't play many rounds and seldom play very many courses.

While I do occasionally read reviews, you really have to consider the source. What makes a course good or bad for very casual or inexperienced players has little to no bearing on what a more seasoned or serious player would think of a course.
 
... According to UDisc, I have played more rounds than 80% of all UDisc users.

Probably true. However, I score maybe 1 out of 10 rounds on UDisc, so those numbers may be a bit off. Also I don't know if UDisc counts league dubs rounds, etc.

UDisc also says that I have played more courses than 95% of users, and apparently my 61 scored rounds were only at 13 different courses.

This is probably very true. I know a lot of disc golfers, including some who are very good, who largely stick to playing local courses.
 
Probably true. However, I score maybe 1 out of 10 rounds on UDisc, so those numbers may be a bit off...

This is probably very true. I know a lot of disc golfers, including some who are very good, who largely stick to playing local courses.

Agreed. I play hundreds of rounds in a year, and only a couple of tournament rounds ever make it to UDisc. In fact, I don't record ANY rounds, anywhere. Well... an erasable scorecard.
 
This is probably very true. I know a lot of disc golfers, including some who are very good, who largely stick to playing local courses.

We see that here in Charlotte, with a large portion of local DGers not having any clue about courses outside the city limits. Case in point: someone asked on the FB page yesterday if it was worth making a day trip from Charlotte to play Winthrop. He had ZERO clue that Winthrop = Rock Hill, SC = right across the state line from Charlotte = a 20-30 minute drive.
 
If I'm in the top 5% of users in the number of courses played and the top 20% in the number of rounds played, then there are probably a lot of reviews being left by people who really don't play many rounds and seldom play very many courses.

I'm also in the top 100 on UDisc, and I have around 200 courses not marked as played on there. It's kind of surprising, or not really, how many in the UDisc top 100 are active on here. And that's just the names I recognize.
 
I would prefer to just see comments about a course and not a number rating.

A 1 for me might not be a 1 for you... but if I say...

eight tee pads didn't have tee signs and the baskets were not visible from the tee pad; also, two greens were under water after the recent rain, with the basket pole standing in two inches of water...

that has meaning. You know that you are going to have to figure out where the eight baskets are... probably by walking the fairway to see them. Also, you know that you probably shouldn't play right after it rains.
 
