
BS review of Black Falls DGCR

Although I am clearly not a "renowned" reviewer by any stretch, I do think there are reviews that are overly subjective, colored by how the player's round went. I remember playing Storrs Pond the first couple of times and wanting to write a review akin to "impossible lines, overgrown forest," etc. However, after checking myself and comparing those emotions to how others saw the course, it became clear that my initial negativity toward the course came from my own poor performance.

Perhaps we can find solutions to this issue:
1) enable a rebuttal to a review from either the owner or a serious local player who has played the course multiple times. This would be similar to how Google Play lets developers respond to app reviews
2) simplify the reviews greatly. This would likely be less desirable for the community at large, given that this is a review website. However, my question when reading a review is, "does this review suggest that the reviewer would come back?" I rarely care about the numerical review score
3) post the reviewer's average score at the course they are reviewing, next to the spot in the review that states when they last played the course. This would help readers detect "performance bias" within the review
1. Already in place with Designer Feedback.
2. No thanks.
3. This would not give you much info unless you know the player or their rating (and rating really doesn't tell you much either). It would also require players to actually play the course like a tournament round, to actually record the score, and to be honest about it. I rarely play a new course I'm reviewing like a tournament round unless it actually is a tournament, I rarely keep score in casual rounds, and I like to throw extra shots to get a better feel for some holes.
 
What the sidewinder said.

The real solution is to allow multiple reviews of a course, so the ratings average out, and the comments by those who like it offset the comments of those who don't. Then the reader can figure it out from the various opinions presented.

Oh, wait, that's what we have now.....
 
It's just a kick in the dink, that's all, just like the people that don't pay greens fees.
My only advice would be to make it more noticeable where to put the greens fees and where the tee for hole 1 is; it took some searching to find both. Otherwise the course was fairly easy to navigate and was in excellent condition when I played last year.
 
At first I thought he may have paid at Black Falls, then drove up the road and played the extinct Cherry Hill course, which still has baskets and some signs.

It's funny, but I actually know of a review where the only possible explanation is the reviewer got courses mixed up.
 
All I wanted to do in my post was to offer ideas to address what seems to be an issue for the OP; although I acknowledged in my post that the ideas were not perfect or likely to be popular, my hope was to have an open discussion and see whether they might lead to some way to reduce review bias. I fail to see how I was sarcastic, off-putting, or condescending in any way.

Thank you for letting me know about Designer Feedback; I didn't know that existed. You provided an interesting perspective on idea #3.

DavidSaul: I am unclear on what you actually contributed to the discussion in your post above; if you think the current system does not need change, why not say so directly? Just because you are an "Ace Member" does not mean you can be an "Ace Hole". One would think you would provide more thought-out, or at least amiable, feedback.
 

David is actually one of the more thoughtful posters on this forum. If that bit of humor bothered you, you might want to grow some thicker skin. Cause you ain't seen nothin' yet...
 

I do not get where the humor in his post was. Regardless, it's not that it bothered me. If you review the original point of this thread, you'd see that someone who has worked very hard on a course is upset about a review, and understandably so. All I wanted was to come up with ways to reduce the impact of reviews that seem unwarranted.

What is upsetting is a mindless post that did nothing to advance the topic of a thread in which the OP posed a reasonable argument. Granted, that is commonplace on discussion boards; I was simply expecting better. Snarky, unamusing comments are what get reasonable threads off course.
 
Sorry to offend you; it was not my intent. I was just being lazy.

Veterans of these message boards have seen dozens and dozens of proposals for a "better" way for this site to work, usually imposing someone else's standards on the reviewers. I'm convinced that we're better off using a consensus: allowing reviewers considerable freedom and letting it average out. Mostly, outlier reviews have a negligible effect on the overall rating, though, as I said in an earlier post, on lightly-reviewed courses like the OP's---or mine---they have a little more impact. Really out-of-line reviews are sometimes deleted by Tim, the site owner. But any suggestion that ratings and reviews must be done a certain way tends only to weight the reviews toward that author's opinion instead of a general consensus.
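
To put rough numbers on the outlier point (the figures are made up, and I'm assuming the site just takes a straight average of the review ratings, which I can't swear to), here's a quick sketch:

Code:
# Back-of-the-envelope illustration (Python) of why one outlier review
# matters more on a lightly-reviewed course. Purely hypothetical numbers;
# the straight-average assumption is mine, not the site's documented method.

def new_average(current_avg, review_count, new_rating):
    """Course average after one more review lands on top of the existing ones."""
    return (current_avg * review_count + new_rating) / (review_count + 1)

# A 1.5-disc outlier landing on a 4.0 consensus:
print(round(new_average(4.0, 30, 1.5), 2))  # 3.92 -- barely moves
print(round(new_average(4.0, 5, 1.5), 2))   # 3.58 -- a noticeable dent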

Not that it can't be improved. Indeed, the designer's response (or whatever it's called, I'm being lazy again here) is a fairly new feature, and a nice change. It doesn't impose restrictions on the reviewer, but allows the course owner or designer to address issues raised by him. In fact, the very review the OP was ranting about was one he'd posted a very effective and appropriate response to.

So, of your suggestions, #1 already exists, more or less; #2 is what I was answering in my flippant way, but I have done so here in more detail.

#3 is just problematic, as sidewinder22 described well. It's not a bad idea; sometimes I'm curious about it as I read a review. Some reviewers describe their skill level in their review; some show their rating in their profile. But many are casual players and don't have one. And I'm among those who don't keep score when visiting a course, and my scores are all over the place on courses I play often, so I'm not sure what I'd put down. It might be of interest when someone describes a course as very tough or easy or long or short; not so much on beautiful or well-maintained (or not). Moreover, anything that makes reviewing courses more difficult might reduce the number of reviews, which is not in our interest.

Now, after slogging through all that verbiage, you might at least forgive me for being brief in my prior post.
 
I do not get where the humor in his post was. Regardless, it's not that it bothered me. If you review the original point of this thread, you'd see that someone who has worked very hard on a course is upset about a review, and understandably so. All I wanted was to come up with ways to reduce the impact of reviews that seem unwarranted.

What is upsetting is a mindless post that did nothing to advance the topic of a thread in which the OP posed a reasonable argument. Granted, that is commonplace on discussion boards; I was simply expecting better. Snarky, unamusing comments are what get reasonable threads off course.

As for this, hopefully my first 2 posts in this thread showed sympathy with the O.P.

Or at least empathy, since I'm in a similar boat.
 
What the sidewinder said.

The real solution is to allow multiple reviews of a course, so the ratings average out, and the comments by those who like it offset the comments of those who don't. Then the reader can figure it out from the various opinions presented.

Oh, wait, that's what we have now.....

There is only one thing that I would tweak if possible, and that's to let us rate a course 4.25 or 4.75. In effect we have a ten-step scale now, if you don't count zero discs. But that doesn't matter, as far as I know. If those were added, we would have twelve choices. Would that cause a problem? Also, does a course have to average 4.26 to make it to 4 1/2 discs, or is it 4.25?
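
To put numbers on my own question (Python, purely illustrative; I have no idea how the site actually rounds an average or breaks ties, so the cutoffs below are just my assumption):

Code:
# Counting the selectable ratings and snapping an average to the nearest one.
def allowed_ratings(extra_quarters=False):
    """Ratings a reviewer can pick: half-disc steps, plus 4.25/4.75 if added."""
    steps = [x * 0.5 for x in range(1, 11)]  # 0.5, 1.0, ..., 5.0 -> 10 choices
    if extra_quarters:
        steps += [4.25, 4.75]                # -> 12 choices
    return sorted(steps)

def displayed_discs(average, choices):
    """Snap a course's average rating to the nearest selectable value."""
    return min(choices, key=lambda c: abs(c - average))

print(len(allowed_ratings()))                      # 10
print(len(allowed_ratings(True)))                  # 12
print(displayed_discs(4.26, allowed_ratings()))    # 4.5
print(displayed_discs(4.25, allowed_ratings()))    # 4.0 -- exactly the tie my question is about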
 
As a reviewer, I like the idea of quarter-disc ratings. It would allow me to be more precise in differentiating between top courses.

But it would really have a negligible effect on a course's overall rating. I'm not sure it matters enough to instigate.
 
Honestly, I've never correlated my review score of a course to how well I played it. If anything, when I score well on a layout I wasn't familiar with to begin with, it's possibly because the course is rather uninspiring in the challenge department.

The thing is, many people who read any of my reviews probably haven't played the course either, so how would they know from my score if a bad rating was deserved or I was just being a pissant?

The funny thing is, when someone is blaming the course for their crappy play, often the context of their review outs them as such.
 
All I wanted to do in my post was to offer ideas to address what seems to be an issue for the OP; although I acknowledged in my post that the ideas were not perfect or likely to be popular, my hope was to have an open discussion and see whether they might lead to some way to reduce review bias. I fail to see how I was sarcastic, off-putting, or condescending in any way.

Thank you for letting me know about Designer Feedback; I didn't know that existed. You provided an interesting perspective on idea #3.

DavidSaul: I am unclear on what you actually contributed to the discussion in your post above; if you think the current system does not need change, why not say so directly? Just because you are an "Ace Member" does not mean you can be an "Ace Hole". One would think you would provide more thought-out, or at least amiable, feedback.

Don't feel too bad. I've made a similar argument before about improving the site and was shot down by all the usual haters; here's a thread I made a while back:

https://www.dgcoursereview.com/forums/showthread.php?t=86904
 

Still seething, 4 years later?

I followed that link and didn't see any haters---just people pointing out the flaws and pitfalls in requiring more user info for reviewers. They weren't hating you, or the idea.....just disagreeing with the latter.

In the meantime, if it's important to a reader, he or she can click on a reviewer's profile and see whatever the reviewer wanted to reveal about himself. PDGA # (where you can see his tournament record), rating, scorebook, age, experience, hometown, courses played and reviews of them, and a great deal more.
 
Reviews on the internet can be inaccurate?!?!?!

People who leave bad reviews based on their ****ty play are the same types of people who cut down trees/limbs on a course: they don't get it and never will.

The review system works fine; over time scores average out.

If you're using the review score to decide which course to play, it's still on you to parse the reviews and the reviewers. If a course has a 3.5 rating with 9 reviews by players with less than 2 years playing and 2 courses played, is that rating accurate? Read the course description, look at the length of the course, check out the map, and look for a review by someone who has played for several years and a bunch of courses.

Oh, and if you drove out of your way through gorgeous Vermont to play a course and had a bad time, that's your fault, not the course's. Ignore that review.
 
There might be something to be said for having some experience under one's belt to be better qualified as a reviewer. I've said myself that people here should get at least one year, ten courses (preferably with some outside their home area), and maybe 1,000 holes played in the books before they start doing reviews.

What I don't get is the notion of tying reviewer qualifications to a player rating that will inevitably peak at some point for everyone and then gradually decline somewhat with age. For a great many of us, that peak is going to be nothing exceptional. In spite of that, our knowledge of what makes a great course, vs. an okay one, vs. a lousy one, is only going to get better. A player rating shouldn't be the grade you have to make for something that doesn't have much to do with competitive play.
 
Although it might not address the OP's issue, the mechanics of this site, or the inherent biases in any rating system, people who are truly interested in a course's quality will never be fully seduced by the fuzziness of a semi-arbitrary number when checking out an unfamiliar one, search-function filters notwithstanding. Moreover, people who spend enough time researching a course before committing to a road trip are also probably wise enough to evaluate the reviews, granting each the weight it deserves. It's fair to expect an earnest reader to shoulder at least that much burden...

and in my case, I think it safe to assume I've played more than 19 courses...a lot more...
 

No offense, but I have always assumed everyone's courses played to be accurate, except for innovadude's. Why are you holding out on the rest?
 
There might be something to be said for having some experience under one's belt to be better qualified as a reviewer. I've said myself that people here should get at least one year, ten courses (preferably with some outside their home area), and maybe 1,000 holes played in the books before they start doing reviews.

What I don't get is the notion of tying reviewer qualifications to a player rating that will inevitably peak at some point for everyone and then gradually decline somewhat with age. For a great many of us, that peak is going to be nothing exceptional. In spite of that, our knowledge of what makes a great course, vs. an okay one, vs. a lousy one, is only going to get better. A player rating shouldn't be the grade you have to make for something that doesn't have much to do with competitive play.

Yes.

John Houck claims to be a weak player. Myself, I'd be interested in his opinion of a course.
 