Flip does have more crappy reviews than any other course, that's true, but it also has a 5.00 rating from eight trusted reviewers, which is higher than the 4.877 average from everyone else who's reviewed it. Idlewild is the same way: a 4.92 average from 17 TRs versus 4.88 from other folks. My theory is that Idlewild gets more TRs because it's centrally located (compared to west Michigan) and is probably playable for a couple more months out of the year.
I've said this once, and I'll probably say it again a thousand times...
The criteria for becoming a trusted reviewer on here, particularly at the Bronze level, are a joke.
All you need is 10 reviews and 50 helpful votes from 10 unique people. Some people have gotten more than half the total needed for Bronze TR status off of a single review. This too needs to change.
A single review should count for five helpful points max. That way you'd have to earn helpful points across at least ten different reviews to hit the 50-point threshold. Under the current system people can rack up cheap helpfuls by reviewing popular courses or ones in areas with a lot of DGCR members, even if they pick up a few negatives along the way.
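Just to make the math concrete, here's a rough sketch of how the cap I'm proposing would work. The vote counts and function names are mine for illustration, not actual site code:

```python
# Hypothetical sketch of the proposed rule: cap each review's
# contribution at 5 helpful points, so hitting the 50-point
# threshold requires at least 10 reviews pulling their weight.

CAP_PER_REVIEW = 5
BRONZE_THRESHOLD = 50

def capped_helpful_total(helpful_votes_per_review):
    """Sum helpful votes, counting at most CAP_PER_REVIEW per review."""
    return sum(min(votes, CAP_PER_REVIEW) for votes in helpful_votes_per_review)

# One blockbuster review no longer carries you: 30 votes on a
# single popular-course review counts the same as 5.
reviews = [30, 4, 3, 2, 1, 0, 0, 0, 0, 0]
print(sum(reviews))                                        # 40 uncapped
print(capped_helpful_total(reviews))                       # 15 capped
print(capped_helpful_total(reviews) >= BRONZE_THRESHOLD)   # False
```

Under a rule like that, the guy with one blockbuster review at a popular course still has to write nine more reviews that people actually find helpful.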
I guess what I'm getting at is that I don't understand why you're trying to imply that Flip, Idlewild, or any of the top 10 for that matter isn't worthy of the praise it's received. If folks who travel hours upon hours to play these courses (like me) didn't think highly of them after playing, they wouldn't be in the top 10 anymore.
The thing is, I suspect most of the reviews on here are not from people who have traveled hours and hours. I'll bet most are from reviewers who live within 100 miles of the course. I took all 57 of Flip's reviews and crunched them into a spreadsheet. Here's what I found:
- 25 of the 57 reviewers list fewer than 10 courses played. If someone has played that few, with such a small frame of reference, how do they know what a five-star course is?
- The average number of courses played was 27.8, which drops to 17.7 when you only take the middle 60% (see the sketch after this list).
- The average playing experience listed is 8.23 years, though 21 of the 57 reviewers did not list their experience, so that average only counts the 36 who did.
- Of those who did not list their playing experience, all but six have five or fewer courses listed as played, which suggests they aren't very experienced and would likely drag the average down much further.
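For the curious, here's roughly how that middle-60% number falls out of a spreadsheet column. The course counts below are made up to show the idea, not Flip's actual review data, though I contrived them so the plain mean lands at 27.8 like the real column:

```python
# Sketch of the "middle 60%" trimmed mean: sort the courses-played
# column, drop the top and bottom 20%, and average what's left.
# These values are illustrative, not the actual 57 reviews.

def trimmed_mean(values, trim_fraction=0.2):
    """Average the values after dropping trim_fraction off each end."""
    ordered = sorted(values)
    k = int(len(ordered) * trim_fraction)
    middle = ordered[k:len(ordered) - k]
    return sum(middle) / len(middle)

courses_played = [2, 3, 4, 5, 8, 12, 20, 35, 60, 129]
print(sum(courses_played) / len(courses_played))  # 27.8, inflated by the 129
print(trimmed_mean(courses_played))               # 14.0, middle 60% only
```

The point of trimming is that one or two road warriors with a hundred-plus courses played pull the plain average way up; the middle 60% is a better picture of the typical reviewer.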
And I'm not picking on Flip here; pretty much every course with a significant number of reviews has this problem to some degree. It's just the most glaring example of what's wrong.
If there's something I'd like to see added to people's reviews, it's the distance range between the reviewer's home zip code and the course's zip code. That would help distinguish which reviews come from travelers and which come from locals.
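The mechanics wouldn't be hard if the site ever wanted to do it. Here's a sketch assuming there's some lookup from zip code to a lat/long centroid; ZIP_CENTROIDS and both of its entries are hypothetical stand-ins, not real site data:

```python
# Sketch of a reviewer-to-course distance, assuming a zip-to-centroid
# lookup exists somewhere. ZIP_CENTROIDS and its entries are
# hypothetical stand-ins for illustration only.
from math import radians, sin, cos, asin, sqrt

ZIP_CENTROIDS = {
    "49401": (42.963, -85.888),  # hypothetical west Michigan zip
    "41051": (38.916, -84.603),  # hypothetical Idlewild-area zip
}

def zip_distance_miles(zip_a, zip_b):
    """Great-circle (haversine) distance between two zip centroids."""
    lat1, lon1 = (radians(v) for v in ZIP_CENTROIDS[zip_a])
    lat2, lon2 = (radians(v) for v in ZIP_CENTROIDS[zip_b])
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(a))  # Earth's radius is ~3959 miles

def distance_bucket(miles):
    """Report a range, not an exact distance."""
    for limit in (25, 100, 250, 500):
        if miles <= limit:
            return f"within {limit} miles"
    return "500+ miles"

print(distance_bucket(zip_distance_miles("49401", "41051")))  # within 500 miles
```

Showing a bucketed range instead of the exact mileage keeps anyone from pinpointing a reviewer's location, which is the part I'd insist on.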