I don't see anything about UDisc in your posts in this thread: not in your original post, not in your subsequent complaints about Tim, not in the talk about crowns.
It seems to be about dissatisfaction with DGCR.
To be fair, when I saw this post I immediately thought the thread was going to be a critique of how much hate UDisc reviews have been getting (a companion to the post of the same name, just with DGCR swapped for UDisc). Rich has even posted somewhere here recently that people shouldn't be so quick to discredit a short review simply because it is short. He has a point: it isn't entirely fair to completely discount someone's thoughts on a course just because they don't spend as much time articulating that view as TR does here on DGCR.
Now, on to the topic at hand:
Personally, I think DGCR is fine the way it is. Ratings are inherently subjective, and I don't think there is a perfect system for rating courses. Any potential fix would probably be too easy to abuse.
For instance, what if the 25% of reviews most recently written or revised (or all reviews written or revised within the past year) were double-weighted, so that when a course slowly undergoes a substantial overhaul or a gradual redesign, reviews of the original layout stop being reflected in the score? Obviously, the way it should work is that the old course is declared extinct and a new page is made. Sometimes, though, the process is so gradual that it is hard to determine the right time for that to happen. A course like Angry Beaver, which has had its layout tweaked and redesigned since it opened, is now a completely different course than it used to be (if I am adding correctly, over half the holes are significantly changed or completely new). The problem with this method is that a newer course without a significant number of reviews, like The Lions, could absolutely have its score tanked by one abysmal review.
So maybe reviews that receive a certain number of Thumbs Up could be double-weighted, to reflect what is perceived as accurate, well-written information rather than just someone who was upset that they didn't play well, lost a disc, etc. The issue here is that too many people would likely use the Thumbs Up/Down feature to affect the ratings of courses they like or dislike instead of judging whether the review was actually helpful.