Idlewild = 1 star?

A) Tim G, that cat is HILARIOUS

B) Maybe Idle and Flip will eliminate each other and Beaver Ranch will take 1st (as someone from Colorado, I'm partial to the Beave)



p.s. Heh, partial to the Beave :D
 
:D


I like this one too
 
I am a fan of the Beave as well, but I have yet to play Flip or Idlewild. I am actually amazed at how well the Beave is doing, but then again it is by far the best course I have played. I guess I am a local once again, though I was not for a while there. Local or not, you should play the best courses and write legit reviews.

I had a professor back in college who liked to call out the punks of the world. "Will Smith is a punk, yeah I said it." He also called himself the black Scandinavian. But in his words, people who write bogus reviews of any kind are, "punks."
 
Oh, and I would much rather have locals tell me everything they do not like about a course; I get suspicious when they write nothing in the Cons section.
 
Here we have an example of an insignificant review that made an insignificant alteration to an insignificant statistic. What is significant, and troubling, is that it was dealt with by some means other than simply ignoring it. The need was seen to take it down, for the sanctity of that dubious top 10 list.

The list itself of course invites corruption, as people seek to stack the vote one way or another to get their course a cut above others they possibly, if not likely, have not played. And the cure seems worse than the disease: if we get rid of that 1 star from a guy who was having a bad day, how many 5 stars do we get rid of because of home-course bias? How many 3.5s or 4s were put in just to sabotage a rival's stats? This then becomes less about user reviews and more a game of lobbying the moderator for intervention in the interest of a "justice" that is ultimately impossible to achieve.
 
Nobody lobbied me in this case; I just made a judgment call. Maybe it was wrong, but it seemed like the right decision at the time. I've left up plenty of against-the-grain reviews that people complained about, because they were at least passable and backed up the rating given. I suppose I could have just let the guy hang, but the review was so poorly written and so far out of left field that it's like the guy didn't even play the same course as the other members.
 
Tim was right to take it down. A reviewer cannot put "nothing" in the pros column. I have reviewed 1-star courses and can still find something positive to say. Not having a single positive thing to say about such a popular course was BS.
 
I agree; it seems like Tim did the right thing here. Reviews are pulled so rarely that it's not even close to an issue.

I had a review pulled. I rated a course in southern Kansas as a zero: the day I went to play, there were no baskets in the area the map said they would be. I don't know whether the baskets had been moved to set up a temp course somewhere else or not. It was not a big deal, as the course has only been played by a couple of others and only rated once, with a .5 or 1. I hope the baskets are back out, and if I make it back there I'll stop by and play a round.
 
If you say in a review that you didn't play the course, you can't really review it. That's why that one was deleted. I've done that a few times when someone posted "couldn't find the course," etc., and didn't play the course for whatever reason.
 
I'd like to suggest a way around this problem. Rather than ranking courses as if the reviews were precise to two decimal places, place courses into groupings that are "tied" and list them in random order within their category. For example, all courses that average 4.75 and higher are in the 5-star category and are equal; they are randomly listed as five-star courses, with any ten appearing on the Home page at any time. All courses with average reviews between 4.25 and 4.74 are 4.5-star courses, and so on. Then there's no reason for shenanigans in a battle for the top spot.
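
A minimal sketch of that bucketing idea, for illustration only: the course averages below are invented, and the half-star cutoffs follow the post above; this is not site code.

```python
import math
import random

def tier_of(avg):
    """Bucket an average rating into its half-star tier
    (4.75+ -> 5.0, 4.25-4.74 -> 4.5, and so on)."""
    return math.floor(avg * 2 + 0.5) / 2

def tiered_top_list(courses):
    """Group courses by tier, shuffling the order within each tier."""
    tiers = {}
    for name, avg in courses:
        tiers.setdefault(tier_of(avg), []).append(name)
    listing = []
    for tier in sorted(tiers, reverse=True):
        random.shuffle(tiers[tier])  # tied courses appear in random order
        listing += [(name, tier) for name in tiers[tier]]
    return listing

# Made-up averages, purely to show the grouping:
courses = [("Idlewild", 4.81), ("Flip City", 4.78), ("Beaver Ranch", 4.55)]
print(tiered_top_list(courses))
# e.g. [('Flip City', 5.0), ('Idlewild', 5.0), ('Beaver Ranch', 4.5)]
```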
 
or you could pull the outliers and see what happens...
 
There are some good ideas here. Whatever solution is found, assuming one is implemented, I would hope it would be an objective system rather than yanking the reviews that "do not fit" on a case-by-case basis. That seems to set a bad precedent.

The Olympics used to drop the highest and lowest scores, in part to keep the East German judge from screwing things up. Something equitable could be worked out here, I'm sure. Drop the highs and lows? Take the middle 90% or 80%? Report only the median (which would result in a lot of ties)? A Q test, perhaps?
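
For what it's worth, each of those schemes is only a few lines. A rough sketch, with invented ratings, of how they compare on a course with one hostile vote:

```python
import statistics

def olympic_mean(ratings):
    """Drop the single highest and lowest rating, average the rest."""
    if len(ratings) < 3:
        return statistics.mean(ratings)
    return statistics.mean(sorted(ratings)[1:-1])

def trimmed_mean(ratings, keep=0.8):
    """Average only the middle `keep` fraction (0.8 = middle 80%)."""
    cut = int(len(ratings) * (1 - keep) / 2)
    return statistics.mean(sorted(ratings)[cut:len(ratings) - cut])

ratings = [5, 5, 4.5, 5, 4.5, 5, 1]        # one "East German judge"
print(round(statistics.mean(ratings), 2))  # 4.29 -- dragged down by the 1
print(olympic_mean(ratings))               # 4.8  -- outlier dropped
print(statistics.median(ratings))          # 5    -- robust, but lots of ties
```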
 
I don't think an overhaul is really necessary. I can count on one hand the # of reviews I've removed from the site, so this is far from a common occurrence considering there are almost 15k reviews now. Maybe at some point dropping the outliers will be something to consider. I will concede that removing this one particular review may not have been the right call, but what's done is done.
 
This whole flap over whose course is #1 is silly. Coming from the Chicago area (where there are no courses even close to the top ten), I look at all of the courses in the top list and dream of vacations. All of these courses must be awesome to get so many 4.5- and 5-star reviews. Which is the best is totally subjective, and doesn't really matter; just being in that group is enough. Our levels of experience and skill vary dramatically, so our views of a particular course will as well.
 
You don't have to defend yourself, Tim G; this review was a joke. As far as people reviewing courses that average 4.8 as a 3.5 to lower the rating, there is nothing you can do. But if they write "Nothing" in the pros column and "the course blows" in the cons, you should be able to get rid of it.

Also, reviews that have 0 thumbs up and 10 thumbs down may be a way to catch reviews that are poor or bogus (I may have a couple of reviews close to that).
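
A hypothetical sketch of that thumbs-vote cutoff; the Review shape and the 0-up/10-down threshold here are assumptions for illustration, not how the site actually works.

```python
from dataclasses import dataclass

@dataclass
class Review:
    author: str
    rating: float
    thumbs_up: int
    thumbs_down: int

def flag_for_moderation(review, max_up=0, min_down=10):
    """Flag a review the community has voted down near-unanimously."""
    return review.thumbs_up <= max_up and review.thumbs_down >= min_down

reviews = [
    Review("local_fan", 5.0, 12, 1),
    Review("drive_by", 1.0, 0, 11),   # meets the cutoff, gets a second look
]
print([r.author for r in reviews if flag_for_moderation(r)])  # ['drive_by']
```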
 