Hacker's Central Blog

Looking Under the Hood of Golfweek's 2011 Best Canadian Courses Ratings
Wednesday, 27 April 2011 15:00

Now, I think it is great that Golfweek ranks golf courses (we're all about golf course ratings at Hacker's Guide). In presenting the 2011 Best Canadian Golf Courses, the writer mentions which architects built the courses. That's all well and good, but like celebrity chefs, it's not necessarily about how great the architect is (some definitely are); if a famous "name" is stamped on a course, it can sell rounds (see Pete Dye, Greg Norman, Tom Fazio, Jack Nicklaus, etc.). This factor alone shouldn't determine whether the course is good or not. Most golfers wouldn't know Pete Dye or Tom Fazio if they ran into them on the street.

Another thing that bugs me: when I did a quick review of the Top 5 "modern" Ontario courses on Golfweek's list, I found that only one of them was available for "hackers" to play.

When I searched around for the underlying rating system that Golfweek uses, it wasn't very obvious how they did it. Not to pick on Golfweek, but if you look at an example of what they rate a course on, every one of their 10 factors is on-course related, from tree management to conditioning. What about the clubhouse, the signage, the practice facilities, the food, the customer service, etc., etc.?

If Donald Trump had a clue about golf architecture, he'd build his own courses, but alas he can only buy them (with lots of other people's money). Sorry, I digress.

Also, who rated these courses and put scores to paper? How many people did it? Is this the opinion of only one person or of a group? If there were multiple raters, are the listed scores averages?

My point is that a publication that purports to rate golf courses, and in turn be touted as an authority, should provide transparency about what it rates on, and its system should be broadly based, not specific to the architecture of the course.

Additionally, beyond the 60 courses that made this Golfweek list, how many courses weren't rated at all? In the province of Ontario alone, there are over 840 golf courses. If that is true, the 32 Golfweek courses located in Ontario represent under 4% of all the courses in that province (32 out of 840 is about 3.8%)!

Golfers make buying decisions based on reviews, and calling a course the best in Canada shouldn't rest only on the golf course architecture or the architect. How the trees are managed shouldn't be a major consideration for a rating. What happens when a tornado comes through and all the trees are gone? Is this top-rated course now no longer worthwhile?

When I go to TripAdvisor.com for vacation advice or Amazon.com for book, CD or electronics advice, I'm looking for lots of reviews from regular people that bought the product or visited the destination. I make a judgment based on dozens or hundreds of reviews, not one. When I read Consumer Reports to get car ratings, I know exactly what they rated my new car on because their system is transparent.

When I read a golf review, I don't want to know who the architect was that built the course, but what kind of experience I will have when I get there. Each golfer values different things about a golf course. Some may like the beautiful vistas and others prefer a great 19th hole. Many will look at the greens fees. Regardless, a golf review should be much more extensive than 10 subjective questions on the setting, the hole routing, the trees, the hole variety or the 'walk in the park' test.

It never ceases to amaze me that golfers will take a rating at face value without looking under the hood and learning what a course is actually being rated on. Many wonderful courses don't even get considered because they're not on a publication's list.

For a comparison, check out our Hacker's Guide Rating System.
