Online golf addicts have undoubtedly seen (and perhaps even authored) the barrage of complaints in forums and blogs about the Golf Digest Hot List. Every year you hear many of the same criticisms: the list rewards manufacturers that advertise in Golf Digest or have personal relationships with the editors, the ranking process favors larger equipment companies, and panelists don’t thoroughly test every club but instead gravitate to the ones they know best. And hey, what were each club’s numerical scores by category? Can we see the scores for all products tested, not just those that made the Hot List?

No ranking system is perfect, of course. As reviewers, we know this as well as anyone. Like scoring in boxing or figure skating, Hot List scoring is a blend of objective data and subjective opinion. Personally, I can’t imagine having to rank hundreds of clubs in just three days (‘Want to Become One of Our Next Hot List Testers?’, Golf Digest, February 2011, p.113). Talk about an exhausting process!

Let’s take a look at the Hot List process for the average player panelist, who for 2011 had a handicap index of 7.0. How many swings would it take that individual to reliably score and rank a club for both Performance and Look/Sound/Feel (we will assume player panelists do not score Innovation or Demand)? Golf Digest used 10 swings per club in its evaluations of new-groove wedges and chippers (‘How The New-Groove Wedges Affect You’ and ‘Does Using a Chipper Make You a Chopper?’, Golf Digest, February 2011, pp. 127, 130), but let’s say five swings are all our average player panelist requires.

We don’t know how many entries there were for the 2011 Hot List, but Ken Morton Jr., who has been involved in every Golf Digest Hot List, offers an estimate of 1,000 clubs (‘Making of the Golf Digest Hot List,’ NCGA Magazine, Winter 2011). Assuming each player tests each club (a big assumption, but given the subjective nature of the testing, isn’t that the only way to legitimately rank clubs?), this means testing an average of about 330 clubs a day to finish the reviews in three days (across six categories – drivers, woods, hybrids, irons, wedges and putters). At five swings apiece, this totals 1,650 strokes, or the equivalent of over 18 rounds’ worth of golf…each DAY! Our heads would spin from trying to keep all of the competitors straight. I wonder if Mead, Advil and Hirzl are sponsors of the event. Hmmm.
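For anyone who wants to sanity-check that math, here’s a quick Python sketch of the calculation. The 1,000-entry figure is Morton’s estimate, the five swings per club is our assumption from above, and the 90 strokes per “round’s worth” of golf is our own rough number for comparison, not anything published by Golf Digest:

```python
# Back-of-the-envelope check of the Hot List testing workload described above.
# Assumptions (estimates from this post, not official Hot List figures):
# ~1,000 entries, a 3-day testing window, 5 swings per club, and roughly
# 90 strokes counted as one "round's worth" of golf.

ENTRIES = 1000           # Ken Morton Jr.'s estimate of total entries
DAYS = 3                 # length of the testing window
SWINGS_PER_CLUB = 5      # our (generous) assumption per club
STROKES_PER_ROUND = 90   # rough strokes per round, used only for comparison

clubs_per_day = round(ENTRIES / DAYS, -1)            # ~330 clubs a day, rounded to the nearest ten
swings_per_day = clubs_per_day * SWINGS_PER_CLUB     # 1,650 swings a day
rounds_per_day = swings_per_day / STROKES_PER_ROUND  # ~18.3 rounds a day

print(f"Clubs per day:  {clubs_per_day:.0f}")
print(f"Swings per day: {swings_per_day:.0f}")
print(f"Rounds per day: {rounds_per_day:.1f}")
```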

In the end, the judges provide us with the 2011 Hot List: an easily digestible group of, for drivers as an example, one Editor’s Choice winner, eight Gold award winners, and six Silver award winners. With over 300 driver entries (‘Making Simple Overly Complex’, Golf Digest, February 2011, p.110), less than 5% took home an award. Talk about competition! This is quite a different story from 2009, when 471 separate entries were considered and a whopping 116 made the Hot List (see Golf Digest 2009 Hot List: The Process & Glossary).
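The award math is just as quick to verify. Here’s a similarly rough sketch, treating “over 300” driver entries as exactly 300 to get an upper bound on the award rate:

```python
# Quick check of the 2011 driver-award percentage cited above.
# Assumption: "over 300" driver entries is treated as exactly 300,
# which makes the computed rate an upper bound.

awards_2011 = 1 + 8 + 6      # Editor's Choice + Gold + Silver = 15 drivers
entries_2011 = 300           # lower bound on the number of driver entries

rate_2011 = awards_2011 / entries_2011   # 0.05 -> at most 5% of entries
rate_2009 = 116 / 471                    # 2009: ~24.6% of all entries made the list

print(f"2011 driver award rate: {rate_2011:.1%} (or lower)")
print(f"2009 overall Hot List rate: {rate_2009:.1%}")
```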

We look forward to watching Golf Digest’s continued refinement of its Hot List process. Our hopes for Hot List enhancements include more detail in select club summaries, such as the write-up for the Editor’s Choice winner.

We also believe it would be worthwhile to include comments from the review that could impact a consumer’s buying decision. The Fourteen iron set is certainly expensive, but knowing the irons are manufactured in the United States could very well influence a buyer’s choice.

If Golf Digest wants to go over-the-top, it could include advanced testing data on its website, such as audio files of club sounds and an assortment of images of each club, particularly the view from a player’s perspective at setup. Providing this information would allow readers to score some of the subjective criteria themselves instead of relying on the judges. Perhaps a reader prefers a tinny-sounding driver with a squared head, but the judges don’t. Maybe one day users will even be able to submit their own scores, which could then be blended with the judges’ ratings in categories that are harder for readers to evaluate, such as feel (that would require access to all of the clubs and a range), to produce a more personal overall score. Hey, a reader can dream, right?

Until then, we’ll keep imagining the day when the full scoring detail for every club tested is released, and thinking about the conversation the Hot List judges will get to have with the manufacturer whose latest and greatest club scores a 32 out of 100.
