4.14.2011

The "Score Revolution"


I recently attended a seminar about wine reviewing during the Washington Wine Commission's Seattle Taste Washington weekend.  Panelists included Sara Schneider of Sunset Magazine; Becky Murphy, director of the Dallas Morning News wine competition; W. Blake Gray, Gray Market Report blogger and moderator; and Sean Sullivan of Washington Wine Report.  Each described their system for reviewing wines -- 20 points, medals, 100 points and stars, respectively.

A debate about the value of the 100-point score ensued.  Gray argued that he must use the system because that is what the publications for which he writes desire.  Others argued that this scale is skewed, misrepresents some wines (e.g., the lack of 90+ point Sauvignon Blancs relative to the number submitted), and is ultimately damaging to the industry and consumers. Christophe Hedges, from Hedges winery, advocates a complete departure with his "Score Revolution". He eloquently described his goal of eliminating all scoring and passed out the bumper stickers shown below.  A fun way to create awareness of the effort, but I was surprised to see that the website to which they refer is under construction, with no message or link to the Hedges page. (As a marketer, I always recommend having your materials complete and consistent before launching anything.)


As is typical in an argument, I seek to understand both sides (I'm not a good debater!). In this particular situation, given my present and prior posts in the industry, I can personally relate.  As a wine marketer, if a client's wine scores above 90, I am the first to post it on a wide range of marketing materials -- Facebook, product sheets, e-newsletters to consumers and trade (with additional content, of course), the website, etc.   Good scores sell wine, especially to distributors and consumers. Period.

However, as a former wine buyer/sommelier, professional taster and competition judge, I do not like this scale.  It doesn't make sense to me given its breadth -- what is the difference between a 10 and a 69, for example?  I don't want to drink either!  A buyer is paid to evaluate, so it's not surprising that she would have her own scale.  (When I worked selling wine, I respected this -- I always had a score or two handy, but I did not immediately present it to the buyer -- this is simply knowing your audience.)

I've always used a simple 1-2-3 scale, where 1s are average/not distinctive, 2s are solid/interesting/recommended and 3s are something special.  There are of course rare circumstances where a flawed wine is a 0 and a spectacular bottling is a 3+, but the vast majority of wines fall in the 1 to 3 range.

I once worked for an owner who insisted that all wines had to be above 90 points.  This frustrated me given the sea of beautiful 87-89 point wines that would never make the cut -- I used to joke that I would open a shop next door with all of the sub-90 "rejects".  It also encouraged distributor reps to be less than accurate with their score reporting, which in turn created more work for me, as I had to go into each publication's system to verify scores.  Despite working within a policy with which I disagreed, I respected it because it was the owner's prerogative.

As an industry member, I at first held conflicting positions given my experiences.  At the end of the day, the only side of this equation that truly matters is the business side -- once again, demand is the star.  If the buyer or the consumer wants scores, they will stay.  If they decide to forgo them given better wine education and more polished palates/confidence, the situation will change.  I think it's wonderful that there are a range of scoring types and styles -- it's a free market of sorts, and it should remain so.
