The "Score Revolution"

I recently attended a seminar about wine reviewing during the Washington Wine Commission's Seattle Taste Washington weekend. Panelists included Sara Schneider of Sunset Magazine; Becky Murphy, director of the Dallas Morning News wine competition; W. Blake Gray, Gray Market Report blogger and moderator; and Sean Sullivan of Washington Wine Report. Each described their system for reviewing wines -- 20 points, medals, 100 points, and stars, respectively.

A debate about the value of the 100-point score ensued. Gray argued that he must use the system because that is what the publications for which he writes require. Others argued that this scale is skewed, misrepresents some wines (e.g., the lack of 90+ point Sauvignon Blancs relative to the number submitted), and is ultimately damaging to the industry and consumers. Christophe Hedges, from Hedges winery, advocates a complete departure with the "Score Revolution." He eloquently described his goal of eliminating all scoring and passed out the bumper stickers pictured below. A fun way to create awareness of the effort, but I was surprised to find that the website to which they refer is under construction, with no message or link to the Hedges page. (As a marketer, I always recommend having your materials complete and consistent before launching anything.)

As is typical in an argument, I seek to understand both sides (I'm not a good debater!). In this particular situation, given my present and prior posts in the industry, I can personally relate. As a wine marketer, if a client's wine scores above 90, I am the first to post it across a wide range of marketing materials -- Facebook, product sheets, e-newsletters to consumers and trade (with additional content, of course), the website, and so on. Good scores sell wine, especially to distributors and consumers. Period.

However, as a former wine buyer/sommelier, professional taster, and competition judge, I do not like this scale. It doesn't make sense to me given its breadth -- what is the difference between a 10 and a 69, for example? I don't want to drink either! A buyer is paid to evaluate, so it's not surprising that she would have her own scale. (When I worked selling wine, I respected this -- I always had a score or two handy, but I did not immediately present it to the buyer. This is simply knowing your audience.)

I've always used a simple 1-2-3 scale, where 1s are average/not distinctive, 2s are solid/interesting/recommended, and 3s are something special. There are, of course, rare circumstances where a flawed wine is a 0 and a spectacular bottling is a 3+, but the vast majority of wines fall in the 1 to 3 range.

I once worked for an owner who insisted that all wines had to be above 90 points. This frustrated me given the sea of beautiful 87-89 wines that would never make the cut -- I used to joke that I would open a shop next door stocked with all of the sub-90 "rejects." It also encouraged distributor reps to be less than accurate with their score reporting, which in turn created more work for me, since I had to go into each publication's system to verify scores. Despite working within a policy with which I disagreed, I respected it because it was the owner's prerogative.

As an industry member, I at first held conflicting positions given my experiences. At the end of the day, the only side of this equation that truly matters is the business side -- once again, demand is the star. If buyers or consumers want scores, scores will stay. If they decide to forgo them thanks to better wine education and more polished palates and confidence, the situation will change. I think it's wonderful that there is a range of scoring types and styles -- it's a free market of sorts, and it should remain so.


Time Management & Customer Service

Today I read an interesting article, "Inanity of Immediate Response," by Daniel Markovitz, Stanford and Ohio State professor. While written for members of the Institute of Management Consulting (IMC), it also applies to wineries, which are in the hospitality industry first and wine production second.

Markovitz laments the common consultant cry, "I didn't get anything (strategic) done today because I had to respond to my clients." We all get a flood of communication these days, and it is not uncommon to feel overwhelmed by its volume and frequency, and by the urge to respond. Since it doesn't look like the tide will recede, those of us serving clients and customers need to rethink how we process all of this communication.

Your inbox and phone should not plan your day -- your brain should. Just because you receive a communication doesn't mean you need to interrupt your work, particularly if you are deep in higher-level strategy or an important project. (I typically schedule strategic thinking and planning work for the early morning and conduct it with my calendar and email account logged out, to resist the urge to check them.)

Instead of constantly checking email throughout the business day, do so on a regular, scheduled basis -- say, every hour or two, or once in the early part of the day and again at the close. If someone has a true emergency, they will phone.

When phone calls with requests come in, schedule the follow-up in your calendar and communicate the timeline to your customer while you have their attention (versus the type-A urge to drop everything and handle incoming needs immediately). Excellent service does not have to be immediate.

Managers should speak with staff about communication policies and develop a corresponding protocol -- for example, phone calls are returned the same business day and emails within 24 hours. In establishing a process, you will help your team prioritize, which will enhance efficiency, service, and, ultimately, case sales!