The most widely used marketing tool for wine is the score.
In just about every wine shop you’ll find shelf tags stating that the wine above received a score from some self-anointed wine expert, a number that supposedly indicates its quality.
But have you ever asked: On what was the score based? Is a wine with a score of 88 really 2 points better than one getting an 86?
Or, more importantly, just how was a score determined? How qualified is that expert at rating wine? And was that wine tasted blind or with sight of the label?
This last question is germane if you ask the additional question: Is it possible that a part of the score is composed of preconceived notions based on the brand on the label, and the price?
Have you ever seen a wine selling for $5 scored above 90 points? I haven’t either. So the following story might be of interest.
Three years ago, at the Riverside (Calif.) International Wine Competition, which I coordinate, three separate panels of four wine judges each evaluated various groups of wines double blind. By consensus, the panels ruled that three different Sutter Home wines deserved gold medals.
The judging was done without regard to price. Thus, the Sutter Home wines were up against wines selling for $20 to $30 a bottle -- or more.
Each of the three Sutter Home wines had a suggested retail price of $5.
That, among many reasons, is why I trust the results of wine competitions a lot more than I do the scores of disparate wine experts who rarely, if ever, say how they judge the wines they rate.
Imagine the following two scenarios:
1: The setting is a lavish banquet, the wine is served in crystal goblets, and the food is exquisite.
2: The setting is a high school football game in the snow, the wine is served in a plastic cup, and the food is cold hot dogs.
Now assume that the exact same wine is served in both. Which setting is better suited for enjoying and analyzing the wine?
So when a wine expert fails to say when, where, and how a wine was evaluated, how can you know what went into the eventual score?
Far too many factors affect the awarding of a score. That is why I have never used scores when rating wine; I simply state what I perceive to be excellent -- and price often factors into my ratings.
When you see the results of multiple wine competitions and note that one wine received three gold medals, two silvers, and a bronze from various panels tasting wines without sight of the label, isn’t that stronger verification of quality than any single number awarded hastily can ever be?
Wine of the Week: 2007 Ravenswood Zinfandel, California “Vintner’s Blend” ($10) -- Lovely wild berry spice aroma with hints of blackberry jam and a trace of mint. Only 13.5 percent alcohol and excellent balance in this often-discounted red wine.