What’s in Wine Competition Results? More Texas Winners in 2011 Lone Star Competition
There has been a lot of banter on the Internet (blogs and Twitter) about the validity of wine competitions. I’ve judged in several wine competitions; most were commercial, but some were amateur competitions that I find particularly interesting. Several judging methods are used in these competitions. Some use a summation of individual numerical scores (1-5 or the U.C. Davis 20-point system), while others use a more communal method in which judges hold up cards with their decision (no medal, bronze, silver or gold). Others follow the primary judging with a second round restricted to just the top-scoring wines from the initial round.
Some of the more prestigious wine competitions require judges to pass tasting examinations. In a competition that I ran in Houston for several years, I had each wine panel taste a series of wines (some flawed) to evaluate the judges and to better understand the variance between panels. However, most competitions do not do these things, opting instead to seek out educated tasters from the wine trade, certified sommeliers or experienced wine writers.
Given the wide latitude in how wine competitions are run, how can consumers trust wine competition results? After all, there is no A-list of competitions or even an A-list of judges.
An interesting study was published on this question (Robert T. Hodgson, “An Examination of Judge Reliability at a Major U.S. Wine Competition,” Journal of Wine Economics, Volume 3, Issue 2, Fall 2008, Pages 105–113). The study found that at a major wine competition over several years, only about 10 percent of the judges were able to replicate their score within a single medal group. Another 10 percent, on occasion, scored the same wine anywhere from Bronze to Gold. Judges tended to be more consistent in what they didn’t like than in what they liked. These results certainly call into question the ability of wine competitions to select the best wines, or at least the value of a single judge’s individual scores.
Despite these seemingly disheartening results, the major counterbalancing factor in support of wine competition results is that wines are commonly evaluated by a panel of judges. While the abovementioned study found that individual judges lacked some reliability, the overall results of a panel of judges showed much higher reliability and consistency. The multiple viewpoints of a judging panel provide checks and balances that generally overcome the limitations of single-judge reliability.
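For readers who like to see the arithmetic behind that claim, here is a rough simulation of the idea (a sketch of the general statistics only, not data from the Hodgson study): each judge’s score is modeled as the wine’s “true” quality plus random judge-to-judge noise, and averaging a panel of judges shrinks that noise. The 20-point scale, the noise figure, and the five-judge panel size are all hypothetical numbers chosen for illustration.

    import random

    random.seed(42)

    TRUE_QUALITY = 15.0   # hypothetical "true" score on a 20-point scale
    JUDGE_NOISE = 2.0     # hypothetical judge-to-judge spread, in points
    PANEL_SIZE = 5        # hypothetical panel size
    TRIALS = 10000

    def judge_score():
        # One judge's score: the true quality plus random personal noise.
        return random.gauss(TRUE_QUALITY, JUDGE_NOISE)

    def panel_score():
        # A panel's score: the average of its judges' independent scores.
        return sum(judge_score() for _ in range(PANEL_SIZE)) / PANEL_SIZE

    def spread(scores):
        # Standard deviation of a list of scores.
        mean = sum(scores) / len(scores)
        return (sum((s - mean) ** 2 for s in scores) / len(scores)) ** 0.5

    singles = [judge_score() for _ in range(TRIALS)]
    panels = [panel_score() for _ in range(TRIALS)]

    print("single-judge spread: %.2f points" % spread(singles))
    print("5-judge panel spread: %.2f points" % spread(panels))

Run as written, the panel scores cluster more than twice as tightly around the wine’s true quality as any single judge’s scores do, which is the statistical backbone of the checks-and-balances argument above.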
In general, my experience agrees with the reliability of the judging-panel concept. Even though I may not have always agreed completely with the results of a judging panel, I believe the multiple-viewpoint approach results in a more accurate representation of how a wine will be received by a group of consumers and, in fact, by the market as a whole.
Another way I look at wine competition results is to evaluate the results from multiple competitions, and when it comes to evaluating Texas wines, I like to see the results from both in-state and out-of-state competitions. What I’m looking for is consistency in the results. Rarely have I seen a wine get no medal in one competition and then score big in another, if it comes from the same vintage and batch. However, I will admit that the competition gets stiffer out of state, since many of those competitions have more experienced judges and also attract a wider range of wines from premium producers worldwide.
My recommendation for Texas wineries is to establish their medal winning credentials at in-state competitions like the Lone Star International Wine Competition first. Then, they should explore out-of-state wine competitions that attract the strongest competition in the varietals and styles of wines they want to make.
I also hear people say that wine competition results are “rigged” or “fixed”. I’m sure you can find “pseudo-wine competitions” like that, but the vast majority of major wine competitions around the world are run by competent and honest people and have checks and balances in place. For example, in every competition where I’ve judged, the wines were double blinded: neither the server nor the judging panel knew the producer of the wine. The wines came pre-poured with only a numbered tag identifying each wine. When this number was matched to the judging sheet, I could find the wine’s category (and perhaps its percent alcohol, residual sugar, and major blending constituents).
Recently, I judged in the Lone Star International Wine Competition and was impressed with the improving caliber of the wine judges since I first judged it over a decade ago. What I look for in a professional wine judge is someone who can judge a wine not based on his or her likes or dislikes in terms of grape variety or wine style, but analytically. These are people who ask a few important questions before arriving at their scores. The major questions are: Is the wine flawed or not? Is the wine a good and valid expression of the grape type or wine style? Does the wine have relevance or commercial value in today’s global marketplace? Will it provide a pleasant experience for consumers? Additionally, these judges can justify their scoring if asked by others on the panel and don’t simply say “I just don’t like it” or “I love this wine”.
The complete list of Texas winners from the 2011 Lone Star International Wine Competition is available by clicking 2011 LSIWC TEXAS WINNERS. Please note that this list does NOT give appellation, and some Texas wineries submit wine made from out-of-state grapes or juice. If you are interested in drinking Texas appellation wines (i.e., those made from at least 75% Texas fruit), you will have to ask further questions or inspect the labels for the word “Texas” or one of the legal sub-appellations like “Texas Hill Country”, “Texas High Plains” or “Mason County”.
Keep in mind that not every winery that submitted a wine to the Lone Star Competition came home with a medal. Did some worthy wines, or even your favorites, not receive a medal? Perhaps. Do all wineries submit all their wines to competitions? Of course not.
Until someone comes up with a better process for evaluating wines and rewarding the best-of-the-best, this is the process we have, and it’s not a bad one for consumers to monitor. Just the same, I recommend that consumers also find a way to taste the widest array of wines possible for themselves and compare the results of their tastings with those of the major wine competitions (most are now easily available via the Internet). If your results come out similar to the competitions’, then great: use the results of wine competitions to guide your selections. However, if you find your palate dances to the nuances of a different wine or style, then of course, follow your palate.
I was asked by someone on Twitter how the results of the Lone Star Wine Competition compare to those of other competitions (particularly international wine competitions). Well, this week Flat Creek Estate got the news that their 2010 Viognier was awarded a DOUBLE GOLD. It received a Silver Medal in the Lone Star Wine Competition. It appears that the Texas judges were harder on it than the SFO judges.