The New York Times is one of the more interesting innovators when it comes to using data visualization to tell a story or make a point.  In particular, the Business section employs a variety of chart forms to reveal what is happening in financial markets, and the Weather Report uses “small multiples” to show 10-day temperature trends for major U.S. cities.

Even more interesting are the occasional illustrations that appear under the heading of “Op-Chart.”  For a few years now, the Times has periodically presented on the Op-Ed page a comparative table tracking “progress” in Iraq on a number of measures, such as electric power generation.

Another impressive chart appeared in “Sunday Opinion” on January 10, 2010.  Titled “A Year in Iraq and Afghanistan,” this full-page illustration provides a detailed look at the 489 American and allied deaths that occurred in Afghanistan and the 141 deaths in Iraq.  At first glance, the chart resembles the Periodic Table of Elements.  Deaths in Iraq (along with the legend) take up the top one-fourth or so of the chart; deaths in Afghanistan occupy the bulk of the illustration.

Each death is represented by a figure, and each figure appears in a box representing the date on which the death occurred.  One figure shape represents American forces, and a slightly different shape signifies a member of the coalition forces.  For coalition forces, the color of the figure indicates nationality.  A small symbol indicates the cause of each death (homemade bomb, mortar, hostile fire, bomb, suicide bomb, or non-combat related).  Multiple deaths from the same event or cause on a given date occupy the same box.

Most dates have only a single death, but a few days stand out as particularly tragic: seven U.S. troops died of a non-combat-related cause in Afghanistan on October 26; eight were killed by hostile fire on October 3; seven by a homemade bomb on October 27; six Italians by a homemade bomb on September 17; and five Americans by a suicide bomber in Mosul, Iraq, on April 10.

The deaths are linked to specific locations on maps of Iraq and Afghanistan.  Helmand Province was the deadliest place, with 79 of the 489 deaths in Afghanistan.  In Iraq, Baghdad was the most dangerous place, accounting for 42 of the 141 deaths in that country.  While Americans account for the largest number of the dead in Afghanistan, 112 were British troops.

There is a wealth of information in this chart, with four pieces of information on every death, but in some ways there is too much detail.  To get at the numbers I provided above, I had to count the figures manually.  There are no summary statistics.  The picture grabs our attention and immediately conveys the magnitude of the price the U.S. and our allies are paying in Afghanistan.  But if we want to act on data, we need a little more than a very clever visual display.  Summaries of the numbers would help here.  It’s useful to know, for example, that 65 of the 141 deaths in Iraq (46%) were due to non-combat-related causes, compared to 48 of the 489 deaths in Afghanistan (10%).  Eighty percent of the fatalities in deadly Helmand Province were due to hostile fire; 57% of deaths in other parts of Afghanistan were caused by homemade bombs (in Iraq there were 19 deaths, or 13% of the total, from homemade bombs).
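As a rough sketch of the kind of summary the chart itself omits, the arithmetic behind those percentages can be written in a few lines of Python; the counts are simply the manual tallies reported above, and the structure is my own, not anything from the Times:

```python
# Manual tallies reported in this post (possibly off by one or two, as the
# disclaimer below notes); the data structure is mine, not the Times'.
totals = {"Iraq": 141, "Afghanistan": 489}
non_combat = {"Iraq": 65, "Afghanistan": 48}

def pct(part, whole):
    """Express part/whole as a rounded percentage."""
    return round(100.0 * part / whole)

for country in totals:
    share = pct(non_combat[country], totals[country])
    print(f"{country}: {non_combat[country]} of {totals[country]} deaths "
          f"were non-combat related ({share}%)")
```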

Two of the creators of this chart, Adriana Lins de Albuquerque (a doctoral student in political science at Columbia) and Alicia Cheng of mgmt.design, produced a slightly different version summarizing the death toll in Iraq for 2007.  That earlier version did not have as much detail about each individual death (location information is not included, for example), but it included some additional causes, such as torture and beheading, that, thankfully, appear to have disappeared.

The advantage of displaying data in this fashion lies in the ability of our brains to form patterns quickly.  The use of color to designate coalition members makes the contributions of our allies apparent in a way that a simple tally might not.  Even without a year-to-year comparison, we can see that Iraq has become, at least for U.S. troops and our allies, a much safer place than Afghanistan.  Additionally, this one chart presents data that, in other forms, might require several PowerPoint slides to communicate: deaths by date, deaths by city or province, deaths by nationality, cause of death, and number killed per incident.

Any complex visual display of data requires making trade-offs.  In this case, for example, the creators arranged the deaths chronologically (oldest first) within each geographic block.  That means that patterns in other variables, such as cause of death or nationality of troops, may be harder to detect at first glance.  The chronological ordering also has layout implications, since on some dates there were multiple casualties.

All in all, it’s a great piece of data visualization that to my mind would be even better with the addition of a few summary statistics.

A disclaimer: I counted twice to get each of the numbers I provide above, but I offer no guarantee that I am not off by one or two deaths in any of them.

Copyright 2010 by David G. Bakken.  All rights reserved.


Around the early ’90s, a new job title began to appear in many of the companies I consulted with: “Manager of Customer Insights” (and variations on this theme).  In many cases, this simply involved “rebadging” existing managers of market research.  The presumed goal of the renaming exercise was worthy: shifting the focus of market research from process to content, and from data collection and analysis to knowledge.

These days, purchasers of market research services are likely to say that the one thing they most want from their research investment is insight.  Ask these buyers what they mean by insight, however, and they may be unable to answer.  A couple of years ago the CEO of a major market research company conducted a series of “client advisory” forums with the directors of market research (or “customer insights”) from several of the firm’s key clients.  Over the course of these sessions, the clients stressed again and again that they wanted their research partners to provide “insights.”  Finally, the research firm CEO asked them to define insight.  Of the dozen or so clients participating in the forums, only two could offer any type of definition.  One of these was a dictionary definition (see two representative dictionary definitions of “insight” at the end of this post).  

In my own experience, clients who complain that research offered no insight often say “I didn’t learn anything new” or “this doesn’t tell me what action I should take.”  This gives us a clue to the nature of insight and, perhaps, a method for achieving it.  At one level, insight is seeing something that we have not seen before: a pattern or a connection between things.  As one example of pattern detection, check out a brief article in the New York Times (“Fast Arriving Fads Quick to Flame Out,” May 17, 2009) about a study conducted by Jonah Berger of the Wharton School and Gael Le Mens of Stanford University that looked at the prevalence of first names, as recorded by the U.S. Census.  Looking at data going back to 1880, they found that the faster a name becomes popular (based on the number per one million children), the faster it declines to “pre-fad” levels.  Names with distinctive spikes (fast rise and decline) included “Dewey” (c. 1900), “Debra” (c. 1960), and “Amy” and “Jeremy” (1970s–1980s).  Some names (Patrick and Katherine, for example) are relatively stable over time.  And if you’ve recently named a child “Ava” or “Aiden,” you’re part of an uptick in popularity for these names that might not last long.  I want to make two points about this study.  First, the “pattern” is only apparent when looking across a large number of names over a long time period (the rise and fall might take a couple of decades or more), and second, visual examination of the data makes it easy to see the pattern (there’s a nifty graphic accompanying the Times article).
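To make that kind of pattern concrete, here is a small, purely illustrative Python sketch; the numbers are invented rather than taken from the census data, and the rise-and-fall measure is a simplification of my own, not the method Berger and Le Mens used:

```python
# Invented counts per one million children, by period; these are NOT the
# actual census figures from the Berger & Le Mens study.
names = {
    "fad-like name": [5, 40, 120, 300, 150, 60, 20, 8],
    "stable name":   [210, 220, 215, 225, 230, 228, 224, 226],
}

def rise_and_fall(series):
    """Average change per period before and after the series' peak."""
    peak = series.index(max(series))
    rise = (series[peak] - series[0]) / max(peak, 1)
    fall = (series[peak] - series[-1]) / max(len(series) - 1 - peak, 1)
    return rise, fall

for name, series in names.items():
    rise, fall = rise_and_fall(series)
    print(f"{name:13s}: rise ≈ {rise:5.1f}/period, fall ≈ {fall:5.1f}/period")
```

The fad-like series climbs quickly and collapses quickly, while the stable series barely moves in either direction, which is the relationship the study reports when looking across many names at once.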

In fact, proper visual display of quantitative information is sometimes crucial to drawing insight from data.  Edward Tufte makes this point powerfully in his description of the “failed” data analysis that preceded the disastrous launch of the space shuttle Challenger (see “Visual and Statistical Thinking:  Displays of Evidence for Decision Making” by Edward R. Tufte, 1997).

Patterns are not detectable when we look at data elements in isolation.  The typical survey-based research report is a linear summary of the answers to the survey questions, sometimes with responses reported by various subgroups (i.e., “banner points”). While it’s possible that such simple summary analysis is informative (“I didn’t know that so many of my customers are also buying from my competitors!”), patterns emerge only when we can see how the answers change across relevant dimensions (e.g., time, geography, attitudinal segments, and so forth).  
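As an illustration of looking across dimensions rather than at isolated answers, here is a toy cross-tabulation in Python with pandas; the column names and values are invented for the example:

```python
import pandas as pd

# Toy survey responses; the column names and values are invented.
responses = pd.DataFrame({
    "segment":   ["value", "premium", "value", "premium", "value", "premium"],
    "region":    ["east", "east", "west", "west", "east", "west"],
    "purchased": [1, 0, 1, 1, 0, 1],
})

# A flat, one-number summary answers "what share purchased?"...
print(responses["purchased"].mean())

# ...but any pattern only shows up when the answer is crossed with other
# dimensions (the survey equivalent of banner points).
print(pd.crosstab(responses["segment"], responses["region"],
                  values=responses["purchased"], aggfunc="mean"))
```

The overall purchase rate hides the differences by segment and region; the crosstab is where a pattern starts to become visible.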

There’s another aspect to insight.  Sometimes we “see” something that completely changes our understanding.  The Tower of Hanoi puzzle provides an example.  You have a wooden base with three identical dowels.  On the first dowel are stacked several disks of increasing size (smallest on top, largest on the bottom).  The task is to move all of the disks to the third dowel, but… you can move only one disk at a time AND you cannot place a larger disk on top of a smaller disk.  Solving this problem requires a specific insight: that you can move disks back and forth among all three dowels, as long as you move only one at a time and never put a larger disk on top of a smaller one.  The framing of the puzzle, in both the physical design and the task instructions, appears to lead most people initially to attempt a solution in which disks are only ever moved off the first dowel.  After all, if the solution were immediately obvious, it would not be much of a puzzle.
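For readers who like to see that insight written down, here is a minimal recursive sketch in Python (the names are mine, not part of the puzzle); the point is that the spare dowel serves as a way station at every level, which is exactly the move the puzzle’s framing obscures:

```python
def hanoi(n, source, target, spare):
    """Print the moves that transfer n disks from source to target."""
    if n == 0:
        return
    # Park the n-1 smaller disks on the spare dowel...
    hanoi(n - 1, source, spare, target)
    # ...move the largest remaining disk directly...
    print(f"move disk {n} from {source} to {target}")
    # ...then bring the smaller disks over from the spare dowel.
    hanoi(n - 1, spare, target, source)

hanoi(3, "dowel 1", "dowel 3", "dowel 2")
```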

While we often can find patterns and connections using systematic, analytic approaches (like the analysis of census data on first names), the type of insight required to solve a puzzle like the Tower of Hanoi is qualitatively different–much more like an “aha” or “eureka” experience.  Once you’ve figured out the solution, you can solve similar puzzles quickly by recognizing the form of the problem.  

“Aha” insights sometimes happen in market research, but in my experience they are most likely to occur when we use qualitative methods, such as case histories, in-depth interviews, and immersion.

The second client complaint (“this doesn’t tell me which action to take”) usually reflects a failure to align the research with the business problem.  Most often, the link between the research activities and the actions available to the firm is missing.  I think this occurs because, by the time the customer insights department begins working with the market research partner, the process is two or three steps removed from the business problem.  It’s important to have a line of sight from the data to the actions that the firm can take.  Consider the auto industry as an example.  Once a model is introduced (that is, the vehicle is designed and engineered, the assembly line has been built, parts have been ordered, and so forth), the automaker has only two ways to impact the choices of consumers: advertising and price.  An attitudinal segmentation at this point might be nice, but unless it directly informs advertising or pricing decisions it’s not likely to help a manufacturer decide on a course of action.

So, it’s important to specify, up front, what we mean by “insight.”  If we’re looking for new knowledge, we need to know “what we know” as well as what we don’t know–and it’s important for clients to share this knowledge with their research partners.  We also need to recognize that insight often results from looking across multiple sources of information, enabling us to see patterns or connections that are not otherwise apparent.

Copyright 2009 by David G. Bakken

Here are the dictionary definitions of insight:

 From the Concise Oxford English Dictionary (Oxford University Press, 2004).  

insight n. 1 the capacity to gain an accurate and deep understanding of something; an understanding of this kind. 2 (Psychiatry) awareness by a mentally ill person that their mental experiences are not based in external reality.

From Merriam-Webster’s Collegiate Dictionary, 10th Edition (Merriam-Webster, 1993).

insight n. 1: the power or act of seeing into a situation: PENETRATION  2: the act or result of apprehending the inner nature of things or of seeing intuitively  syn see DISCERNMENT.