Data Visualization Lesson 6: The Ultimate List of Dos and Don’ts

As part of this Research Access lesson series, we’ve explored best practices in the art and science of data visualization.  Here, I round out that discussion with the ultimate list of dos and don’ts:


  1. Start with purpose.  Before you select a data visualization tool, take a moment to put into words what you hope to show with the data.  By clarifying at the outset what you think you are doing with data visualization, it is much more likely that you will…
  2. Select the right graphic.  Like tools in a toolbox, graphics and charts are better suited to some tasks than others.  Pie charts are excellent at showing a whole with between two and four distinct categories; time series come alive in simple line graphs; and bar charts make visual comparisons of multiple groups a breeze.
  3. Know thy audience (Grandma is always in the front row).  Even when presenting to a highly technical audience, a non-expert in the room should be able to explain (at a general level) what is going on in the graphics.

  4. Keep it simple.  Resist the urge to “fancy up” data with graphic options that seem more aesthetically pleasing.  Go with what makes sense to most people.
  5. Winning visuals have titles.  Simple, straightforward titles attract viewers.  Include a short title that either describes the data (for example, the relationship between voter turnout and weather) or the specific pattern you wish to show (e.g., estimated number of potential voters deterred by rain or snow on Election Day).  Election season bonus here: hat tips to The Weather Channel article and to excellent research by Brad Gomez, Thomas Hansford, and George Krause.
  6. Be a slave to labels.  The y-axis and x-axis should be clearly labeled, and preferably will include a zero baseline.  Categories and trendlines should be described in an easily accessible legend.  Relevant information about the data (N, sampling method, margins of error, variable measures, or model specifications) should be provided with the graphic in a footnote or sidebar.
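To make these dos concrete, here is a minimal sketch (assuming Python with matplotlib, and using entirely made-up turnout numbers for illustration) of a simple line graph for a time series that carries a descriptive title, labeled axes, a zero baseline, a legend, and a data footnote:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

# Hypothetical data: election years and estimated turnout (% of eligible voters)
years = [2016, 2018, 2020, 2022, 2024]
turnout = [58.1, 49.0, 62.8, 46.1, 60.5]

fig, ax = plt.subplots()
ax.plot(years, turnout, marker="o", label="Estimated turnout")

# Titles, axes, labels, legends: tell viewers what they are looking at
ax.set_title("Estimated voter turnout by election year (hypothetical data)")
ax.set_xlabel("Election year")
ax.set_ylabel("Turnout (% of eligible voters)")
ax.set_ylim(bottom=0)            # zero baseline on the y-axis
ax.legend(loc="lower right")

# Footnote with the data N, sampling method, margin of error, etc.
fig.text(0.01, 0.01, "N = 1,000 (simulated); margin of error \u00b13%.", fontsize=8)
fig.savefig("turnout.png")
```

A time series calls for a line graph here; the same labeling habits (title, axis labels, legend, footnote) apply whatever chart type you pick.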


Now for the don’ts.  Never:

  1. Assume viewers will automatically “get” whatever graphic you create.  Understanding doesn’t happen in a vacuum – it is the researcher’s responsibility to distill data in a way that makes clean, intuitive sense to viewers, and to provide visual tools that guide the audience to desired insights or conclusions (this is why titles, axes, labels, and legends are so important).
  2. Overstate or oversimplify the data.  The conclusions that viewers reach are highly dependent on how specific the data are.  Accordingly, the best graphics resist overstating or oversimplifying.
  3. Make viewers look too long and/or think too hard in order to get the point you intend.  Time and attention are limited.  Poorly designed visuals risk confusing, irritating, or even alienating viewers (sample internal monologue: “So y is the change since last year and x is, wait, what is x?  And what’s that line mean?  Is this good?  Or bad?  I don’t know – it’s probably not important.  Did I turn off the coffee pot this morning?  Need to send that meeting request.  My foot hurts.  I should check my fantasy team standings.”)  Keep viewers’ attention – tell them what you are showing (titles, axes, labels, legends) and inspire their desire to know more.

Data Visualization Lesson 1: Examine the Y-Axis
Data Visualization Lesson 2: Think of Grandma
Data Visualization Lesson 3: Abela’s Rubric
Data Visualization Lesson 4: The Best Pies are Desserts
Data Visualization Lesson 5: Ninth Grade Algebra Wasn’t Worthless After All


About Dana Griffin

Dr. Dana Griffin is a consumer insights and decision-making expert based in Seattle. An award-winning researcher and data educator, she earned her Ph.D. in political science from the University of Minnesota with concentrations in research methodology and psychology. With extensive experience in both qualitative and quantitative methods, Dana served as the Director of Survey Research at one of the largest newspapers in Minnesota. Prior to joining the private sector as a research and analytics specialist, she was a faculty member at Furman University and the University of Nebraska-Lincoln.