This two-day executive education course will give journalists a working knowledge of key principles that allow them to ask the questions that separate credible from non-credible evidence, distinguish causation from correlation, and avoid common pitfalls in interpreting quantitative evidence.

Through a combination of in-class instruction, pertinent examples, and experiential group exercises, participants leave the course with a transformed understanding of how to extract credible, substantive, and reliable information from quantitative analyses and data.

Day One

Morning, 9am – 12:30pm: Creating Shared Language

Key Takeaway

It can be difficult to distinguish correlation from causation, but knowing how to do so is essential: when we get it wrong, we reach false conclusions. The stakes are high!

Agenda

  • Introductions
  • Is this causal?
  • Break
  • Correlation and Causation: What are they and what are they good for?
  • Correlation and Causation: Why knowing the difference matters

Lunch, 12:30 – 1:30pm

Afternoon, 1:30 – 5pm: Correlations, the Cornerstone of Quantitative Analysis

Key Takeaway

Correlation is the cornerstone of all quantitative analysis. Establishing a correlation is not as easy as it may seem; you have to look at the right kinds of evidence.
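
To make the point concrete, here is a small illustrative Python sketch (a made-up example, not part of the course materials): without variation in the factor you care about, there is no correlation to estimate at all.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical newsroom question: is a training program correlated with
    # fewer reporting errors?

    # Case 1: no variation. Every reporter took the training, so the data
    # cannot reveal any relationship (NumPy warns and returns nan).
    training_all = np.ones(200)
    errors = rng.poisson(3, size=200)
    print(np.corrcoef(training_all, errors)[0, 1])

    # Case 2: variation. Some reporters took the training and some did not,
    # so a correlation can actually be estimated.
    training_some = rng.integers(0, 2, size=200)
    errors = rng.poisson(3 - training_some, size=200)
    print(np.corrcoef(training_some, errors)[0, 1])

The only thing that changes between the two cases is whether the hypothetical training variable varies; that variation is what makes a correlation estimable in the first place.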

Agenda

  • Post-Mortems and Lessons Learned
  • Is this a correlation?
  • Correlation Requires Variation: Challenges to establishing correlation
  • Break
  • Post-Mortems revisited
  • Tying it all together

Welcome Reception

Day Two

Morning, 9am – 12:30pm: Questioning Causality

Key Takeaway

Sometimes correlation is causation, but you can't trust a causal interpretation of a relationship without accounting for confounders. When using evidence of a correlation to gauge the effect of a planned action, it is critical to assess whether a causal interpretation of that correlation is credible. To do so, ask: Are there unaccounted-for confounders? Is there reverse causality? The first line of defense against mistaking correlation for causation is to control for confounders; the problem is that many potential confounders are unobservable.
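
As a rough illustration of what controlling for a confounder buys you, consider this Python sketch on made-up data (the police/crime framing is purely hypothetical): a confounder drives both variables, producing a sizable raw correlation that disappears once the confounder is regressed out.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 5_000

    # Made-up data: a confounder (say, city size) drives both variables.
    confounder = rng.normal(0, 1, size=n)
    police_per_capita = 0.8 * confounder + rng.normal(0, 1, size=n)
    crime_rate = 0.8 * confounder + rng.normal(0, 1, size=n)

    # The raw correlation makes it look as if police presence tracks crime.
    raw_r = np.corrcoef(police_per_capita, crime_rate)[0, 1]

    # "Control" for the confounder by regressing it out of both variables
    # and correlating the residuals (a partial correlation).
    def residualize(y, x):
        slope, intercept = np.polyfit(x, y, deg=1)
        return y - (slope * x + intercept)

    adj_r = np.corrcoef(residualize(police_per_capita, confounder),
                        residualize(crime_rate, confounder))[0, 1]

    print(f"raw correlation:                  {raw_r:.2f}")   # about 0.4
    print(f"after controlling for confounder: {adj_r:.2f}")   # about 0.0

The catch flagged in the takeaway still applies: this only works when the confounder is measured, and many are not.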

Agenda

  • When correlation isn’t causation: the problem of confounding variables
  • The experimental ideal
  • Making valid comparisons, drawing valid conclusions
  • Confounding variables and selection effects
  • Break
  • Now what? Questions you should ask when someone purports to give you evidence of a causal relationship
  • Thinking through confounding variables
  • Controlling for confounders

Lunch, 12:30 – 1:30pm

Afternoon, 1:30 – 5pm: Avoiding Pitfalls

Key Takeaway

The world can create misleading or false patterns. Some of the standard practices of science exacerbate such problems. It is important to remain skeptical, especially about surprising findings. Before reaching an evidence-based conclusion, be sure to turn statistics into substance and ask whether you’ve measured the right outcome.
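
The over-comparing problem in the agenda below can be previewed with a short Python simulation on made-up data (an illustrative sketch, not course material): test enough interventions that truly do nothing, and chance alone will hand you a handful of "significant" findings.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    # Made-up study: 100 "interventions" with exactly zero true effect.
    n_tests, n_per_group = 100, 50
    false_positives = 0
    for _ in range(n_tests):
        control = rng.normal(0, 1, size=n_per_group)
        treated = rng.normal(0, 1, size=n_per_group)  # same distribution, no real effect
        _, p_value = stats.ttest_ind(treated, control)
        if p_value < 0.05:
            false_positives += 1

    # Around 5 of the 100 null effects clear p < 0.05 by chance alone;
    # report only those winners and the "findings" look striking, yet they are false.
    print(f"'significant' results out of {n_tests}: {false_positives}")

Publication bias compounds the problem: the null results rarely get written up, so the chance winners are what reach readers.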

Agenda

  • Over-Comparing, Under-Reporting
  • Mean Reversion
  • Publication bias: Why most scientific findings are false
  • Break
  • Turning Statistics into Substance
  • Measuring the mission
  • Pulling it all together