Analytics for Decisions

A practitioner-focused workshop on using analytics to drive impact rather than just generate charts. We cover KPI design linked to strategy, how to read and interpret experiments correctly, and the pitfalls of “dashboard theater” where visuals replace evidence. You’ll work through case studies on product growth, program evaluation, and operations—leaving with frameworks to design metrics that matter and to communicate findings persuasively to decision-makers.

KPI Design · Experimentation · Causal Thinking · Decision Science · Communication

Saturdays · Hybrid (in-person + online)

Workshop: KPIs, experiments, and decision frameworks

Who it’s for

  • Product, ops, and program leads who make or shape decisions.
  • Analysts who need to connect metrics to strategy rather than just ship dashboards.
  • Teams aiming to replace opinion fights with testable hypotheses.

Outcomes

  • Strategy-linked KPIs with targets, owners, and review cadence.
  • Experiment designs you can defend: power, segmentation, guardrails.
  • Anti-theater checklists: how to spot a misleading chart and fix it.
  • Clear decision memos and readouts that actually trigger actions.

Learning Path

Module 1

KPI Design

Link metrics to strategy

  • Input vs. output metrics; leading vs. lagging.
  • North Star, counter-metrics, and metric trees (sketched in code after this list).
  • Targets & review cadence; ownership and alerting.
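
As a rough illustration of how a Module 1 metric tree might live outside the slides, here is a minimal Python sketch. Every metric name, owner, and target below is invented for the example; it is not a prescribed schema.

    # Illustrative only: a North Star linked to input metrics and counter-metrics,
    # each with an owner, target, and review cadence. All names are hypothetical.
    metric_tree = {
        "north_star": {"name": "weekly_active_teams", "owner": "growth_lead",
                       "target": 12_000, "review": "weekly"},
        "inputs": [   # leading metrics expected to drive the North Star
            {"name": "signup_to_activation_rate", "owner": "onboarding_pm",
             "target": 0.35, "review": "weekly"},
            {"name": "invites_sent_per_team", "owner": "growth_pm",
             "target": 4.0, "review": "monthly"},
        ],
        "counter_metrics": [  # guardrails that must not regress
            {"name": "support_tickets_per_team", "owner": "support_lead",
             "direction": "down", "review": "weekly"},
        ],
    }

    def unowned_metrics(tree):
        """List metrics missing an owner -- a common governance gap."""
        nodes = [tree["north_star"], *tree["inputs"], *tree["counter_metrics"]]
        return [m["name"] for m in nodes if not m.get("owner")]

    print(unowned_metrics(metric_tree))   # [] when every metric has an owner
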
Module 2

Experiments & Causality

Evidence over aesthetics

  • A/B tests, CUPED, sequential tests; common pitfalls.
  • When RCTs are impractical: difference-in-differences, instrumental variables, synthetic controls.
  • Power, minimum detectable effect (MDE), and guardrail KPIs (latency, error rate, churn); a sample-size sketch follows this list.
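
To make the power and MDE bullet concrete, here is a back-of-envelope per-arm sample-size sketch for a two-proportion A/B test, assuming a two-sided test and a normal approximation; the baseline rate and MDE in the example are illustrative.

    # Approximate sample size per arm for detecting an absolute lift (MDE)
    # over a baseline conversion rate, at the given alpha and power.
    from statistics import NormalDist

    def sample_size_per_arm(baseline_rate, mde_abs, alpha=0.05, power=0.80):
        """Rough n per arm to detect an absolute lift of mde_abs (normal approx.)."""
        z = NormalDist()
        z_alpha = z.inv_cdf(1 - alpha / 2)
        z_beta = z.inv_cdf(power)
        p1, p2 = baseline_rate, baseline_rate + mde_abs
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        return (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2

    # e.g. 5% baseline conversion and a 1-point absolute MDE -> roughly 8,000 per arm
    print(round(sample_size_per_arm(0.05, 0.01)))
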
Module 3

Anti-Patterns

Avoid dashboard theater

  • Cherry-picking, Simpson’s paradox, survivorship bias (toy reversal sketched below).
  • Warning signs of over-segmentation and p-hacking.
  • Design fixes: pre-analysis plans, blind reviews, preregistration.
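
Simpson’s paradox is easy to reproduce with toy numbers. The counts below are fabricated so that variant A wins inside every segment while B looks better in the aggregate, purely because of how traffic is split.

    # Fabricated counts: A beats B in each segment, but B "wins" overall because
    # B's traffic skews toward the high-converting returning-user segment.
    segments = {
        # segment: (A conversions, A users, B conversions, B users)
        "new_users":       (50, 1000,   4,  100),
        "returning_users": (30,  100, 250, 1000),
    }

    def rate(conversions, users):
        return conversions / users

    for name, (ca, na, cb, nb) in segments.items():
        print(f"{name}: A={rate(ca, na):.2f}  B={rate(cb, nb):.2f}")

    a_total = rate(sum(v[0] for v in segments.values()), sum(v[1] for v in segments.values()))
    b_total = rate(sum(v[2] for v in segments.values()), sum(v[3] for v in segments.values()))
    print(f"overall: A={a_total:.2f}  B={b_total:.2f}")   # the reversal appears here
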
Module 4

Readouts & Decision Memos

Communicate to act

  • Decision framing: context, options, trade-offs, recommendation.
  • Confidence bands and expected value, not single-point estimates (scenario sketch below).
  • Owner, timeline, and explicit kill-switch criteria.
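
The expected-value math in a decision memo can be as small as a probability-weighted scenario table with a sensitivity band. The probabilities and dollar figures below are invented for illustration.

    # A minimal expected-value sketch: weigh scenario outcomes and report a band,
    # not a single point. All figures are made up.
    scenarios = [
        # (probability, incremental annual value in $k)
        (0.25, -150),   # downside: pricing change drives churn
        (0.55,  400),   # base case
        (0.20,  900),   # upside: strong expansion revenue
    ]

    expected_value = sum(p * v for p, v in scenarios)
    low = min(v for _, v in scenarios)
    high = max(v for _, v in scenarios)
    print(f"EV ~ {expected_value:.0f}k; sensitivity band {low}k to {high}k")
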
Module 5

Operational Analytics

From slides to systems

  • Monitoring vs. evaluation; thresholding and SLAs.
  • Metric contracts in BI: definitions, lineage, change control (sketched below).
  • Governance: access tiers, audit logs, reproducibility.
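
One way to picture a metric contract is as a small versioned record with a change-control rule. The field names and the versioning rule below are assumptions for illustration, not a prescribed schema.

    # A sketch of a "metric contract" with definition, lineage, and a simple
    # change-control rule. Field names and the rule are illustrative assumptions.
    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class MetricContract:
        name: str
        definition_sql: str
        owner: str
        version: int
        upstream_tables: tuple = field(default_factory=tuple)  # lineage

    def is_breaking_change(old: MetricContract, new: MetricContract) -> bool:
        """Changing the definition or lineage is breaking; changing the owner is not."""
        return (old.definition_sql != new.definition_sql
                or old.upstream_tables != new.upstream_tables)

    v1 = MetricContract("weekly_active_teams",
                        "SELECT COUNT(DISTINCT team_id) FROM events",
                        "growth_lead", version=1, upstream_tables=("events",))
    v2 = MetricContract("weekly_active_teams",
                        "SELECT COUNT(DISTINCT team_id) FROM sessions",
                        "growth_lead", version=2, upstream_tables=("sessions",))
    print(is_breaking_change(v1, v2))   # True -> needs a version bump and review
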
Module 6

Case Clinics

Hands-on scenarios

  • Product growth: activation, retention, pricing impact.
  • Program evaluation: targeting, spillovers, cost-effectiveness.
  • Ops: backlog, throughput, staffing, and SLA design (queueing sketch below).
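
The SLA-design item rests on basic queueing arithmetic. The sketch below uses a simple M/M/1 approximation with made-up arrival and service rates; real staffing decisions usually need multi-server models.

    # Back-of-envelope SLA check with an M/M/1 queue approximation.
    def mm1_time_in_system(arrival_rate, service_rate):
        """Average time in system; requires utilization < 1 to be stable."""
        if arrival_rate >= service_rate:
            raise ValueError("Queue is unstable: add capacity before promising an SLA.")
        return 1 / (service_rate - arrival_rate)

    # 40 tickets/hour arriving, served at 50 tickets/hour -> 0.1 h (~6 min) on average,
    # so an SLA promising a 5-minute turnaround would already be at risk.
    print(mm1_time_in_system(40, 50))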

Case Studies

  • Growth: Metric tree from signups → activation → value; A/B test with guardrails to protect latency and error rates.
  • Program Evaluation: Difference-in-differences with staggered rollout; present uncertainty and expected ROI bands to funders (simplified sketch below).
  • Operations: SLA design using queue basics; dashboards that prioritize actions, not pretty charts.
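
For the program-evaluation case, a stripped-down two-group, two-period difference-in-differences looks like the sketch below, with invented outcome means; the staggered-rollout complications covered in the workshop need more careful estimators.

    # Simplified 2x2 difference-in-differences with invented outcome means.
    means = {
        # (group, period): average outcome
        ("treated", "pre"): 10.0, ("treated", "post"): 14.5,
        ("control", "pre"):  9.5, ("control", "post"): 11.0,
    }

    treated_change = means[("treated", "post")] - means[("treated", "pre")]   # 4.5
    control_change = means[("control", "post")] - means[("control", "pre")]   # 1.5
    did_estimate = treated_change - control_change
    print(f"DiD estimate of program effect: {did_estimate:+.1f}")             # +3.0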

North Star · Guardrails · ROI · SLA

Case study visuals: metric tree, experiment readout, SLA monitor

Practicals & Templates

Metric Tree Pack

  • North Star + counter-metrics template.
  • Definition sheet with owner & refresh cadence.
  • Lineage and change-log tabs.

Experiment Kit

  • Hypothesis & pre-analysis plan.
  • Power/MDE calculator + sample-size guardrails.
  • Readout template with decisions and next steps.

Decision Memo

  • Context, options, criteria, risks.
  • Expected value with sensitivity bands.
  • Owner, timeline, and kill-switch.

Format

  • Saturdays · Hybrid (in-person + online).
  • 4 sessions · 3 hours each · hands-on clinics.
  • Mix of lecture, labs, and case debates.

Requirements

  • Basic spreadsheet/BI familiarity.
  • Willingness to define and defend metrics.
  • Bring a real decision you’re facing (optional but encouraged).

Assessment & Badge

  • Metric tree + experiment plan + decision memo.
  • Peer review rubric: clarity, validity, actionability.
  • Completion badge with verifier link.

Make analytics decisive. Design metrics that matter, run trustworthy tests, and drive action.