There's a specific failure mode in KPI-setting that looks like rigor and isn't. It goes like this: a working group spends two weeks developing a "comprehensive KPI framework." Fifty-three metrics. Color-coded by department. Reviewed monthly.
Six months later, nobody knows which five metrics the CEO actually watches. The fifty-three are still being reported. They're just not being used to decide anything.
The problem isn't too few KPIs. It's that the KPIs aren't connected to a strategy decision. They're connected to a measurement habit.
The question that cuts through
For every KPI you're considering, ask: "If this number changes by 20%, what decision changes?"
If the answer is "we'd investigate," the metric is diagnostic. It tells you something is happening. It doesn't tell you what to do about it. Diagnostic metrics are useful — but they shouldn't be on your strategic dashboard. They belong in an operational layer, available when something triggers them.
If the answer is "we'd reallocate resources from X to Y" or "we'd accelerate this initiative and pause that one" — that's a strategic metric. It drives decisions. That's the short list you're building.
The test for strategic relevance
A KPI earns a place on the strategic dashboard if it passes three tests:
1. It's directly connected to a named strategic priority. Not adjacent to one. Not a downstream indicator of one. Directly connected. "Net revenue retention" for the retention bet. "Time-to-first-value" for the product-led growth bet. If you can't name the bet it belongs to, it doesn't belong on the strategic list.
2. It has one named owner. Not a committee. Not "the revenue team." A person whose job it is to move that number this quarter. Metrics without owners are thermometers without doctors.
3. There's structured work currently moving it. If there's no active work item explicitly linked to this metric, the metric is aspirational. Aspirational metrics generate beautiful charts. They don't generate results.
Why most organizations have too many KPIs
The standard KPI-setting process conflates measurement with accountability. The finance team measures revenue. The marketing team measures pipeline. The product team measures activation. The customer success team measures NPS. Each team has its own dashboard. The executive review presents all of them.
Nobody asks: of these fourteen metrics, which three are the ones we're actually betting the strategy on?
The proliferation isn't malicious. It's the result of adding without removing. Every quarter someone proposes a new metric. Nobody retires the ones that have stopped being strategic. The dashboard grows. The signal degrades.
Forty metrics don't give you forty times the visibility. They give you forty times the noise.
The three-layer model
A clean KPI architecture has three layers:
Layer 1: Strategic KPIs (3–5 max). The metrics that directly measure whether the strategy is working. These go to the board. They have single owners. There is active, traceable work behind each of them. They don't change mid-year unless the strategy changes.
Layer 2: Leading indicators (5–10). The metrics that predict where the strategic KPIs are going. If the strategic KPI is net revenue retention, the leading indicator might be time-to-first-value, support ticket volume per customer, or feature adoption rate among expanding accounts. These are reviewed weekly by the teams responsible for them.
Layer 3: Diagnostic metrics. Everything else — available, tracked, but not on the agenda unless a leading indicator triggers concern. Not reported. Investigated.
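The three layers can be modeled as a review-agenda rule: strategic and leading metrics are always on the agenda, diagnostics only when something triggers them. A sketch under those assumptions (the metric names and the `review_agenda` helper are hypothetical):

```python
from enum import Enum

class Layer(Enum):
    STRATEGIC = 1   # 3-5 max, board-level, single owners
    LEADING = 2     # 5-10, predict the strategic KPIs, reviewed weekly
    DIAGNOSTIC = 3  # everything else, investigated only on a trigger

architecture = {
    "net_revenue_retention": Layer.STRATEGIC,
    "time_to_first_value": Layer.LEADING,
    "support_tickets_per_customer": Layer.LEADING,
    "page_load_time": Layer.DIAGNOSTIC,
}

def review_agenda(metrics: dict[str, Layer], triggered: set[str]) -> list[str]:
    """Diagnostics stay off the agenda unless a leading indicator flagged them."""
    return [name for name, layer in metrics.items()
            if layer != Layer.DIAGNOSTIC or name in triggered]
```

In the quiet case, `review_agenda(architecture, set())` omits `page_load_time`; once a leading indicator raises concern, `review_agenda(architecture, {"page_load_time"})` pulls it onto the agenda. The dashboard shrinks by default and expands only on evidence.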
The retirement conversation nobody has
Every quarter, before you add a metric, hold a ten-minute "retirement review." Which KPIs on the current list no longer pass the three tests? Which diagnostic metrics got promoted to strategic and were never demoted? Which ones have been amber for three quarters with no structural response?
Retire them explicitly. Name what they were tracking and why it no longer warrants a place on the strategic list. Retiring a metric is as much an act of strategy as adding one.
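The retirement review itself is mechanical once the criteria are written down. A minimal sketch, assuming each metric carries an illustrative status record with a pass/fail on the three tests and a count of consecutive amber quarters (both fields are hypothetical):

```python
def retirement_review(current_list: dict[str, dict]) -> list[str]:
    """Return metrics to retire: those that fail the three tests,
    or have been amber for three or more quarters with no response."""
    retired = []
    for name, status in current_list.items():
        if not status["passes_tests"] or status["quarters_amber"] >= 3:
            retired.append(name)
    return retired

retired = retirement_review({
    "net_revenue_retention": {"passes_tests": True, "quarters_amber": 0},
    "nps": {"passes_tests": False, "quarters_amber": 4},
})
# retired == ["nps"]
```

The ten minutes aren't spent computing this list; they're spent saying out loud why each retired metric no longer earns its place.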
The Vindaris position
KPIs don't align with strategy by being visible. They align by being connected — to the work that's supposed to move them, the owner who's accountable for moving them, and the capacity that's been allocated to do so. A metric without traceable work beneath it isn't a strategic KPI. It's a hope with a number on it.