The problem isn't the data. The problem is the language.
When sales uses "revenue" to mean the value of contracts signed, and finance uses "revenue" to mean the value recognized under accounting standards, and operations uses "revenue" to mean the value of work delivered, they are not disagreeing about numbers. They are using the same word to mean three different things. No amount of data engineering resolves a semantic ambiguity. No dashboard redesign fixes a vocabulary problem. The resolution requires something older and more fundamental than any technology: an agreement about what the words mean.
This is the insight that most conversations about metrics miss, because it's more comfortable to talk about dashboards and BI platforms than to talk about the harder organizational work of deciding what terms mean and enforcing those decisions. Most metric governance failures — the meetings consumed by reconciliation, the reports that don't match, the decisions deferred because nobody can agree on the number — are not technical failures. They are language failures. The same word, applied with different definitions by different people in different systems, producing different numbers that all appear to describe the same thing.
The Metrics Contract is the written resolution. An agreement about what each number means, where it comes from, who maintains its integrity, and what the organization does when it crosses a defined threshold. Organizations that have it spend their meetings making decisions. Organizations that don't spend their meetings debating which number to believe.
Why the Language Breaks
The language doesn't break all at once. It erodes.
A metric is defined at the beginning of the year. Two months later, someone makes a judgment call about an edge case — a partial refund, a multi-year contract, a discount applied after invoicing — and doesn't document it. Three months after that, someone else makes a different judgment call about the same edge case. By quarter three, the metric is being computed four slightly different ways across four contexts, and the number has become a reflection of who is doing the calculation rather than the underlying reality.
This is definition drift, and it's the default condition in any organization where metrics aren't governed through a written standard. The drift is not visible on its own. It becomes visible in the meeting where two people present different versions of the same metric, or where the board asks a question that takes three days to answer because the relevant number isn't computed consistently across the systems that contain it.
The controlled vocabulary — a shared glossary of terms the organization has agreed on, with explicit rules governing what counts and what doesn't — is the foundation of the Metrics Contract. Each term has a definition, an owner, and a governance process for changing the definition. The vocabulary is enforced, not suggested. Changes require a defined process, not a judgment call by whoever is computing the metric — because definitions that change without governance are more dangerous than no definitions at all.
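A controlled vocabulary entry can be sketched as a small data structure. This is an illustrative sketch, not a prescribed implementation; the example term, definition, and owner title are hypothetical, and the frozen dataclass stands in for the rule that definitions change only through governance, not in place.

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: an entry is never edited, only re-versioned
class GlossaryEntry:
    term: str
    definition: str  # the agreed meaning, including edge-case rules
    owner: str       # the one person accountable for this definition
    version: int     # bumped only through the defined change process

# Hypothetical entry resolving the three meanings of "revenue" above.
vocabulary = {
    "revenue": GlossaryEntry(
        term="revenue",
        definition=("Value recognized under the accounting standard. "
                    "Signed-but-unrecognized contracts are 'bookings'; "
                    "delivered-but-unbilled work is 'work in progress'."),
        owner="Head of Finance",
        version=3,
    ),
}
```

The point of the structure is not the code but the constraint it encodes: anyone computing a number from the term must resolve it through the one shared entry, and changing the entry leaves a versioned trail.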
What a Metric Actually Is
Before a metric can be governed, it has to be understood for what it is.
A measurement is any number that can be recorded. Revenue, headcount, tickets submitted: all measurements. Measurements are raw materials — necessary but not sufficient for decision-making.
An indicator is a measurement interpreted over time — one that, tracked consistently, shows trajectory rather than just position. Revenue this month compared to last month is an indicator. Gross margin over twelve quarters is an indicator.
A key performance indicator — a KPI — is something more specific. It's an indicator tied to a strategic objective, with a target, with defined thresholds that trigger different responses, and with a clear link to a decision or action. The "key" is not a casual adjective. It means this indicator is consequential — that crossing its threshold changes something: a resource allocation, a process intervention, an escalation, a hiring decision.
The distinction matters because most companies track far more measurements than indicators, and far more indicators than genuine KPIs. The result is dashboards that are comprehensively populated and operationally inert. Everything is tracked. Nothing changes based on what the tracking reveals.
The Metrics Contract begins with a harder question than "what should we track?" It begins with: what decisions do we need to make consistently, and what would we need to know to make those decisions well? That question produces a far shorter list than the typical dashboard, and a far more actionable one.
The Five Properties
A KPI without governance is a number with aspirations. The governance is what converts it into a decision tool.
An owner. One person accountable for the definition's integrity and the data's accuracy. Not the analytics team. Not "everyone." One person. When the number is wrong or ambiguous, there is a specific human responsible for resolving it. Without ownership, accountability diffuses and definitions drift.
A precise definition. Not a name — a formula. Including explicit rules about what's included and excluded, with specific handling of edge cases. Gross margin with a stated cost-of-delivery policy. Revenue with a recognition standard for each offer type. Customer count with explicit rules about what constitutes active versus inactive. The precision makes the number reproducible across people and systems.
A named source of truth. The specific system, table, or report that is the authoritative origin of the data. "Accounting" is not a source of truth. "The revenue recognition schedule in the ERP, reconciled monthly to the signed contract schedule in the CRM" is. The precision makes lineage traceable — when the number is questioned, the investigation has a starting point.
A freshness specification. How often the metric updates, and what "recent enough to act on" means. A cash position three days old isn't useful for daily decisions. A gross margin that hasn't been reconciled since last quarter isn't reliable enough for a pricing conversation. The freshness requirement depends on the decision the metric serves.
A decision link. A statement of what changes when this metric crosses a defined threshold. "We discuss it" is not a decision link. "If gross margin falls below X% for two consecutive months, the pricing committee reviews every active offer within 30 days" is. A metric without a decision link is a measurement tracked out of habit rather than a KPI serving a governance function.
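The five properties together form a contract that can be written down mechanically. The sketch below is one possible shape, under assumed names and values: the KPI, the 40% threshold, the owner title, and the source description are all hypothetical illustrations, not prescriptions. What matters is that every field is mandatory and the decision link is checkable.

```python
from dataclasses import dataclass
from datetime import timedelta

@dataclass
class KPIContract:
    name: str
    owner: str                 # one accountable person, never a team
    definition: str            # the formula, with edge cases spelled out
    source_of_truth: str       # the specific system/table, not a department
    max_staleness: timedelta   # how fresh the number must be to act on
    threshold: float           # the line that triggers the decision link
    consecutive_periods: int   # periods below threshold before acting
    action: str                # what happens when the threshold is crossed

def decision_triggered(contract: KPIContract, history: list[float]) -> bool:
    """True when the last `consecutive_periods` values all fall below threshold."""
    window = history[-contract.consecutive_periods:]
    return (len(window) == contract.consecutive_periods
            and all(v < contract.threshold for v in window))

# Hypothetical contract mirroring the gross-margin example in the text.
gross_margin = KPIContract(
    name="gross_margin_pct",
    owner="CFO",
    definition="(revenue - cost_of_delivery) / revenue, recognized basis",
    source_of_truth="ERP revenue recognition schedule, reconciled monthly to CRM",
    max_staleness=timedelta(days=7),
    threshold=40.0,
    consecutive_periods=2,
    action="Pricing committee reviews every active offer within 30 days",
)

# Two consecutive months below 40%: the contract's action fires.
print(decision_triggered(gross_margin, [42.1, 39.8, 38.5]))  # True
```

Note that `decision_triggered` encodes "two consecutive months," not "we discuss it": the contract either fires or it doesn't, which is exactly the property that separates a KPI from a measurement.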
Why Metrics Break Quietly
The failure mode of metric governance is not dramatic collapse. Metrics erode quietly, through three specific mechanisms.
The first is the definition drift described above — the slow divergence of how a metric is computed across people, systems, and time. Each individual variation seems reasonable. The cumulative effect is a number that has ceased to measure what anyone thinks it measures.
The second is Goodhart's Law, named for the economist Charles Goodhart: when a measure becomes a target, it ceases to be a good measure. Organizations learn to optimize the variable being measured in ways that satisfy the metric while degrading the outcome it was supposed to represent. "Calls made" goes up; call quality goes down. "Tickets closed" increases; customer satisfaction erodes. "Projects shipped" accelerates; rework climbs.
The antidote is pairing metrics with counter-metrics: measuring the output alongside the quality of the output. Collections velocity paired with dispute rate. Projects completed paired with defect rate. The pairing makes gaming expensive enough to be visible.
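The pairing can be made explicit rather than left to judgment. A minimal sketch, with hypothetical metric names taken from the examples above; the sign convention (positive delta means "better" for both metrics) is an assumption of this sketch.

```python
# Each primary metric is paired with the counter-metric measuring the
# quality its optimization is most likely to sacrifice.
counter_metric_pairs = {
    "collections_velocity": "dispute_rate_inverted",
    "projects_completed": "defect_rate_inverted",
    "tickets_closed": "customer_satisfaction",
}

def gaming_suspect(primary_delta: float, counter_delta: float) -> bool:
    """Flag when the primary improves while its counter-metric degrades.

    Deltas are period-over-period changes, signed so that a positive
    value means 'better' for both the primary and the counter-metric.
    """
    return primary_delta > 0 and counter_delta < 0

# Tickets closed up 15%, customer satisfaction down 8%: the pairing
# surfaces what the primary metric alone would hide.
print(gaming_suspect(0.15, -0.08))  # True
```

A flag from `gaming_suspect` is not proof of gaming; it is a trigger for the conversation that a single unpaired metric would never prompt.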
The third is altitude confusion. Strategic metrics — the handful of numbers that tell you whether the business is winning — get mixed into dashboards alongside operational metrics and process health indicators. The signal is present in the data. The altitude confusion makes it indistinguishable from noise.
The organizational fix is a metric hierarchy: strategic outcomes at one level (reviewed quarterly), leading drivers at a second (reviewed weekly), process health at a third (monitored continuously, escalated by exception), and data integrity checks running in the background. Each level has its audience and cadence. The executive cockpit shows the top two levels. Drill-down exists for the rest.
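The four levels and their cadences can be written down as configuration. A minimal sketch, assuming the levels and audiences described above; the audience labels are illustrative.

```python
# (level, review cadence, audience) for each tier of the hierarchy.
metric_hierarchy = [
    ("strategic outcomes", "quarterly",  "executive cockpit"),
    ("leading drivers",    "weekly",     "executive cockpit"),
    ("process health",     "continuous", "team dashboards; escalate by exception"),
    ("data integrity",     "background", "automated checks"),
]

# The executive cockpit shows only the top two levels; the rest is drill-down.
cockpit_levels = [level for level, cadence, audience in metric_hierarchy
                  if audience == "executive cockpit"]
print(cockpit_levels)  # ['strategic outcomes', 'leading drivers']
```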
What the Contract Produces
An organization with a functioning Metrics Contract — defined KPIs with owners, precise definitions, named sources, freshness standards, and decision links — doesn't just have better data. It has a different kind of meeting.
The difference is detectable in the first fifteen minutes. Without the contract, those minutes are spent establishing which version of the numbers everyone is looking at, explaining differences, debating which is correct, and occasionally arriving at an informal consensus that will need to be re-established next time. The actual agenda starts late, if it starts at all.
With the contract, those fifteen minutes don't exist. The numbers are defined. The definitions are shared. The meeting begins with the question the metrics are there to answer: what are they telling us, and what does that imply for what we do next?
That shift — from re-establishing shared reality at the beginning of every conversation to starting from it — changes the ratio of organizational energy spent on coordination versus judgment. It returns to leadership the cognitive capacity that was being consumed by definitional friction.
The companies that have done this work don't experience it as bureaucratic. They experience it as relief. The definition meetings that consumed hours every month become unnecessary because the terms are settled. The number debates that derailed leadership reviews disappear because there's one definition, enforced.
The Metrics Contract is not a data project. It requires a decision: that the organization will agree on what its numbers mean, write that agreement down, assign someone to maintain it, and hold the agreement through the organizational changes that will otherwise erode it.
That decision takes an afternoon to make and a quarter to implement. It lasts as long as the governance holds.
The Metrics Contract explores the governance layer that converts measurements into decision tools. Related: The Dashboard That Lies, The End of One True Number, The Bottleneck Is Always Integrity.