Two rules of thumb, seemingly in tension:
- What's measured gets optimized
- When a measure becomes a target, it ceases to be a good measure. (Goodhart's Law)
Goodhart's Law applies to feedback systems. It does not apply to systems without feedback.
Any system that includes people is a feedback system, because people are smart — they observe outcomes and adapt in response to system pressures. The measure ceases to be a good measure when participants become aware of it and optimize for the measure itself rather than for the outcome the measure was intended to proxy.
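This decoupling can be made concrete with a toy simulation (my own sketch, not from the original): while the score is a passive observation it tracks true quality, but once participants can spend effort on the score directly, the correlation between score and quality collapses.

```python
import random

random.seed(0)

def correlation(xs, ys):
    # Pearson correlation, computed by hand to stay dependency-free.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Phase 1: the measure is a passive observation.
# Each participant's score is their true output plus noise.
quality = [random.gauss(50, 10) for _ in range(1000)]
score = [q + random.gauss(0, 5) for q in quality]
print(correlation(quality, score))  # high: the measure tracks the outcome

# Phase 2: the measure becomes a target.
# Effort spent gaming the score raises the score directly
# but contributes nothing to true quality.
quality2 = [random.gauss(50, 10) for _ in range(1000)]
gaming = [random.gauss(50, 10) for _ in range(1000)]
score2 = [g + random.gauss(0, 5) for g in gaming]
print(correlation(quality2, score2))  # near zero: the measure has decoupled
```

The phase-2 model is deliberately extreme (all effort goes to gaming); any mix of real work and gaming lands somewhere between the two correlations.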
Just because you can measure it doesn't make it important. We tend to measure what is easy to measure.
We are trained to push back on assertions that lack evidence, but we rarely question whether the available evidence is telling us the full story. If you look at the wrong thing, you'll arrive at the wrong conclusion.
The data that is uncollected is infinite. What makes you think the data you do have is the right data? (Asymcar #36)
There is always bias in the data, because there is always someone who collected the data and an instrument that made the collection possible.
There is a hidden benefit to not having this data. All data is a creation, and it tends to lead thinking in directions shaped by whatever is being measured (and by whoever chose those measures, and their motives). And yet without data there is no evidence and no credibility. In other words: you can't manage without measurement, but you can't be sure what to measure.
Data can only measure the past. If the future looks like the past, then the past can tell you the future. If the future doesn't look like the past... (see Scenario Planning)
The goal of creating a new product is to make the future look different from the past.
The causal mechanism of being disrupted is evaluating opportunities only in terms of the numbers, instead of asking what causes the numbers.
We should be guided by theory, not by numbers.