Goodhart's law

Goodhart's Law:

When a measure becomes a target, it ceases to be a good measure.

Goodhart's Law applies to feedback systems with humans in the loop. It does not apply to feed-forward systems.

Any system that includes people is a kind of feedback system, because people are smart. We are agents. We observe outcomes and change our behavior in response to system pressures. Evolution! The measure ceases to be a good measure once participants become aware of it and optimize for the measure itself, rather than for the outcome the measure was intended to proxy.
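The dynamic can be sketched in a toy model. This is a hypothetical illustration, not anything from the original: assume an agent with a fixed effort budget, split between real work and gaming the metric, where gaming is (by assumption) cheaper per proxy point than real work.

```python
# Hypothetical sketch of Goodhart's Law: an agent splits a fixed
# effort budget between real work and gaming the measure. Weights
# below are assumptions chosen to make gaming the cheaper option.
BUDGET = 10.0
WORK_WEIGHT = 1.0   # proxy points per unit of real work
GAME_WEIGHT = 3.0   # proxy points per unit of gaming (assumed cheaper)

def proxy(work, gaming):
    """The measure: rewards both real work and gaming."""
    return WORK_WEIGHT * work + GAME_WEIGHT * gaming

def outcome(work, gaming):
    """The true outcome: only real work moves it."""
    return work

# Before the measure becomes a target: all effort goes to real work.
before_work, before_gaming = BUDGET, 0.0

# After: the agent maximizes the proxy over the same budget,
# so effort shifts entirely to the cheaper gameable component.
candidates = [(w, BUDGET - w) for w in (0.0, 2.5, 5.0, 7.5, 10.0)]
after_work, after_gaming = max(candidates, key=lambda c: proxy(*c))

# The proxy score rises (10.0 -> 30.0) while the true outcome
# collapses (10.0 -> 0.0): the measure has stopped measuring.
```

Once optimization pressure is applied, the measured score and the intended outcome decouple: the proxy triples while the real outcome goes to zero.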

Measuring something? Be careful how you measure it.