Goodhart’s law: When a measure becomes a target, it stops being a good measure.
Charles Goodhart is an economist who recognized that once a central bank sets a specific monetary target, the historical relationship between that target and the outcome the bank wants breaks down.
Those subject to new policies and regulations will react in different, and often unexpected, ways. Goodhart's law also takes cognisance of the fact that, having set a new policy target, the authority involved has some reputational credibility attached to successfully meeting that target, and thus may adjust its own behaviour and procedures to that end.
One reason this happens in other fields: once a goal is set, people will optimize for that goal in ways that neglect equally important parts of a system. Task your company with hitting a big sales target and customer service may wither as the goal cannibalizes employees' attention. Or they'll game the system, meeting the goal in a way that distorts the benefit of achieving it. Investors set quarterly earnings goals for a CEO to meet, with a huge incentive for exceeding them.
Then stuff like this happens:
[General Electric] for two years in a row “sold” locomotives to unnamed financial partners instead of end users in transactions that left most of the risks of ownership with GE.
The sales in 2003 and 2004 padded revenue by $381 million … critical to meeting GE’s end-of-year numbers.
This is a cousin of observer effects in physics: it's hard to know how some things operate in the real world because the act of measuring them changes them.