Goodhart's law
Goodhart's law is an adage named after economist Charles Goodhart, which has been phrased by Marilyn Strathern as "When a measure becomes a target, it ceases to be a good measure."[1] One way in which this can occur is when individuals try to anticipate the effect of a policy and then take actions that alter its outcome.[2]
Formulation
Goodhart first advanced the idea in a 1975 article, which was later used to criticize the United Kingdom government of Margaret Thatcher for trying to conduct monetary policy on the basis of targets for broad and narrow money. His original formulation was:[3]
Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.
However, parts of the concept considerably pre-date Goodhart's 1975 statement.[4] Shortly after Goodhart's publication, others suggested closely related ideas, including Campbell's law (1976) and the Lucas critique (1976).
As applied in economics, the law is implicit in the idea of rational expectations, a theory which holds that entities who are aware of a system of rewards and punishments will optimize their actions within that system to achieve their desired results. For example, employees whose performance in a company is measured by some known quantitative measure (such as cars sold in a month) will attempt to optimize with respect to that measure regardless of whether or not their behavior is profit-maximizing.

While it originated in the context of market responses, the law has profound implications for the selection of high-level targets in organizations.[5] Jón Daníelsson quotes the law as "Any statistical relationship will break down when used for policy purposes" and suggests a corollary for use in financial risk modelling: "A risk model breaks down when used for regulatory purposes."[6] Mario Biagioli has related the concept to the consequences of using citation impact measures to estimate the importance of scientific publications:[7]
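The employee example can be made concrete with a small, purely illustrative simulation. In the sketch below, the scenario (a salesperson choosing a discount level) and all demand and margin figures are invented assumptions; the point is only that the choice maximizing the measured quantity (cars sold) differs from the choice maximizing the underlying goal (profit).

```python
# A minimal, hypothetical sketch of Goodhart's law as proxy-metric gaming:
# an employee rewarded on a single measured quantity (cars sold per month)
# picks the discount that maximizes that measure, which is not the discount
# that maximizes profit. All numbers below are invented for illustration.

def outcome(discount):
    """Return (cars_sold, profit) for a fractional discount between 0.0 and 1.0."""
    cars_sold = 10 + 15 * discount                  # deeper discounts sell more cars
    profit = cars_sold * (2000 - 2000 * discount)   # but erode the margin per car
    return cars_sold, profit

discounts = [d / 10 for d in range(11)]
best_for_measure = max(discounts, key=lambda d: outcome(d)[0])  # optimize the metric
best_for_profit = max(discounts, key=lambda d: outcome(d)[1])   # optimize the true goal

print("measure-optimal discount:", best_for_measure, outcome(best_for_measure))
print("profit-optimal discount: ", best_for_profit, outcome(best_for_profit))
```

Under these assumed numbers, the measure-optimal choice (the maximum discount) sells the most cars but drives profit to zero, while the profit-optimal choice is a modest discount; once the measured quantity becomes the target, it stops tracking the goal it was meant to indicate.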
All metrics of scientific evaluation are bound to be abused. Goodhart's law (named after the British economist who may have been the first to announce it) states that when a feature of the economy is picked as an indicator of the economy, then it inexorably ceases to function as that indicator because people start to game it.
The law is richly illustrated in the 2018 book The Tyranny of Metrics by Jerry Z. Muller.[8] An enunciation of the law preceding both Goodhart's and Campbell's works is due to Jerome R. Ravetz. In his 1971 Scientific Knowledge and Its Social Problems,[9] pp. 295–296, Ravetz discusses how measurement systems can be gamed. For Ravetz, when the goals of a task are complex, sophisticated, or subtle, crude systems of measurement can be gamed precisely by those who possess the skills to execute the tasks properly, and who thus achieve their own goals to the detriment of those assigned.
See also
Campbell's law
Reflexivity (social theory)
Reification (fallacy)
Overfitting
Cobra effect
McNamara fallacy