Goodhart's Law is a principle in economics and social science that states: "When a measure becomes a target, it ceases to be a good measure." It captures the idea that once a particular metric is used as a basis for decision-making or as an optimization target, people and organizations will often find ways to manipulate or game the system to achieve the desired outcome, even if doing so compromises the integrity of the original measure.
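To make this dynamic concrete, here is a minimal, hypothetical Python sketch (all names and numbers are illustrative assumptions, not a real model): true quality depends on two kinds of effort with diminishing returns, but the proxy metric only observes one of them. Once the proxy becomes the target, effort shifts entirely toward the measured component, so the metric improves while true quality declines.

```python
import math

# Toy model (illustrative assumptions only): true quality depends,
# with diminishing returns, on two kinds of effort, but the proxy
# metric only observes the first one.
def true_quality(measured, unmeasured):
    return math.sqrt(measured) + math.sqrt(unmeasured)

def proxy_metric(measured, unmeasured):
    return measured  # the unmeasured component is invisible to the metric

TOTAL_EFFORT = 10.0

# Before the metric becomes a target: effort is split according to
# what actually matters, so the proxy tracks true quality reasonably well.
balanced = (TOTAL_EFFORT / 2, TOTAL_EFFORT / 2)

# After the metric becomes a target: all effort shifts to whatever the
# metric rewards, so the proxy rises while true quality falls.
gamed = (TOTAL_EFFORT, 0.0)

for label, (m, u) in [("balanced", balanced), ("gamed", gamed)]:
    print(f"{label:9s} proxy = {proxy_metric(m, u):5.2f}   "
          f"true quality = {true_quality(m, u):5.2f}")
```

Running this prints a proxy score that doubles (5.00 to 10.00) while true quality drops (about 4.47 to about 3.16), a toy illustration of how optimizing the measure can undermine what it was meant to track.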
The law is named after British economist Charles Goodhart, who first articulated it in the context of monetary policy. He observed that when policymakers target specific economic indicators to achieve their goals, those indicators lose their reliability as measures of economic stability or performance, because people adjust their behavior to influence the indicators artificially.