Commitment device
A commitment device is a mechanism or arrangement that implements precommitment by deliberately restricting one’s own future options in order to make a commitment credible. In game theory, commitment devices are strategic moves that constrain an agent’s choices to influence an adversary or negotiating partner, as when a general burns bridges to make retreat impossible or a state adopts a nuclear deterrence posture that makes retaliation automatic. In behavioral economics, they are voluntary self-imposed constraints—such as savings accounts with withdrawal penalties or contracts that impose costs for failure—designed to help individuals follow through on intentions they might otherwise abandon due to akrasia (weakness of will) or time-inconsistent preferences.
The concept emerged independently in the mid-1950s, when Thomas Schelling analyzed commitment as a bargaining tactic and Robert Strotz modeled how a rational agent might benefit from constraining future choices.
Experimental research on the topic exists, but has focused primarily on a specific type of device known as “commitment contracts”—formalized agreements in which a person accepts a penalty for failing to meet a self-chosen goal. These have produced meaningful increases in savings in developing-country field trials, but evidence for sustained effects on health behaviors such as exercise and smoking cessation is more limited.
Concept and mechanisms
A commitment device is a specific implementation of precommitment, in which a person or group arranges, in advance, for certain future options to be restricted, costly, or unavailable, in order to change current or future behavior. The concept primarily appears in two contexts: game theory, where the device is directed at influencing another party such as an adversary or negotiating partner; and behavioral economics, where it is directed at one's own future self to overcome weakness of will. The underlying logic is the same in both cases: future choice is constrained in order to make a current commitment credible. In the strategic case, a general who destroys a bridge forces both the enemy and his own troops to treat retreat as impossible; in the self-control case, a person who deposits money into an account with withdrawal penalties forces their future self to bear a cost for breaking the commitment.[1][2]
A common illustration of a commitment device is the story of Odysseus and the Sirens.[3][4] Knowing that he will be unable to resist the Sirens’ song once he hears it, Odysseus instructs his crew to bind him to the mast and refuse any order he gives to change course.
Types
Commitment devices vary along several dimensions. “Hard” commitments impose real economic penalties for deviation, such as forfeiting a deposit, while “soft” commitments carry primarily psychological or social consequences, such as the embarrassment of breaking a public pledge.[1] A related but distinct dimension separates “immutable” devices, whose consequences are irreversible (such as a monetary penalty automatically triggered by failure), from “mutable” ones, which allow for reversal.[2] A further dimension concerns the person rather than the device: “sophisticated” individuals are aware of their own future self-control problems and therefore willing to seek commitment, while “naïve” individuals do not anticipate future lapses and see no reason to constrain themselves.[5]
One common form is the “commitment contract”, in which a person voluntarily enters into a formalized agreement specifying a goal, a timeframe, and a consequence—typically a financial penalty—for failure. The contract may be made with a third party such as an employer, a bank, or an online platform, or structured as a deposit that the person forfeits if the goal is not met.[6]
Commitment contracts are the form of commitment device most frequently tested in field experiments.[1][2] They differ from devices that operate through physical constraint, institutional design (such as automatic enrollment in savings plans), or outright elimination of options. Because contracts depend on the person’s ongoing willingness to accept a penalty, they are more susceptible to the low take-up and frequent default that characterize much of the experimental literature.
Theoretical explanations
The behavioral economics literature grounds the need for commitment devices in the problem of time inconsistency: a person’s preferences at one point in time may conflict with their preferences at another. A commitment device works by making it difficult or costly to act on the reversed preference.[1]
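The standard formalization of time inconsistency is quasi-hyperbolic (“beta-delta”) discounting, as in the cited O'Donoghue and Rabin model.[5] The following sketch, with illustrative parameter values not taken from the article, shows how a present-bias parameter β < 1 produces a preference reversal and how a sufficiently large penalty on the tempting option restores the original plan:

```python
# Illustrative sketch (parameter values assumed, not from the article):
# quasi-hyperbolic "beta-delta" discounting and preference reversal.

BETA, DELTA = 0.7, 0.9  # present bias and per-period discount factor


def value(reward, delay):
    """Discounted value, today, of a reward arriving `delay` periods from now."""
    if delay == 0:
        return float(reward)  # immediate rewards escape the beta penalty
    return BETA * DELTA**delay * reward


small, large = 10, 15  # smaller-earlier vs. larger-later reward

# Viewed three periods in advance, the agent plans to wait for the larger reward:
assert value(large, 4) > value(small, 3)

# But once the small reward becomes immediate, the preference reverses:
assert value(small, 0) > value(large, 1)

# A commitment device attaching a penalty to the early option restores
# the original plan once the penalty exceeds the gap between the two values:
penalty = value(small, 0) - value(large, 1) + 0.01
assert value(small, 0) - penalty < value(large, 1)
```

The reversal arises because β discounts all delayed rewards by the same factor, so it cancels when both options lie in the future but tilts the comparison toward whichever option is immediate.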
The hot-cold empathy gap approach provides a complementary psychological account: people in calm, deliberative states (“cold” states) systematically underestimate how powerfully visceral drives and emotions (“hot” states) will influence their future behavior.[7]
In the strategic case, the mechanism is different: the commitment device works not by overcoming internal preference reversal but by altering another party’s expectations. Thomas Schelling, in his work on nuclear deterrence, argued that (paradoxically) visibly eliminating one’s own options can force an adversary to adjust, because the adversary can see that the commitment is irrevocable.[8]
Thaler and H. M. Shefrin’s “planner-doer” model bridges the two traditions by treating the individual as an internal organization, with a farsighted “planner” who constrains a myopic “doer”—applying the logic of principal–agent theory to a single person.[9]
Effectiveness and challenges
The experimental evidence on commitment devices is relatively recent and concentrated: as of 2022, thirty-three empirical studies of commitment contract take-up had been published, with all but two appearing in the preceding decade.[6] Most have studied commitment contracts specifically, and findings vary considerably across domains.
Savings
Field experiments have found that commitment contracts can meaningfully change savings behavior. A randomized trial in the Philippines offered bank clients a savings account that restricted withdrawals until a self-chosen goal was reached. Twenty-eight percent of those offered the product chose to open an account, and after one year those in the treatment group had increased their bank savings by approximately 80 percent relative to controls.[10] Programs such as Save More Tomorrow[11], which use automatic enrollment and inertia rather than binding constraints, are better understood as nudges than as commitment devices in the strict sense.
Health behavior
Evidence on commitment devices for health behavior is more mixed. A field experiment with workers at a US Fortune 500 company found that combining a financial incentive with an optional commitment contract produced lasting increases in gym attendance detectable several years later, though the incentive alone had only modest long-run effects.[12] A commitment contract for smoking cessation in the Philippines increased quit rates, but take-up was modest.[13]
Systematic reviews have found limited and uneven evidence. A meta-analysis of commitment devices for weight loss reported a statistically significant but small short-term effect (a mean difference of 1.5 kg across three trials with 409 participants), with wide confidence intervals and variable study quality.[14] A meta-analysis of behavioral economics interventions for physical activity found a small significant effect during the intervention period that did not persist at follow-up.[15]
Take-up and default
A recurring finding across domains and geographic settings is that voluntary take-up of commitment contracts is low—typically between 10 and 30 percent of those offered—even when the contracts appear effective for those who adopt them.[1][2] One explanation is that only “sophisticated” individuals will seek out commitment, while those unaware of their own self-control problems will not.
Commitment contracts can also backfire. A field experiment with low-income savers in the Philippines found that 55 percent of those who adopted a commitment savings product defaulted and incurred monetary losses. The problem was particularly acute among “partially sophisticated” individuals—those who recognized their self-control problems but underestimated their severity, leading them to select contracts they could not sustain.[16] A similar pattern emerged in a Malawi HIV testing trial, where a substantial fraction of those who chose hard commitments lost their deposits.[17]
History
Origins
The concept of the commitment device emerged independently in two fields in the mid-1950s. In 1956, Thomas Schelling argued in “An Essay on Bargaining” that a negotiator could gain advantage by visibly restricting his own future choices, so that the power to constrain an adversary depends on the power to bind oneself.[18][19] In the same period, Robert Strotz published a formal analysis of time-inconsistent preferences, discussing how a rational agent might wish to constrain their own future behavior.[18][3] Both identified the same core mechanism: an agent can be made better off by having fewer options.
Schelling extended the strategic side of this idea in The Strategy of Conflict (1960), arguing that in nuclear deterrence and international bargaining, visibly eliminating one’s own options could force an adversary to adjust.[18][8][19] Jon Elster developed the self-control side in Ulysses and the Sirens (1979), treating precommitment as a general strategy of rationality and developing the Odysseus story into the central metaphor of his theory.[4]
From strategy to self-control
By the mid-1970s, Schelling began connecting the two traditions. After joining the National Academy of Sciences Committee on Substance Abuse and Habitual Behavior, he applied the strategic logic of commitment to individual self-control, modeling the inner struggle between a person’s “two selves” on the pattern of Cold War conflict.[18] Robert H. Frank’s Passions Within Reason (1988) extended the concept in a different direction, arguing that emotions such as anger and guilt function as evolved commitment devices: a person genuinely prone to anger will credibly retaliate against cheating even when retaliation is costly, because the emotional response is difficult to fake.[20]
References
- ^ a b c d e Bryan, Gharad; Karlan, Dean; Nelson, Scott (2010-09-04). "Commitment Devices". Annual Review of Economics. 2 (2010): 671–698. doi:10.1146/annurev.economics.102308.124324. ISSN 1941-1383.
- ^ a b c d Rogers, Todd; Milkman, Katherine L.; Volpp, Kevin G. (2014-05-28). "Commitment Devices". JAMA. 311 (20): 2065. doi:10.1001/jama.2014.3485. ISSN 0098-7484.
- ^ a b Strotz, R. H. (1955–56). "Myopia and Inconsistency in Dynamic Utility Maximization". Review of Economic Studies. 23 (3): 165–180. doi:10.2307/2295722.
- ^ a b Elster, Jon (1979). Ulysses and the Sirens: Studies in Rationality and Irrationality. Cambridge University Press. ISBN 9780521223881.
- ^ O’Donoghue, Ted; Rabin, Matthew (1999). "Doing It Now or Later". American Economic Review. 89 (1): 103–124. doi:10.1257/aer.89.1.103.
- ^ a b Augenblick, Ned; Boser, Brock; Karlan, Dean; Levine, Conor (2022). Who Chooses Commitment? Evidence and Welfare Implications (Report). NBER Working Paper. doi:10.3386/w26161.
- ^ Loewenstein, George (2005). "Hot-Cold Empathy Gaps and Medical Decision Making". Health Psychology. 24 (4S): S49–S56. doi:10.1037/0278-6133.24.4.S49. PMID 16045419.
- ^ a b Dixit, Avinash K.; Nalebuff, Barry J. (1991). "Ch. 6, Credible Commitments". Thinking Strategically: The Competitive Edge in Business, Politics, and Everyday Life. W. W. Norton. ISBN 978-0-393-31035-1.
- ^ Thaler, Richard H.; Shefrin, H. M. (1981). "An Economic Theory of Self-Control". Journal of Political Economy. 89 (2): 392–406. doi:10.1086/260971.
- ^ Ashraf, Nava; Karlan, Dean; Yin, Wesley (2006). "Tying Odysseus to the Mast: Evidence from a Commitment Savings Product in the Philippines". Quarterly Journal of Economics. 121 (2): 635–672. doi:10.1162/qjec.2006.121.2.635.
- ^ Thaler, Richard H.; Benartzi, Shlomo (2004). "Save More Tomorrow™: Using Behavioral Economics to Increase Employee Saving". Journal of Political Economy. 112 (S1): S164–S187. doi:10.1086/380085.
- ^ Royer, Heather; Stehr, Mark; Sydnor, Justin (2015). "Incentives, Commitments, and Habit Formation in Exercise: Evidence from a Field Experiment with Workers at a Fortune-500 Company". American Economic Journal: Applied Economics. 7 (3): 51–84. doi:10.1257/app.20130327.
- ^ Giné, Xavier; Karlan, Dean; Zinman, Jonathan (2010). "Put Your Money Where Your Butt Is: A Commitment Contract for Smoking Cessation". American Economic Journal: Applied Economics. 2 (4): 213–235. doi:10.1257/app.2.4.213.
- ^ Sheridan, R.; et al. (2019). "The effect of commitment-making on weight loss and behaviour change in adults with obesity/overweight; a systematic review". BMC Public Health. 19: 816. doi:10.1186/s12889-019-7185-3. PMC 6591991. PMID 31234794.
- ^ Berkemeyer, K.; et al. (2023). "Effectiveness of behavioural economics-informed interventions to promote physical activity: A systematic review and meta-analysis". Social Science & Medicine. doi:10.1016/j.socscimed.2023.116299.
- ^ John, Anett (2020). "When Commitment Fails: Evidence from a Field Experiment". Management Science. 66 (2): 503–529. doi:10.1287/mnsc.2018.3236.
- ^ Derksen, Laura; Kerwin, Jason T.; Ordaz Reynoso, Natalia; Sterck, Olivier (2025). "Healthcare Appointments as Commitment Devices". The Economic Journal. 135 (665): 81–118. doi:10.1093/ej/ueae066.
- ^ a b c d Fontaine, Philippe (2024). "Commitment, Cold War, and the Battles of the Self: Thomas Schelling on Behavior Control". Journal of the History of the Behavioral Sciences. 60 (2). doi:10.1002/jhbs.22302.
- ^ a b Schelling, Thomas C. (1960). "Chapter 2: An Essay on Bargaining". The Strategy of Conflict. Harvard University Press. ISBN 0674840313.
- ^ Frank, Robert H. (1988). Passions Within Reason: The Strategic Role of the Emotions. W. W. Norton & Company. ISBN 978-0-393-02604-7.