Imagine walking into a room with two boxes on a table. One contains $1,000, clearly visible. The other is a mystery box. You are told a highly accurate supercomputer has predicted your choice. If it predicted you would take only the mystery box, it placed $1 million inside. If it predicted you would take both boxes, it left the mystery box empty. The prediction was made before you even entered the room, and the boxes were set up accordingly. Do you take both, or just the mystery box? This is Newcomb’s Paradox, a thought experiment that has divided philosophers, mathematicians, and decision theorists for decades.
Key Takeaways
- The Two Camps: The problem creates an even split between "one-boxers," who prioritize the predictor's accuracy, and "two-boxers," who prioritize the immediate financial gain of the $1,000.
- Evidential vs. Causal: One-boxers rely on Evidential Decision Theory, betting on the correlation between their choice and the predictor’s success. Two-boxers rely on Causal Decision Theory, arguing that past predictions cannot be changed by present actions.
- The Paradox of Rationality: The scenario suggests that in certain high-stakes environments, being "rational" in the short term (taking both boxes) may lead to worse outcomes than acting on a pre-commitment.
- Real-World Application: This logic mirrors strategies like Mutually Assured Destruction (MAD) and game theory scenarios, where pre-committing to a seemingly irrational action actually secures a safer future.
The Core Divide: Why Logic Isn't Universal
Newcomb's paradox is famously divisive because it pits two valid, yet contradictory, modes of reasoning against each other. As the philosopher Robert Nozick once noted, the choice seems perfectly obvious to both sides, while the other side appears fundamentally irrational.
"To almost everyone, it is perfectly clear and obvious what should be done. The difficulty is that these people seem to divide almost evenly on the problem."
The One-Boxer Perspective
One-boxers are driven by the weight of evidence. They look at the supercomputer’s track record—thousands of correct predictions—and conclude that their choice is essentially correlated with the contents of the box. By choosing one box, they are aligning themselves with the outcome that leads to the million-dollar reward. For them, it is not about changing the past; it is about recognizing the predictive power of the system they are operating within.
The Two-Boxer Perspective
Two-boxers adhere to the principle of strategic dominance. They argue that the money is either in the box or it isn't. Since the prediction happened in the past, nothing they do now can change the contents. Whatever is in the mystery box, taking both boxes always yields $1,000 more than taking only the mystery box. To them, the one-boxer’s approach is a form of magical thinking—the belief that their current choice can influence a past state.
Decision Theory and the Illusion of Choice
The conflict ultimately boils down to which decision theory one prioritizes. Evidential Decision Theory argues that you should choose the action that provides the best evidence for a good outcome. In this case, choosing one box is strong evidence that you are the type of person who receives the million. Conversely, Causal Decision Theory insists that you should only consider outcomes that your current actions can directly cause.
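The gap between the two theories can be made concrete with a quick expected-value calculation. Here is a minimal sketch; the 99% predictor accuracy is an illustrative assumption, not a figure from the original problem:

```python
# Expected-value comparison of one-boxing vs. two-boxing.
# Assumption: the predictor is right 99% of the time (illustrative figure).

ACCURACY = 0.99
MYSTERY_FULL = 1_000_000   # mystery box, if the predictor foresaw one-boxing
VISIBLE = 1_000            # the transparent box

def edt_value(choice: str) -> float:
    """Evidential Decision Theory: condition the box's contents on the
    choice, since the choice is strong evidence of what was predicted."""
    if choice == "one-box":
        # With probability ACCURACY the predictor foresaw this and filled the box.
        return ACCURACY * MYSTERY_FULL + (1 - ACCURACY) * 0
    else:  # "two-box"
        # With probability ACCURACY the predictor foresaw this and left it empty.
        return ACCURACY * VISIBLE + (1 - ACCURACY) * (MYSTERY_FULL + VISIBLE)

def cdt_value(choice: str, box_is_full: bool) -> float:
    """Causal Decision Theory: the contents are already fixed; the choice
    cannot change them, so evaluate against each fixed state of the world."""
    contents = MYSTERY_FULL if box_is_full else 0
    return contents + (VISIBLE if choice == "two-box" else 0)

print(edt_value("one-box"))   # about 990,000: EDT favors one-boxing
print(edt_value("two-box"))   # about 11,000

# Under CDT, two-boxing dominates in BOTH fixed states of the world:
for full in (True, False):
    assert cdt_value("two-box", full) == cdt_value("one-box", full) + VISIBLE
```

The same numbers make both camps' cases: the evidential calculation favors one-boxing by a wide margin, while the causal calculation shows two-boxing winning by exactly $1,000 in every fixed state of the world.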
This reveals a deeper question: Does free will exist? If a supercomputer can predict your choices with near-perfect accuracy, your decision-making process may be as deterministic as the gears of a clock. Yet we must live our lives as if we are the masters of our actions. Even if our choices are predetermined, the necessity of making them remains a fundamental requirement of being human.
The "Why Ain'cha Rich?" Argument
A common critique of the two-boxer strategy is the "Why Ain'cha Rich?" argument. If you are supposedly being "rational" by maximizing immediate utility, why do you end up with only $1,000 while the "irrational" one-boxers walk away as millionaires? This forces us to re-examine what it actually means to be rational.
Philosophers Allan Gibbard and William Harper suggested that in scenarios like Newcomb’s, the game itself is rigged. If a system rewards irrationality, the "rational" move might be to adopt a set of rules that appear irrational in isolation but yield the best results over time. This is strikingly similar to the Prisoner’s Dilemma in game theory, where individual selfishness often leads to collective ruin, whereas cooperation—a form of pre-commitment—leads to better outcomes for everyone.
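The structural parallel with two-boxing can be seen in the Prisoner's Dilemma payoff matrix. A minimal sketch, using the conventional textbook payoffs (the specific numbers are illustrative, not from the text above):

```python
# Classic Prisoner's Dilemma payoffs (years in prison, so lower is better).
# The values are the conventional textbook numbers, chosen for illustration.
PAYOFFS = {
    # (my_move, their_move): (my_sentence, their_sentence)
    ("cooperate", "cooperate"): (1, 1),
    ("cooperate", "defect"):    (3, 0),
    ("defect",    "cooperate"): (0, 3),
    ("defect",    "defect"):    (2, 2),
}

def my_sentence(me: str, them: str) -> int:
    return PAYOFFS[(me, them)][0]

# Like two-boxing, defection "dominates": whatever the other player does,
# defecting shaves a year off my sentence...
for them in ("cooperate", "defect"):
    assert my_sentence("defect", them) < my_sentence("cooperate", them)

# ...yet mutual defection (2, 2) is worse for both players than mutual
# cooperation (1, 1): dominance reasoning, applied by everyone, produces
# the collectively ruinous outcome.
assert PAYOFFS[("defect", "defect")][0] > PAYOFFS[("cooperate", "cooperate")][0]
```

The dominance argument here is exactly the two-boxer's: defecting is better no matter what the other player does, just as taking both boxes is better no matter what the box contains. And in both games, the agents who follow that impeccable logic end up worse off than those who pre-commit to the "irrational" move.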
Lessons from Mutually Assured Destruction
The logic of Newcomb’s paradox finds a dark, real-world parallel in the Cold War strategy of Mutually Assured Destruction (MAD). The United States and the Soviet Union maintained peace not through disarmament, but through a credible, public commitment to retaliate with nuclear weapons if attacked—even if doing so would lead to global suicide.
In the game of chicken, as game theorists have put it, the best strategy is to visibly take the steering wheel out of your car and throw it out the window.
This is the essence of pre-commitment. By "throwing the steering wheel out the window," a leader makes it impossible to swerve, forcing the opponent to be the one who yields. Much like the one-boxer who is committed to the $1 million outcome, the rational actor in international politics must be seen as someone who will follow through on a high-stakes, seemingly irrational threat to ensure stability.
Conclusion
Newcomb’s paradox is more than a thought experiment; it is a mirror that reflects our own decision-making philosophies. Whether you are a one-boxer who trusts in correlation and pre-commitment, or a two-boxer who insists on the cold logic of causal independence, you are expressing a fundamental stance on how the world works. Ultimately, the most successful people might be those who can act as if they are "programmed" to make the best choices. By building a reputation and adhering to the rules of a "cooperative" life, you turn a single-round game into an iterated one, where the best versions of ourselves are those who have pre-committed to the best possible outcomes.