Experience Machines Support Ethical Hedonism

Suppose there was an experience machine that would give you any experience you desired. Super-duper neuropsychologists could stimulate your brain so that you would think and feel you were writing a great novel, or making a friend, or reading an interesting book. All the time you would be floating in a tank, with electrodes attached to your brain. Should you plug into this machine for life, preprogramming your life experiences? […] Of course, while in the tank you won’t know that you’re there; you’ll think that it’s all actually happening […] Would you plug in? (Nozick, 44-45)

Robert Nozick’s argument basically boils down to:
1. If all we cared about was pleasure, we would agree to plug into the experience machine.
2. However, we do not want to plug in.
3. Thus, there are things which matter to us besides pleasure.

My Response:
Critics of experience machines do not formalize their intuitions enough. If they did, they’d discover they didn’t actually have a problem with experience machines in their simplest form. Here is a thought experiment which I believe speaks for itself:

Suppose our best AI experts agreed it was safe to create a powerful, benign AGI, and that AGI swiftly built a thriving post-scarcity economy. This all happened 10,000 years ago. You are alive today and face the choice of entering an experience machine. You could remain in base reality, climbing mountains and having experiences there, or you could have those same experiences and more in a simulation where you merely believe you are in base reality. Note that there is nothing more you could do for others in base reality: any good you could do for another person by interacting with them, the AGI could do better over the same period of time. Nor is there anything you could do to secure the future of humanity or of sentient life; the AI is far smarter than you, and the future is already secure under its control. So I ask: why not use the experience machine now? Why not let your family use it? If it creates a subjective reality literally designed to maximize wellbeing, and it reliably does a fantastic job of this, then why not?
