Social experience machines
While expanding my thoughts on synthetic worlds for The Social Interface, I made a connection between Edward Castronova's concept of migration to synthetic worlds and Robert Nozick's experience machine. Nozick postulates a machine able to give its user any experience he or she desires, but argues that no one would actually want to live in such a machine. Therefore, he argues, people do not subscribe to the utilitarian notion that we care only about the pain and pleasure we experience.
It's important for Nozick's argument that potential users of the experience machine are aware that it simulates experiences, since he argues that potential users would find this simulation dissatisfying irrespective of how good the experiences were. Castronova's synthetic worlds satisfy this criterion since their users are aware of entering and leaving their worlds, and this would be the case even if virtual reality technology advanced to the point that it could provide perfectly realistic experiences.
Assuming that Nozick is correct that a fully informed person would not want to live in an experience machine, the question remains as to what might happen were someone to enter an experience machine without knowing it. Would a person tricked into entering one feel cheated? Fully functioning experience machines don't exist, but I think an argument can be made that certain aspects of them do.
During the discussion that led to my dangerous idea last week, one of my colleagues observed that it felt rewarding to accept connection requests, and rude to decline them. I countered that this was exactly why I'd deleted my LinkedIn profile. Accepting connection requests seemed superficially rewarding, and at first I thought they might lead to something, but this quickly turned to disappointment when I realised that I wasn't actually connected to these people in any meaningful way, and it never led to anything.
For me, LinkedIn was a primitive experience machine that (momentarily) provided the experience of being connected. As Nozick predicted, I got myself out of it once I'd decided that the experience was, in fact, simulated. As Sherry Turkle puts it, it promised friendship but delivered only a performance — and a particularly crude one at that.
I suppose that people who use LinkedIn and other networks might contend that this was merely my particular experience, that they have built genuine connections through it, and that perhaps I wasn't using the tool in the right way to benefit from it. Or maybe it's just not my thing, in the same way that stamp-collecting and dog ownership aren't my thing.
This all sounds plausible enough, and I can neither prove nor disprove it. When pressed, I guess I find the "not my thing" explanation most convincing. Going back to experience machines, though, I only felt cheated once I'd compared the LinkedIn experience with my physical world experience. If I were still in LinkedIn's experience machine, and ignorant of the physical world, might I not be as happy as everyone else in that machine?