Writing in The Social Interface last year, I supposed that mainstream media's preference for Facebook and Twitter over synthetic worlds came down to the numbers involved: the former simply have many more users than any single instance of the latter. A second possible explanation occurred to me after reading Wagner James Au's The Making of Second Life (2008) recently. The book was written at around the time that I heard all those stories of entrepreneurs and companies opening for business in Second Life, and Chapter 10 has a bit to say about their fates.
Au records that Second Life users largely ignored the corporate spaces, preferring to remain in the areas created by ordinary, non-corporate users. The corporations that owned those neglected spaces, one might therefore suppose, have little incentive to talk about Second Life in their own media. Meanwhile, many corporations have thousands of Facebook "likes" and Twitter followers, so why wouldn't they prefer to talk about those? Could it be that Facebook and Twitter's visibility comes about because they turned out to be better homes for major media corporations, or at least because their user communities were more welcoming to said corporations than the user community of Second Life?
Off-hand, I can't think of any way to test such an hypothesis — certainly not from the armchair in which I write this. But I did happen across a couple of observations consistent with it.
Firstly, I recalled my own recent observations about very large "communities": as nice as it sounds for everyone to participate with everyone else, it just isn't possible. We therefore conduct our public business through large institutions, even if few people have much affection for them. Facebook provided a home where Second Life did not, so there the institutions are, and so is everyone else but a few corporation-averse hold-outs.
Secondly, The Register's Richard Chirgwin drew readers' attention to some marketers' lament that up to 80% of the sharing of links and articles occurs via e-mail and text messaging, which marketers have no means of tracking. So the marketers would certainly prefer it if we were all on Facebook. At least one of the marketers involved seems to be so impressed with Facebook et al. that Business Review Weekly quotes her as saying that "dark social [e-mail] is a very interesting development" even though, as Chirgwin observes, people of sufficient age had been using e-mail and text messaging for at least a decade before anyone had heard of what we now call a "social network".
For many of us, I'm sure, the fact that 80% of our communications go unmonitored by marketers is a sign of hope. And we can remind ourselves that we aren't defined solely by our profiles in major media outlets: Second Lives and e-mails aren't failures just because they don't enjoy the media profile of Facebook or Google, any more than my local baker is a failure because he only sells bread to people in my suburb. If Second Life and World of Warcraft entertain millions of people and keep their operators in business, why worry if some other corporation isn't paying much attention to them?
Towards the end of Here Comes Everybody (2008), Clay Shirky writes about the differences that young people and old people face in adapting to new technologies and circumstances. He seems to think that older people are at a disadvantage because they need to "unlearn" the conventions that they learned when older technologies were in vogue, while young people are ready to take up new technologies from the get-go. On the other hand, he acknowledges that young people are prone to seeing revolutions everywhere they turn.
Shirky might be right within the confines of his discussion, which refers to a particular set of communication and collaboration technologies. I nonetheless think I'd prefer the word adapt to unlearn: the latter suggests to me that we've somehow been stuffing our heads full of useless knowledge. Talk of unlearning seems to me to do at least two disservices to our skills and knowledge of yesteryear.
Firstly, it suggests that those skills and knowledge were pretty superficial to begin with. It's the kind of thinking that presumes that programmers of my vintage, for example, must be totally unable to write Java or mobile applications because we learned to program in C and C++ on desktops. But what we really learned was object-oriented programming and problem-solving, which are just as useful now as they were in 1980. Anyone hung up on the name and syntax of a particular language probably wasn't a very good programmer in the first place.
Secondly, it's a surrender to technological determinism. We place ourselves at the mercy of the latest technology and its purveyors, unable or unwilling to decide for ourselves which technology (or abstention from it) is really the most effective one for our needs.
I read Jared Diamond's The World Until Yesterday (2012) at around the same time, and found his views on aging somewhat more heartening. Diamond argues that younger people have greater vitality and creativity, while older people have greater depth and breadth of knowledge. These qualities, he thinks, ought to complement each other rather than have us all pining to be twenty. Amongst academics, for example, it's the under-forties who produce the brilliant new theories and proofs in narrow fields, while it's the over-forties who synthesise the knowledge across multiple fields. (Admittedly he's unclear on how this applies to less heady occupations like construction work, athletics and hospitality.)
In this vein, one of my students recently asked how the lecturing staff at my institution were able to teach so many different subjects. Because we've had twenty years to learn it all, I suggested. Furthermore (I might have continued had I been talking to Shirky), we don't need to forget everything we know in order to learn some cool new skill or piece of knowledge: we add the new skills to the old.
I suppose that an enthusiast of the latest technology might say that Diamond, being seventy-something, would say that, and I, being forty-something, would agree with him. Then again, Diamond and I might equally say that our critics, being twenty-something, would say the opposite.