A couple of weeks ago, The Australian's higher education section quoted Anant Agarwal, president of edX, saying that "education hadn't really changed for hundreds of years" (27 March 2013, p. 26). I don't know which schools and universities Agarwal has visited over the past two or three hundred years, but the statement drew my attention to a tried-and-true technique of would-be revolutionaries: deny that anything that happened before today was of any consequence.
For those dreaming of the day that computers revolutionise education, university lecturers are apparently still getting about in black robes and discussing the finer points of Galenic medicine in Latin with their exclusively white male students. It makes me wonder who's really out of touch here.
Of course modern schools and universities also continue some practices that would be familiar to their mediaeval forebears, including the human teachers and lecturing and tutoring that I take to be the subject of on-line educational scorn. But perhaps there's a reason for this continuity: it works. Would anyone suggest that the Roman alphabet is due for a shake-up just because "it hasn't really changed for hundreds of years"?
Writing about older computer workers' difficulties with finding employment in her book Cyberselfish, Paulina Borsook speculates that older workers might be disadvantaged by the lack of excitement they show when presented with a new buzzword that looks suspiciously like technology they worked with ten or twenty years ago. So we're using thin clients to access our data in the cloud now? Sounds rather like the dumb terminals and mainframes covered in the history-of-computing discussion in my operating systems class last week. Unhampered by any knowledge (or at least experience) of history, younger workers impress by the excitement they show when encountering an idea for the first time.
Similarly, massive open on-line courseware seems so much more exciting if one has never encountered — or makes a habit of ignoring — the textbooks, video lectures, educational software and on-line learning management systems that existed before it. And Heaven forbid that any of our ancestors ever had a good idea.
The Spring 2013 issue of IEEE Technology in Society Magazine has Alexander Hayes writing about Google Glass and other wearable technology that he says "is set to revolutionize the manner in which we interact with each other".
Putting yet another revolution aside for a moment, I vaguely recall reading some advice to the effect that writing fiction in the second person is unlikely to succeed. Reading the first few paragraphs of Hayes' article, I found that the same is probably true of non-fiction, and experienced the reason why: addressing someone in the second person makes risky presumptions about what that person thinks.
When Hayes asserts that "you [i.e., me] may agree that [Hayes' experience] is not dissimilar to your current relationship with this disruptive technology," he immediately seems hopelessly incorrect: I think my relationship with mobile phones is very little like his. (Daniel Kahneman, whose work I wrote about in my last entry, has a similar habit, though he usually at least qualifies his second-person assertions with "if you are like most people..." and some empirical research to back this up. And, when it comes to Kahneman's studies, I probably really am like most people.)
Apart from advice on writing, Hayes' article gave me cause to think about two things: the temptation for enthusiasts to project their enthusiasm onto other people, and the distinction between frequent communication and good communication.
When I'm grooving along to the gothic metal and industrial music that I love, for example, it's easy to forget that not everyone appreciates such grim and gloomy noise. I suppose that computer enthusiasts feel the same way: the latest gadget seems so exciting, and the revolution so palpable, that it's hard to imagine anyone being blasé about it.
Hayes certainly seems to be a frequent communicator, to go by his own description of his mobile phone use, yet the opening paragraphs of his article completely misidentified this reader. Of course Hayes had no way of knowing that I was going to be reading his article, but I was nonetheless reminded of the argument made in Susan Cain's recent book Quiet that talking a lot isn't the same as achieving a lot.
Yet excitement and noise attract far more attention than sober reflection and quality time. More charitable commentators might describe such noisy excitement as "infectious enthusiasm", but I wonder if it only infects those who already have said enthusiasm.