I Don't Want To Be A Nerd!

The blog of Nicholas Paul Sheppard
Posts tagged as buzzwords

Unlearning the quest for the latest fad

2014-05-16 by Nick S., tagged as buzzwords, education, history

Towards the end of Here Comes Everybody (2008), Clay Shirky writes about the differences that young people and old people face in adapting to new technologies and circumstances. He seems to think that older people are at a disadvantage because they need to "unlearn" the conventions that they learned when older technologies were in vogue, while young people are ready to take up new technologies from the get-go. On the other hand, he acknowledges that young people are prone to seeing revolutions everywhere they turn.

Shirky might be right within the confines of his discussion, which refers to a particular set of communication and collaboration technologies. I nonetheless think I'd prefer to use the word adapt rather than unlearn: the latter suggests to me that we've somehow been stuffing our heads full of useless knowledge. Talk of unlearning also seems to me to do at least two disservices to our skills and knowledge of yesteryear.

Firstly, it suggests that those skills and knowledge were pretty superficial to begin with. It's the kind of thinking that presumes that programmers of my vintage, for example, must be totally unable to write Java or mobile applications since we learned to program in C and C++ on desktops. But what we really learned was object-oriented programming and problem-solving, which are just as useful now as they were in 1980. Anyone hung up on the name and syntax of a language probably wasn't a very good programmer in the first place.
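
For what it's worth, the point translates almost mechanically into code. The sketch below is purely my own illustration (the Account class and everything in it is invented for the example, not drawn from any real codebase): a small Java class whose encapsulation and object-oriented design would look entirely familiar to anyone who learned those ideas in C++ years ago, give or take the syntax and the memory management.

    // An illustrative Java class: the encapsulation and object-oriented
    // design here are exactly what a 1990s C++ course taught, minus the
    // manual memory management. All names are invented for the example.
    public class Account {
        private double balance;   // state hidden behind methods, as in C++

        public Account(double openingBalance) {
            this.balance = openingBalance;
        }

        public void deposit(double amount) {
            if (amount <= 0) {
                throw new IllegalArgumentException("deposit must be positive");
            }
            balance += amount;
        }

        public double getBalance() {
            return balance;
        }

        public static void main(String[] args) {
            Account account = new Account(100.0);
            account.deposit(25.0);
            System.out.println("Balance: " + account.getBalance());
        }
    }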

Secondly, it's a surrender to technological determinism. We place ourselves at the mercy of the latest technology and its purveyors, unable or unwilling to decide for ourselves which technology (or abstention from it) is really the most effective one for our needs.

I read Jared Diamond's The World Until Yesterday (2012) at around the same time, and found his views on aging somewhat more heartening. Diamond argues that younger people have greater vitality and creativity, while older people have greater depth and breadth of knowledge. These qualities, he thinks, ought to complement each other rather than have us all pining to be twenty. Amongst academics, for example, it's the under-forties who produce the brilliant new theories and proofs in narrow fields, while it's the over-forties who synthesise the knowledge across multiple fields. (Admittedly he's unclear on how this applies to less heady occupations like construction work, athletics and hospitality.)

In this vein, one of my students recently asked how the lecturing staff at my institution were able to teach so many different subjects. Because we've had twenty years to learn it all, I suggested. Furthermore (I might have continued had I been talking to Shirky), we don't need to forget everything we know in order to learn some cool new skill or piece of knowledge: we add the new skills to the old.

I suppose that an enthusiast of the latest technology might say that Diamond, being seventy-something, would say that, and I, being forty-something, would agree with him. Then again, Diamond and I might equally say that our critics, being twenty-something, would say the opposite.

Retrovolution

2014-04-11 by Nick S., tagged as buzzwords, education

I had intended to wait until completing my Coursera course on university teaching before writing another comment on massive open on-line courses, but today read some words from recent ex-Vice Chancellor Jim Barber in The Australian's higher education supplement (Seven lessons of survival in an online world, 9 April 2014, p. 29) that prompted me to write earlier. Barber, along with some elements of the course I'm studying, seems to believe that there is, and has been for years if not centuries, something terribly wrong with university education, and that the whole business is about to be swept away by fantastic new approaches and technologies that will have our students dancing in the streets with joy, not to mention with heads full of knowledge.

With Coursera's course presented by American-accented lecturers at Johns Hopkins University, I was much reminded of the complaint that "Americans don't understand irony" as I learned about the need for innovative teaching techniques from a talking head with slides, and "discussed" the value of learning in small groups with hundreds of other learners on the discussion boards. Barber himself thinks that anyone "who continues to believe that the purpose of a lecture is to transmit information [needs] to be dispatched to a re-education camp", though he doesn't state what he thinks the purpose of a lecture actually is. (He may mean "class" rather than "lecture", since I take the very definition of the latter to be the oral transmission of information from the speaker to the listeners.)

Teaching Americans about irony and lecturers about communication aside, I actually found the course interesting and informative, with a good balance between delivering established knowledge, enabling student thought and discussion, and providing exercises that put relevant skills into practice. But this leaves me only more puzzled as to what Barber and his fellow revolutionaries are on about: I learned engineering using much the same combination of techniques twenty years ago, and it's no surprise to me that people use them, because they work (at least for me, and for the many students who've successfully completed courses at my current institution). Sure, they're on a web site instead of in a classroom now, but I wonder where the revolutionaries have been if they think that such techniques appear only in science fiction.

Sorel Reisman, writing in the April 2014 issue of IEEE Computer (The Future of Online Instruction, Part 1, pp. 92-93), seems to me to have a much better grip on the state of on-line education than many of its enthusiasts: he observes that MOOCs are simply learning management systems that support large enrolments, and that learning management systems are themselves simply content delivery systems tailored for educational content. He himself thinks that any real advances in on-line education must come from what he calls "adaptive learning", where the learning system adapts to the needs of individual learners. (Coursera's course recommends more or less the same idea under the name "personalisation", but the focus there is on how human teachers can provide it.)
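
For readers who'd like the idea made concrete, here is a minimal sketch of what I take "adaptive learning" to mean, assuming nothing about how Coursera or any real learning management system actually implements it; the class and its exercise lists are invented for the illustration. The system simply tracks a learner's results and chooses the next exercise accordingly, which is the sense in which it "adapts to the needs of individual learners".

    // A toy illustration of adaptive learning: the next exercise is drawn
    // from a pool whose difficulty reflects the learner's results so far.
    // All names and exercises here are invented for the example.
    import java.util.List;

    public class AdaptiveLesson {
        private final List<String> easy = List.of("Add two fractions", "Simplify 4/8");
        private final List<String> hard = List.of("Solve x^2 - 5x + 6 = 0", "Differentiate x^3");

        private int correct = 0;
        private int attempted = 0;

        public void recordResult(boolean wasCorrect) {
            attempted++;
            if (wasCorrect) {
                correct++;
            }
        }

        public String nextExercise() {
            // Adapt: struggling learners get easier material, others are stretched.
            double successRate = attempted == 0 ? 0.0 : (double) correct / attempted;
            List<String> pool = successRate < 0.7 ? easy : hard;
            return pool.get(attempted % pool.size());
        }

        public static void main(String[] args) {
            AdaptiveLesson lesson = new AdaptiveLesson();
            lesson.recordResult(true);
            lesson.recordResult(false);
            System.out.println("Next exercise: " + lesson.nextExercise());
        }
    }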

A recent conversation with an experienced high school teacher suggested to me that the same phenomenon exists in schools as well: every now and again, revolutionaries come to the school and ask teachers to "update" their methods with techniques that teachers have been using for decades, possibly under a different name. Perhaps such practices could use a buzzword of their own, maybe retrovolution?

On the structure of computing revolutions, part 2

2013-11-15 by Nick S., tagged as buzzwords

I was a little surprised to read that "P2P networks are yet another manifestation of the shift of information systems' control to individuals" in an article by Tom Kirkham and colleagues in the September/October 2013 issue of IEEE Security & Privacy (p. 14). Everywhere else, I read that information is moving to the cloud. Cloud enthusiasts might even point out (rightly) that the peer-to-peer social networks offered as examples after that statement are rarely heard of outside technical circles, and do not appear to pose the slightest threat to Facebook or Google.

I suppose that Kirkham and colleagues are trying to leverage the enthusiasm for "disintermediation" and "user-generated content" that made a buzz around 2000-2005 and lives on as "social media" now. Cloud computing and disintermediation aren't necessarily incompatible: individuals can and do create their own content and store it in cloud-based services like web hosts, virtual worlds, Facebook and Twitter. (The particular proposal made by Kirkham et al., though, is very un-cloud-like.) I'll come back to this later.

The same issue of Security & Privacy has Gary T. Marx writing about the line between citizens assisting with law enforcement and citizens pursuing vigilantism (pp. 56-61). Despite the recent prominence of citizen journalism and its success in apprehending suspects after the Boston Marathon bombing, Marx observes that cooperation between law enforcement and the public is hardly new: governments have been encouraging citizens to report crimes for decades, and, in the more distant past, routinely employed citizens as auxiliary law enforcers.

All this had me groaning: is anything ever new? A moment's thought assured me that surely something is, since plenty of technology exists now that didn't exist a hundred years ago. My reaction to the claims of both Kirkham et al. and their opposites in the cloud computing camp is really a reaction to sweeping claims that computing is centralising, or de-centralising, or intermediating, or disintermediating, or converging, or diverging, or whatever. At any one time, surely some things might be de-centralising (like creation of web pages) while others are centralising (like hosting of the same web pages), depending on the most efficient and effective way of using the available technologies. To claim that there is any overall trend seems pretty bold, to say the least.

Whatever technology is doing, however, we remain human and we continue to struggle with rights vs responsibilities, privacy vs accountability, individuality vs community, and a hundred other tensions that likely existed before anyone ever rubbed two sticks together, let alone built a computer. Technology may change the means by which tensions like these are expressed and resolved, but it hasn't made us a whole new species with whole new needs and desires. Facebook may have appeared only ten years ago, for example, but it's hardly the first time anyone had a social network.

I've previously observed that commentators can generate a lot of nonsense by mistaking a change in one aspect of computing for a computing revolution. I should perhaps extend that observation to include mistaking a change in technology for a change in humanity.

Boldly going where many have been before

2013-04-17 by Nick S., tagged as buzzwords, education, history

A couple of weeks ago, The Australian's higher education section quoted Anant Agarwal, president of edX, as saying that "education hadn't really changed for hundreds of years" (27 March 2013, p. 26). I don't know which schools and universities Agarwal has visited over the past two or three hundred years, but the statement drew my attention to a tried-and-true technique of would-be revolutionaries: deny that anything that happened before today was of any consequence.

For those dreaming of the day that computers revolutionise education, university lecturers are apparently still getting about in black robes and discussing the finer points of Galenic medicine in Latin with their exclusively white male students. It makes me wonder who's really out of touch here.

Of course modern schools and universities also continue some practices that would be familiar to their mediaeval forebears, including the human teachers and lecturing and tutoring that I take to be the subject of on-line educational scorn. But perhaps there's a reason for this continuity: it works. Would anyone suggest that the Roman alphabet is due for a shake-up just because "it hasn't really changed for hundreds of years"?

Writing about older computer workers' difficulties in finding employment in her book Cyberselfish, Paulina Borsook speculates that older workers might be disadvantaged by the lack of excitement they show when presented with a new buzzword that looks suspiciously like technology they worked with ten or twenty years ago. So we're using thin clients to access our data in the cloud now? Sounds rather like the dumb terminals and mainframes that we covered in the history-of-computing discussion in my operating systems class last week. Unhampered by any knowledge (or at least experience) of history, younger workers impress by the excitement they show when encountering an idea for the first time.

Similarly, massive open on-line courseware seems so much more exciting if one has never encountered — or makes a habit of ignoring — the textbooks, video lectures, educational software and on-line learning management systems that existed before it. And Heaven forbid that any of our ancestors ever had a good idea.

On the structure of computing revolutions

2012-11-23 by Nick S., tagged as buzzwords, mobile computing, prediction

I recently read an article mocking its own authors for failing to recognise that the iPhone (or some particular version of it) would instigate a revolution. Unfortunately I didn't record where I read this, and I haven't been able to find it again since thinking further about what constitutes a "revolution", and what it might feel like to live through one.

My immediate reaction upon reading the article was: are you sure you weren't right the first time? I, at least, don't feel like I've been through a revolution any time in the past ten years, or, indeed, my entire life. Sure, technology has steadily improved, but I've only ever perceived it as "evolution". I have no doubt that someone catapulted into 2012 from the time of my birth in the 1970s would find much to be amazed about. But, having lived through all of the intervening years myself, I had the much more mundane experience of seeing the changes one product at a time.

This raises the question: how much change is required, and how sudden does it need to be, to constitute a "revolution"? When talking of the history of computing to my computer systems students, I often talk of "trends" from analogue to digital, from stand-alone computers to networked ones, and from single-core to multi-core CPUs. I say "trend" because I perceive the changes as a gradual process of older products being replaced one-by-one by newer products. But proponents of the iPhone (or digital or network or multi-core) revolution presumably perceive the changes as one big leap from existing products to a spectacular new entrant. (Either that, or they use the word "revolution" to mean "any perceptible change".)

Now, many small changes may add up to a big one. Someone of my mind born in Britain in 1800, say, might have observed machines or factories appearing one at a time over his or her lifetime. But that person's lifetime now seems short compared to the span of human history, and we consequently refer to that period as the Industrial Revolution. Still, I suspect that future historians will be looking at more than iPhones when they decide what to call the current period.

One of my students foreshadowed the taxonomic problems awaiting future historians when he observed to me that the articles he had been reading disagreed about what era of computing we currently enjoyed. I forget the exact list of candidate eras, but one might have been the "mobile era" and another the "network era", and so on. Off the cuff, I suggested two explanations: firstly, that his sources were talking crap, and, secondly, that his sources were talking about two different aspects of computing.

The two explanations might not be mutually exclusive. Perhaps the iPhone revolutionised mobile telephony/computing for some definition of "revolution", but I didn't notice this revolution because I do relatively little telephony and mobile computing. But the iPhone didn't revolutionise other aspects of computing -- let alone biotechnology or space travel or any of numerous other technologies of the modern period -- so attributing a broader revolution to it would seem to be a load of crap.