I Don't Want To Be A Nerd!

The blog of Nicholas Paul Sheppard
Posts tagged as education

Should universities lead or follow technological trends?

2015-02-20 by Nick S., tagged as education

The Australian's Higher Education section this week either presented some very strange research, or made a very strange presentation of some research, in claiming that Twitter is the least used online resource (18 February 2015, p. 30). (Less used than www.nps.id.au, are you sure? I laughed.) The article doesn't clearly identify the study alleged to have discovered this and I wasn't able to find it via a search engine, so I can only go by the article's presentation here.

As the article has it, Twitter is "the social media platform of choice for academics, journalists and a host of other professionals" but "barely rates as an educational tool". This is based on a survey showing that only 15% of participating students found Twitter useful in their university studies.

To my mind, the most obvious explanation for this is that Twitter just doesn't meet the needs of university education. As far as I know, it was never designed for this purpose, so it's hardly surprising that people don't use it as such. Refrigerators, say, probably get even less use in university courses, and no one would expect anything else given that refrigerators were never designed for educating people.

The article instead quotes the study's lead author, Neil Selwyn, speculating that the finding "could be seen as a negative for universities [since Twitter] is where the technological generations are having conversations and finding stuff out." The underlying assumption seems to be that the Cool Kids are using Twitter, and universities might not be cool if they don't use it too.

Well, students probably use refrigerators quite a bit too, but does that mean that it would be useful to have one in my classroom? If Twitter is to be accepted as an educational tool, educators need to be convinced of some educational purpose in using it. Those who do things in order to be cool are more likely to be described as "try-hards" than "innovators".

And are the Cool Kids really using Twitter anyway? According to the article, nearly all students are actually using learning management systems, on-line libraries and on-line videos — and why wouldn't they, given that all these tools have well-established educational uses? The article itself acknowledges that the students are all aware of Twitter, they just don't use it for this particular purpose. Maybe the article could just as meaningfully have read "Twitter barely rates as an educational tool, yet is the social media platform of choice for academics, journalists and a host of other professionals."

On technological determinism and classrooms

2014-12-08 by Nick S., tagged as education

At the same time that I was complaining about technological determinism and law(non)making in my previous entry, The Conversation published an article from Joanne Orlando that I at first took to be a quite different form of technological determinism. Orlando disputes the importance of remembering information in the classroom (though it isn't clear to me who actually claims that it is the high point of learning), and along the way claims that electronic devices can be just as useful, if not more so, than handwriting in classrooms.

According to Orlando, a character who records photos, video and audio of a presentation "can use their digital notes to create something new that builds on the topic", while another character who makes handwritten notes finds this "not so easy". For Orlando, studies showing that handwritten notes provide better recall are beside the point: the character with the recording devices can achieve the same or better by building something out of the recordings.

Orlando doesn't say much about what ensures that the device-wielding character actually builds on the topic rather than simply leaves the recordings where they lie, and doesn't explain at all what stops the pen-wielding character from also building on the topic. I therefore took Orlando to be claiming that the mere availability of electronic recording devices led to intelligent processing and re-combination of recordings, as might be claimed by an enthusiastic proponent of mash-ups and re-mixes.

In writing a response to the article, and re-reading Orlando's article several times in the process, I realised two things. Firstly, Orlando probably didn't mean to claim that re-mixes and mash-ups are necessarily intelligent or useful, only that it is possible for them to be. This point, however, is obscured by conflating the actions of memorisation and of constructing new knowledge. Coming to this understanding through the process of writing a blog entry illustrates exactly what Orlando wants to happen, but might not have happened had I simply left Orlando's article on its web site.

Orlando acknowledges that recall of a certain amount of basic information is, in fact, necessary for mastery of a topic as well as day-to-day business. Having a dictionary of French in one's pocket, for example, does not make one a fluent speaker of French, no matter how good one is at looking up words. For memorising information of this sort, writing it out is surely better than merely making an electronic recording, as the studies cited in Orlando's article say.

Nonetheless, Orlando is correct to say that electronic recordings can be of use in constructing new knowledge — I used The Conversation's web site rather than a handwritten copy of Orlando's article in developing this very blog entry — and that memorisation of useful facts may happen along the way. My experience matches that of Cat Brown, however, whose comment points out that many students' use of electronic material is far from the ideal that Orlando imagines, being to "simply regurgitate chunks of undigested facts that google has delivered to their computers". (No doubt students can write out similarly undigested facts with a pen, though maybe they'd at least remember some of them.)

Teachers with experiences like Cat Brown's and mine may be tempted to subscribe to a form of technological determinism that is opposite to the one that I initially read into Orlando's article: electronic devices lead to unthinking reproduction of search engine results. I now suppose that Orlando meant to critique exactly this view. The real question is not whether pens or cameras and microphones result in better learning, but how teachers get their students to use their tools intelligently.

Should coding be taught in school?

2014-11-03 by Nick S., tagged as education

Malcolm Turnbull, Australia's Minister for Communications, recently gave a speech calling for "coding" to be included in the school curriculum in Australia, arguing that "instead of teaching students how to be passive consumers of technology or how to use Microsoft Word or other proprietary software, our educators should be teaching students how to create, how to code". He backs up his views with a warning from the Australian Computer Society "that any delay to the teaching of coding would put students at a significant disadvantage from their peers in the UK", and the opinion of some unnamed advisors who "have compared the importance of coding to that of literacy and numeracy".

Commenters over at The Register, where I first saw the story, aren't so sure. Preparing students for creating computer technology sounds like a noble enough goal, but how realistic is it? And how necessary?

For a start, any contention that coding might be as fundamental a skill as literacy and numeracy is surely absurd, since literacy and numeracy are both pre-requisites to the ability to write code. To judge by my experience of teaching software development at university level, even many students who have chosen to enrol in computer science degrees find coding difficult, so why expect primary school students to make much headway with it?

Still, one can imagine teaching something simpler than what we teach at university level. In my school days, we had BASIC and Logo, and later Pascal, for those of us wealthy enough to have access to computers (hm... maybe this coding-in-school stuff is not such a new idea after all). Learning some BASIC and Pascal in high school probably helped me identify an aptitude for programming, which ultimately led me to computer engineering (though I didn't actually make this choice until I'd studied the basics of all engineering in my first year).

Turnbull acknowledges that not everyone who studies coding in school will go on to be a software developer, just as not everyone who studies English will go on to write a novel. But I can see at least a prima facie case for some coding in school, even if it's a very limited form of coding compared to what professional software developers do, and it comes somewhat later than what its most enthusiastic proponents seem to imagine.

Yet, if the ability to create computer technology is a worthy goal, what about the ability to create cars, roads, electrical power distribution systems, agriculture, and all of the other technologies that are essential to industrialised countries? I've often thought that if education departments listened to every professional society and trade association's suggestion that its pet subject be included in the school curriculum, we'd finish high school just in time to retire.

Since not everyone will go on to become a software developer, the real purpose of any coding in school cannot be to give all of us the ability to create computer technology. For me, it was part of making an informed decision about what to pursue after I'd completed high school, and this seems reasonable enough a justification as far as it goes. But determining the value of coding to society at large, relative to other things that might be taught in school, requires input from a lot more than just software developers and other computer enthusiasts.

Following MOOCs on the Gartner Hype Cycle

2014-07-18 by Nick S., tagged as education, prediction

I was a little surprised this week to find The Conversation's David Glance writing of the MOOC [Massive Open Online Course] revolution that never happened. Firstly, I've previously associated Glance with revolutionary views of MOOCs. Secondly, the term "MOOC" has only been around a short while and it seems premature to declare the whole thing over, as Alan W. Shorter's comment points out. It seems that Glance has moved from Gartner's Peak of Inflated Expectations through to the Trough of Disillusionment during the two years or so that MOOCs have existed. Radio National's Antony Funnell also reported a sobering of rhetoric from MOOC enthusiasts including Anant Agarwal, CEO of edX.

One supposes that wild-eyed enthusiasts who scale the Peak of Inflated Expectations are setting themselves up for a fall into the Trough of Disillusionment when the technology fails to deliver on those expectations, as the names suggest. More sober commentators, such as those who appeared on Radio National, strive to go straight to what Gartner calls the Slope of Enlightenment, leading to the Plateau of Productivity. Gartner's Hype Cycle doesn't seem to account for technologies for which such a plateau might not exist at all — electronic cash, flying cars and videophones come to mind — but even identifying a technology as having limited value is enlightenment of sorts. It remains to be seen what sort of Plateau of Productivity arises from MOOCs, if one arises at all.

The commentary in both The Conversation and Radio National pieces identifies two key points that seem to have been well-known to sober commentators from the beginning of MOOCs, but overlooked by revolutionaries. Firstly, as Gavin Moodie frequently points out, very few university entrants have the intellectual independence required to master a topic without the guidance of a teacher. I suspect that this also contributes to findings reported on Radio National that 83% of MOOC participants are already highly educated — presumably, these people have already become the "independent learners" who Moodie argues to be the only ones likely to benefit from MOOCs. Secondly, what MOOCs provide isn't actually all that new, as experienced on-line educators like David White (on Radio National) and Sorel Reisman can tell you.

None of this is to say that MOOCs are necessarily useless, or that they'll never arrive at some Plateau of Productivity in a niche for which they are suited. I found the course that I tried interesting and informative — but, having already gained a PhD, I'm hardly the kind of fresh new-model student that MOOC enthusiasts expect to abandon universities. MOOC developers and users just need a bit more toiling on the Slope of Enlightenment instead of admiring the scenery on the Peak of Inflated Expectations.

Unlearning the quest for the latest fad

2014-05-16 by Nick S., tagged as buzzwords, education, history

Towards the end of Here Comes Everybody (2008), Clay Shirky writes about the differences that young people and old people face in adapting to new technologies and circumstances. He seems to think that older people are at a disadvantage because they need to "unlearn" the conventions that they learned when older technologies were in vogue, while young people are ready to take up new technologies from the get-go. On the other hand, he acknowledges that young people are prone to seeing revolutions everywhere they turn.

Shirky might be right within the confines of his discussion, which refers to a particular set of communication and collaboration technologies. I nonetheless think I'd prefer to use the word adapt rather than unlearn: the latter word suggests to me that we've somehow been stuffing our heads full of useless knowledge. Talk of unlearning seems to me to do at least two disservices to our skills and knowledge of yesteryear.

Firstly, it suggests that those skills and knowledge were pretty superficial to begin with. It's the kind of thinking that presumes that programmers of my vintage, for example, must be totally unable to write Java or mobile applications since we learned to program in C and C++ on desktops. But what we really learned was object-oriented programming and problem-solving, which are just as useful now as they were in 1980. Anyone hung up on the name and syntax of a language probably wasn't a very good programmer in the first place.
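To illustrate what I mean (a sketch of my own, not anything from Shirky, with the class and its names invented purely for the example): the kind of class we wrote in C++ back then looks much the same in Java now, because the underlying idea of hiding state behind methods hasn't changed.

    // A minimal Java class illustrating encapsulation, one of the
    // object-oriented ideas that carries over directly from C++.
    public class Account {
        private double balance;   // state hidden behind methods, as in a C++ class

        public Account(double openingBalance) {
            this.balance = openingBalance;
        }

        public void deposit(double amount) {
            if (amount <= 0) {
                throw new IllegalArgumentException("deposit must be positive");
            }
            balance += amount;
        }

        public double getBalance() {
            return balance;
        }
    }

The details differ (no header files, garbage collection instead of destructors), but the design questions, such as what to hide and what to expose, are the ones we were already answering decades ago.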

Secondly, it's a surrender to technological determinism. We place ourselves at the mercy of the latest technology and its purveyors, unable or unwilling to decide for ourselves which technology (or abstention from it) is really the most effective one for our needs.

I read Jared Diamond's The World Until Yesterday (2012) at around the same time, and found his views on aging somewhat more heartening. Diamond argues that younger people have greater vitality and creativity, while older people have greater depth and breadth of knowledge. These qualities, he thinks, ought to complement each other rather than have us all pining to be twenty. Amongst academics, for example, it's the under-forties who produce the brilliant new theories and proofs in narrow fields, while it's the over-forties who synthesise the knowledge across multiple fields. (Admittedly he's unclear on how this applies to less heady occupations like construction work, athletics and hospitality.)

In this vein, one of my students recently asked how the lecturing staff at my institution were able to teach so many different subjects. Because we've had twenty years to learn it all, I suggested. Furthermore (I might have continued had I been talking to Shirky), we don't need to forget everything we know in order to learn some cool new skill or piece of knowledge: we add the new skills to the old.

I suppose that an enthusiast of the latest technology might say that Diamond, being seventy-something, would say that, and I, being forty-something, would agree with him. Then again, Diamond and I might equally say that our critics, being twenty-something, would say the opposite.