I Don't Want To Be A Nerd!

The blog of Nicholas Paul Sheppard
Posts tagged as history

What if the future of computing power is finite?

2014-09-07 by Nick S., tagged as history

In the Soapbox column of the July 2014 issue of the IEEE Consumer Electronics Magazine, Craig Hillman suggests "it is thought-provoking to speculate on how the electronics industry will eventually take the form of the automotive industry, a market based on a relatively stagnant technology. Because it will." (Leave No Technology Behind, p. 30). This was prompted by a blog post from Zvi Or-Bach proclaiming the end of Moore's law with an argument that the cost of manufacturing transistors ceased to drop once feature sizes hit 28nm.

Surely, I thought, it's some sort of heresy to suggest that electronics and computer systems will ever reach a plateau from which we can't expect amazing advances in power and price from year to year. Yet physics tells us that there is a lower limit to the physical size of an electronic circuit (the size of a charge carrier), and CPU clock speeds have already been more or less flat for a decade or so.

I suppose that technological optimists might argue — or at least confidently assert — that we'll find ways around these apparent limits, just as free market economists argue that the market will provide solutions to other apparent limits without any need for said economists to understand or provide any actual solutions. CPU clock speeds may have plateaued, the optimist might say, but multi-core technology allowed computing power to continue to increase.

To which a pessimist might say: neither cars nor aeroplanes nor ships increased indefinitely in speed and capacity, so why should electronics be any different? No one doubts that electronics technology is rapidly increasing in power at the present moment — but so did cars and aeroplanes in the first half of the twentieth century, yet they now travel at much the same speed and carry much the same number of people as they did when I was a child thirty years ago.

Any end of increases in computing power might seem depressing or unthinkable for those captured by the bigger-better-faster vision. I happened to read just today that "industry experts agree that fifth-generation (5G) cellular technology needs to arrive by the end of this decade" in the September 2014 issue of IEEE Spectrum (Mobile's Millimeter-Wave Makeover, p. 32) — but what happens to such alleged "needs" if and when computer technology reaches the plateau that Hillman and Or-Bach foresee?

The relatively static state of the internal combustion engine and its accoutrements doesn't seem to prevent millions of car enthusiasts from slavering over stuff like Top Gear. Mechanical engineers still find employment, and many Australians are bitterly disappointed about the imminent departure of our car factories (not because we don't care about cars anymore, but because it's cheaper to import them from elsewhere). Maybe one day computer enthusiasts and engineers will likewise have to learn to live with a boring old industry.

Unlearning the quest for the latest fad

2014-05-16 by Nick S., tagged as buzzwords, education, history

Towards the end of Here Comes Everybody (2008), Clay Shirky writes about how young people and old people differ in adapting to new technologies and circumstances. He seems to think that older people are at a disadvantage because they need to "unlearn" the conventions that they learned when older technologies were in vogue, while young people are ready to take up new technologies from the get-go. On the other hand, he acknowledges that young people are prone to seeing revolutions everywhere they turn.

Shirky might be right within the confines of his discussion, which refers to a particular set of communication and collaboration technologies. I nonetheless think I'd prefer to use the word adapt rather than unlearn: the latter word suggests to me that we've somehow been stuffing our heads full of useless knowledge. Indeed, talk of unlearning seems to me to do at least two disservices to our skills and knowledge of yesteryear.

Firstly, it suggests that those skills and knowledge were pretty superficial to begin with. It's the kind of thinking that presumes that programmers of my vintage, for example, must be totally unable to write Java or mobile applications because we learned to program in C and C++ on desktops. But what we really learned was object-oriented programming and problem-solving, which are just as useful now as they were in the 1980s. Anyone hung up on the name and syntax of a language probably wasn't a very good programmer in the first place.

Secondly, it's a surrender to technological determinism. We place ourselves at the mercy of the latest technology and its purveyors, unable or unwilling to decide for ourselves which technology (or abstention from it) is really the most effective one for our needs.

I read Jared Diamond's The World Until Yesterday (2012) at around the same time, and found his views on aging somewhat more heartening. Diamond argues that younger people have greater vitality and creativity, while older people have greater depth and breadth of knowledge. These qualities, he thinks, ought to complement each other rather than have us all pining to be twenty. Amongst academics, for example, it's the under-forties who produce the brilliant new theories and proofs in narrow fields, while it's the over-forties who synthesise the knowledge across multiple fields. (Admittedly he's unclear on how this applies to less heady occupations like construction work, athletics and hospitality.)

In this vein, one of my students recently asked how the lecturing staff at my institution were able to teach so many different subjects. Because we've had twenty years to learn it all, I suggested. Furthermore (I might have continued had I been talking to Shirky), we don't need to forget everything we know in order to learn some cool new skill or piece of knowledge: we add the new skills to the old.

I suppose that an enthusiast of the latest technology might say that Diamond, being seventy-something, would say that, and I, being forty-something, would agree with him. Then again, Diamond and I might equally say that our critics, being twenty-something, would say the opposite.

The rise/fall/whatever-you-call-it of civilisation

2013-09-25 by Nick S., tagged as history, prediction

I read a couple of articles this week that, without being specifically directed at technological optimists, seemed at odds with the technology-is-advancing-faster-than-ever-before narrative that I've become accustomed to in publications like IEEE Spectrum. The Australian (18 September 2013, p. 29) had Peter Murphy contending that "big ideas in art and science seem like a thing of the past", while Radio National had Ed Finn lamenting that current science fiction typically portrays a pretty grim future.

Peter Murphy sounds like the kind of person who contributes to a narrative that I once saw described (I forget where) as "civilisation has been declining since it started". For him, the good old days were left behind somewhere in the middle of the twentieth century, and we no longer have anything interesting to say. Ed Finn is not such a curmudgeon himself, but draws attention to the trend from the largely utopian science fiction of the mid-twentieth century to the dystopian sort now enjoying popularity. Finn proposes to encourage more inspirational science fiction through a programme known as Project Hieroglyph. I presume that neither of them has been reading IEEE publications, Ray Kurzweil, Kevin Kelly, or any of their ilk, for whom things are (mostly) quite the opposite.

I was struck by the degree to which Murphy's article used the same technique as more euphoric views of technology, however much their conclusions might differ: make a set of assertions about the importance of certain artworks or technologies that are at best subjective and at worst arbitrary, then conclude according to whether you liked the older ones or the newer ones better. Whether things are getting better or worse thus seems to depend largely on whether you prefer Daniel Defoe or Stephen King, or whether you happen to see more ploughs or iPhones.

The most convincing analysis of this sort that I've encountered is the one in Jaron Lanier's You Are Not a Gadget (2010). Writing about music, he argues that no new genres of music have appeared since hip hop in the 1980s, and that no one could tell whether a pop song from the past twenty years was released in the 1990s or the 2000s. In another section, he argues that open source software consists largely of clones of prior commercial software. I'm sure there are plenty of musicians and open source software developers who might argue otherwise, but Lanier's points are at least testable hypotheses.

Given that the importance of any particular technology or piece of art is so subjective, I'm not sure it's really very meaningful to make sweeping statements about whether art or technology is getting better or worse, or faster or slower. The Lord of the Rings, for example, has been immensely influential for generations of fantasy writers and readers, but I don't imagine it means much to writers and readers of, say, romantic comedies. There might nonetheless be more specific statements that could be made, but they would need a much more robust basis than a list of what one individual likes and doesn't like.

Boldly going where many have been before

2013-04-17 by Nick S., tagged as buzzwords, education, history

A couple of weeks ago, The Australian's higher education section quoted Anant Agarwal, president of edX, saying that "education hadn't really changed for hundreds of years" (27 March 2013, p. 26). I don't know which schools and universities Agarwal has visited over the past two or three hundred years, but the statement drew my attention to a tried-and-true technique of would-be revolutionaries: deny that anything that happened before today was of any consequence.

For those dreaming of the day that computers revolutionise education, university lecturers are apparently still getting about in black robes and discussing the finer points of Galenic medicine in Latin with their exclusively white male students. It makes me wonder who's really out of touch here.

Of course modern schools and universities also continue some practices that would be familiar to their mediaeval forebears, including the human teachers, lecturing and tutoring that I take to be the target of on-line educators' scorn. But perhaps there's a reason for this continuity: it works. Would anyone suggest that the Roman alphabet is due for a shake-up just because "it hasn't really changed for hundreds of years"?

Writing about older computer workers' difficulties with finding employment in her book Cyberselfish, Paulina Borsook speculates that older workers might be disadvantaged by the lack of excitement they show when presented with a new buzzword that looks suspiciously like technology they worked with ten or twenty years ago. So we're using thin clients to access our data in the cloud now? Sounds rather like the dumb terminals and mainframes that came up in the history-of-computing discussion in my operating systems class last week. Unhampered by any knowledge (or at least experience) of history, younger workers impress by the excitement they show when encountering an idea for the first time.

Similarly, massive open on-line courseware seems so much more exciting if one has never encountered — or makes a habit of ignoring — the textbooks, video lectures, educational software and on-line learning management systems that existed before it. And Heaven forbid that any of our ancestors ever had a good idea.