I Don't Want To Be A Nerd!

The blog of Nicholas Paul Sheppard
Posts tagged as employment

Some thoughts on the Butlerian Jihad

2015-07-21 by Nick S., tagged as artificial intelligence, employment

Continuing to think about automation and employment while writing my last entry, I recalled the "Butlerian Jihad" that Frank Herbert imagines in the history of Dune (1965). In the far distant future in which the novel is set, the Jihad has resulted in a ban on machines that replicate human mental functions. This ban manifests itself in Dune in the form of human "mentats" trained to perform the computational work that we now associate with machines.

It's been some time since I read Dune, and I don't remember why the Butlerians went on their Jihad, or if Herbert gives a reason at all. But if they feared that thinking machines might make humans redundant, or at least spawn the monumental inequality envisaged by thinkers like Tyler Cowen, Erik Brynjolfsson and Andrew McAfee, could the Butlerians have a point? I imagine that orthodox economists and technologists, including those I've just mentioned, would simply dismiss the Butlerians as a kind of Luddite. But why should we accept machines if they're not doing us any good?

Part of the problem with any such jihad, aside from the violence associated with it in the novels, is that what makes us human is not so clear-cut or obvious as is traditionally presumed. Evolutionary biology argues that we are not so different from other animals, work in artificial intelligence is continually re-drawing the line between computation and what we think of as "intelligent", and neurologists are yet to identify a soul. The introduction of mentats illustrates the computational part of the difficulty: in ridding the galaxy of machines with human-like capabilities, the Butlerians introduced a need for humans with machine-like capabilities. Brynjolfsson and McAfee (I think) also make the point that it isn't just in mental powers that humans distinguish themselves from machines: humans remain better at tasks requiring fine manual dexterity, meaning that robots aren't yet ready to replace pickers and packers, masseurs, and all manner of skilled tradespeople. Any would-be Butlerians have some work to do in defining exactly what it is that they object to.

A second problem is that people differ in what they want to do themselves, and what they want automated. I enjoy making my own beer, for example, but plenty of other people are happy to buy it from a factory that can make it much more efficiently. On the other hand, I'm usually happy to have my camera choose its own settings for focus, shutter speed and the like, whereas I imagine a photography enthusiast might be appalled to leave such things to a machine. Should I smash breweries, or should photographers smash my camera, to preserve the need for the skills that we like to exercise ourselves?

Of course I don't need to smash breweries in order to brew my own beer: I have a non-brewing-related income that leaves me with the time and resources to brew my own beer even if no one else will pay for it. This brings me back to a point I've already come to several times in thinking about automation and work: to what degree should our worth and satisfaction depend on paid employment at all? If machines allowed us to reduce the amount of work we do, freeing up more time and resources to do what we actually want to do, would we have any reason to fear the machines?

Are machines taking our jobs, or are we working despite them?

2015-05-31 by Nick S., tagged as employment

David Tuffley recently asserted in The Conversation that we [humans] need new jobs as the machines do more of our work. I immediately saw something fishy about the article's premise, in which "governments are encouraging healthy older people to postpone retirement and keep working" on one hand, while "jobs are not easy to come by these days" on the other. Tuffley goes on to consider what might happen, and how we might respond, as even more present-day jobs become mechanised. But if jobs are not easy to come by because machines are doing all the work, where does the need to keep humans working come from in the first place?

I suppose that Tuffley developed his premise by accepting two popular narratives, each with its own internal logic, but whose logics conflict such that a naïve combination of the two does not make sense. Many of the commenters on the article (including me) perceive this as a symptom of flaws in the worldview in which the aging-population and machines-are-taking-our-jobs narratives flourish.

Tuffley seems to subscribe to the traditional counter-Luddite view that a growing economy will find new non-automated work for displaced human workers, or at least that it must somehow be made to do so. Many of the commenters are not so sure, though few if any of us have a clear idea of how to bring about an alternative.

It's well beyond the scope of this blog — and probably my whole academic career — to have "the dismal science [economics] ... rebuilt from the ground up", as Graeme Martin's comment suggests. But in a previous entry or two on work and mechanisation, I've looked at certain kinds of work as being satisfying in their own right, and wondered if anyone would want this work to be taken away by machines.

Several months ago, I happened to pick up Simon Birnbaum's Basic Income Reconsidered (2012). The most powerful memory I have of the book is where Birnbaum questions the notion that the worthiness of a member of a society should depend on paid work, which (notoriously) discounts the value of such things as raising children, caring for sick relatives, and cleaning the house. Not to mention all that produsing that we're supposed to be doing. (A "basic income" is a government payment made to every citizen irrespective of the citizen's income — or lack of it — from other sources.)

Whatever one thinks of basic incomes, Birnbaum's perspective gets at the dilemma that I had when contemplating being replaced by technology about a year ago. Markets provide an incentive to replace labourers with technology if the technology can make products less expensively, and this might be a great thing if the labour is boring or dangerous — but where does that leave those of us who derive satisfaction from our work? Especially if society continues to insist, as Tuffley presumes, that its members remain in paid work for forty hours a week even while more and more work is done by machines.

Rather than talk of machines taking jobs, perhaps we ought to be talking of how best to distribute the wealth created by advancing machinery, and how best to use it in pursuing what we really want to do. As some of the commenters noted, thinkers as diverse as John Maynard Keynes and the writers of Star Trek have grappled with these ideas, but they never seem to have caught on in a society that insists we prove our worth with forty hours of drudgery each week.

Bringing your own device, or providing it for someone else?

2015-03-05 by Nick S., tagged as employment, mobile computing

When I touched upon bring-your-own-device schemes in an article about upgrading devices last month, my inner industrial relations consultant was a little troubled by the whole idea: why would I provide equipment for my employer's use at my own expense? I then read Brian M. Gaff's column "BYOD? OMG!" in the February 2015 issue of IEEE Computer (pp. 10-11), in which he provides some advice for employers on managing devices brought into the workplace by employees.

In doing so, Gaff sometimes makes said employees sound very much like suckers providing free equipment to their employers and donating time outside of work hours: BYOD transfers costs from employers to employees, and increases productivity (per dollar, if not per hour) by allowing employees to work at home. Reading that "personally owned devices are typically more advanced compared to those that are employer issued" (p. 10), I further envisaged a workplace version of John Kenneth Galbraith's "private opulence and public squalor", in which BYOD participants flaunt their cool new devices while the workplace's own infrastructure is left to rot.

To be fair, there are benefits in it for employees as well. They get to use devices set up to their own specifications, and there's some convenience in not having to switch between personal devices and work ones. I myself frequently answer work e-mail from my home computer (though that says as much about the casual nature of my employment as anything to do with which computer I like to use). Maybe one could even make an environmental case for the practice insofar as it reduces the number of devices that need to be built (though the perpetual upgrade cycle that feeds BYOD enthusiasm may have exactly the opposite effect).

Apparently pretty much everyone thinks this is all more than fair, because a quick search for "bring your own device" on both Google and Bing fails to bring up anyone complaining about employers transferring costs to employees. Indeed, if Gaff, Google and Bing are to be believed, employers can barely stop employees from bringing their beloved devices to work.

Still, it's not clear to me whether BYOD enthusiasts have consciously rejected any concern over who pays for work to be done, or if they have in fact forgotten to ask the question in their rush to use a favourite device. Even I wouldn't reject BYOD outright over the concern I've noted above — but I would want to be sure that I'm not just providing technology procurement services as a free add-on to my normal duties.

A funny sort of progress

2015-02-05 by Nick S., tagged as commerce, employment

The Conversation's David Glance outlined a curious theory this week, suggesting that "part of Apple's success comes from giving us a sense of progress". Glance conjectures that providing workers with updated hardware and software every year might give them a sense of progress that contributes to job satisfaction, and suggests that companies might even consider paying their staff bonuses with which they can upgrade their own devices in bring-your-own-device schemes.

Glance doesn't address the question of whether or not upgrading devices makes any actual progress towards the goals of either a company or an individual worker. For Apple's purposes, it's enough to give a sense of progress if it keeps the customers coming back for more upgrades. As Erich Heinzle's comment points out, this strategy is generally known as planned obsolescence: an old trick that serves car and computer manufacturers well but has questionable benefits for the rest of us.

A student once told me that he'd grown tired of constantly updating his phone to the latest model, and had given up doing it. I told him, slightly tongue-in-cheek, that it was a sign of maturity. Where a child might grasp for the latest toy, an adult chooses the device that best meets his or her needs at a price that he or she is able to pay. (Indeed, he was studying a subject in which students are supposed to learn how to make informed judgements about what kind of computer equipment meets a set of needs.)

Matthew Tucker's comment alludes to what psychologists call a hedonic treadmill (though Tucker doesn't use the term), in which people chase goals and possessions in the expectation that achieving them will improve their lot, only to find that their happiness shortly returns to its usual level. My student recognised that he was on a hedonic treadmill, and got off it.

I can nonetheless see where Glance is coming from when he writes about the feeling of being left behind when one has to use old equipment while everyone else has, or is presumed to have, the latest model. And upgrading hardware and software can lead to progress if the new versions increase productivity, improve reliability and/or create new opportunities.

Still, serious companies and mature individuals probably want to exercise some caution in interpreting Glance's advice lest they end up on a corporate version of the hedonic treadmill. Glance's article is, after all, mostly about how Apple succeeds, not how its customers succeed. Suppose a company has some money to spend on bonuses. Would the company prefer its bonuses be spent by staff who rush out and buy the latest gadget, or by staff who carefully choose tools that improve the quality, breadth and ease of their work?

Do engineers need worship?

2014-07-22 by Nick S., tagged as employment

The July 2014 issue of IEEE Spectrum (pp. 36-40) has G. Pascal Zachary arguing that engineering needs more heroes. "We live in a hero-worshipping society," he says (p. 38). And so "serious fields that lack serious heroes are seriously disadvantaged."

There is an implicit assumption in this logic that society is right to worship heroes, or at least that it can never be dissuaded from doing so and that engineers therefore have no choice but to play along with it. Zachary doesn't seem to consider the possibility that popular hero-worship might, in fact, be misguided nonsense and that engineers are steering a wiser course in eschewing it.

What might pass for engineering heroes in the popular mind illustrates the potential dangers of hero-worship. I'm pretty sure that most people think of characters like Bill Gates and Steve Jobs when they think of the champions of computing technology, for example. But, as Zachary himself observes, these people are really famous for having built up large companies, not for any engineering achievements. The technology itself was built by large teams of anonymous engineers doing work that most people care little or nothing about, and anyone entering an engineering degree thinking that they're going to spend most of their time making big business decisions is going to be sorely disappointed.

Zachary instead wants engineering heroes that attract people to the profession by allowing potential engineers to believe that "individuals matter in the course of technological history" (p. 39). Of course individuals do matter, but there are millions of them, all mattering in different ways to different people and different projects, and all contributing to advances in technology, knowledge and wealth without any single one of them acquiring a body of worshippers. One might as well be motivated by winning the lottery as by hero status. Indeed, does any profession really want to attract a bunch of narcissists expecting fame and glory for their efforts?

It may be that Zachary is looking for "exemplars" rather than "heroes"; some of his subjects (Louis Pouzin, for example) aren't particularly famous despite the contributions they made to technology. I can see how an exemplar might usefully illustrate the life and work of an engineer without requiring any special heroism or worship. I and other teachers sometimes use personal experience for this purpose, since we have a much more intimate understanding of what happened in the projects we've worked on than of anything that might have been done by Nobel Prize winners or big-name megacorporations.

In fact, I've previously observed that what Spectrum describes as "dream jobs" might not really be so different from what I and a lot of other teachers and engineers do, to very few public accolades. Even the holders of Spectrum's dream jobs aren't famous, however fantastic their work may be. So who needs a hero?