I Don't Want To Be A Nerd!

The blog of Nicholas Paul Sheppard
Posts tagged as mobile computing

Bringing your own device, or providing it for someone else?

2015-03-05 by Nick S., tagged as employment, mobile computing

When I touched upon bring-your-own-device schemes in an article about upgrading devices last month, my inner industrial relations consultant was a little troubled by the whole idea: why would I provide equipment for my employer's use at my own expense? I then read Brian M. Gaff's column BYOD? OMG! in the February 2015 issue of IEEE Computer (pp. 10-11), in which he provides some advice for employers in managing devices brought into the workplace by employees.

In doing so, Gaff sometimes makes said employees sound very much like suckers providing free equipment to their employers and donating time outside of work hours: BYOD transfers costs from employers to employees, and increases productivity (per dollar, if not per hour) by allowing employees to work at home. Reading that "personally owned devices are typically more advanced compared to those that are employer issued" (p. 10), I further envisaged a workplace version of John Kenneth Galbraith's "private opulence and public squalor" in which BYOD participants flaunt their cool new devices while the workplace's own infrastructure is left to rot.

To be fair, there are benefits in it for employees as well. They get to use devices set up to their own specifications, and there's some convenience in not having to switch between personal devices and work ones. I myself frequently answer work e-mail from my home computer (though that says as much about the casual nature of my employment as anything to do with which computer I like to use). Maybe one could even make an environmental case for the practice insofar as it reduces the number of devices that need to be built (though the perpetual upgrade cycle that feeds BYOD enthusiasm may have exactly the opposite effect).

Apparently pretty much everyone thinks this is all more than fair, because a quick search for "bring your own device" on both Google and Bing fails to bring up anyone complaining about employers transferring costs to employees. Indeed, if Gaff, Google and Bing are to be believed, employers can barely stop employees from bringing their beloved devices to work.

Still, it's not clear to me whether BYOD enthusiasts have consciously rejected any concern over who pays for work to be done, or if they have in fact forgotten to ask the question in their rush to use a favourite device. Even I wouldn't reject BYOD outright over the concern I've noted above — but I would want to be sure that I'm not just providing technology procurement services as a free add-on to my normal duties.

Image

2014-11-15 by Nick S., tagged as experience, mobile computing

Having built up a collection of electronic books and magazines to be read, and wanting to save space in my pack on a recent hiking trip, I decided to load all of the books and magazines into my phone instead of taking paper reading material on the trip. This was fairly effective for its purpose, but left me feeling slightly ashamed when I found myself sitting outside my tent with my phone (reading), looking for all the world like someone who'd rather spend time with a phone than with the natural environment I'd come to see.

Now, what I was doing with the phone was essentially the same as what I would do with a book, and I bring books with me whenever I travel. I suppose that one might also argue that reading books in hotel rooms or camp sites is wasting time that could be better spent experiencing the locale that one has come to visit, but I find reading indispensable for passing the time on aeroplanes, trains and buses, and for relaxing at the end of the day. For that matter, I also check my e-mail and answer phone calls while travelling, albeit far less frequently than I normally would. So why not use the phone for the same purpose?

In my mind, at least, I guess there's a great difference in the image projected by using a mobile phone as compared to a book. Sure, I might only be reading, but with a phone I could be checking in with work or providing banal second-hand experiences to my friends — and perhaps I shortly will be if I become accustomed to using the phone. But a book can only be read, and anyone seeing me with a book knows exactly what I must be doing.

Now, why should I care what everyone thinks I'm doing anyway? Plenty of people respond with incredulity when I say I'm planning to walk or catch a bus where my audience would take a car, but I just explain to them that it's part of the adventure. Yet in doing this I guess I am trying to project an image of someone who isn't bound up with his technological devices, and enjoys spending time without them. I wouldn't like to think that I'd be doing something so crude as trying to be popular or conventional, but I am nonetheless looking after my image.

One evening later in the trip, I did receive a phone call from a friend. While I was a little surprised that the phone had reception at my camp site, I thought nothing of answering it until I started thinking about this blog entry. So perhaps I am just as much at the beck and call of my devices as the next person after all, at least when I'm not concentrating on resisting them.

On the structure of computing revolutions

2012-11-23 by Nick S., tagged as buzzwords, mobile computing, prediction

I recently read an article mocking its own authors for failing to recognise that the iPhone (or some particular version of it) would instigate a revolution. Unfortunately I didn't record where I read this, and I haven't been able to find it again since I began thinking about what constitutes a "revolution", and what it might feel like to live through one.

My immediate reaction upon reading the article was: are you sure you weren't right the first time? I, at least, don't feel like I've been through a revolution at any time in the past ten years or, indeed, in my entire life. Sure, technology has steadily improved, but I've only ever perceived it as "evolution". I have no doubt that someone catapulted into 2012 from the time of my birth in the 1970's would find much to be amazed about. But, having lived through all of the intervening years myself, I had the much more mundane experience of seeing the changes one product at a time.

This raises the question: how much change is required, and how sudden does it need to be, to constitute a "revolution"? When discussing the history of computing with my computer systems students, I often speak of "trends" from analogue to digital, from stand-alone computers to networked ones, and from single-core to multi-core CPUs. I say "trend" because I perceive the changes as a gradual process of older products being replaced one-by-one by newer products. But proponents of the iPhone (or digital or network or multi-core) revolution presumably perceive the changes as one big leap from existing products to a spectacular new entrant. (Either that, or they use the word "revolution" to mean "any perceptible change".)

Now, many small changes may add up to a big one. Someone of my mind born in Britain in 1800, say, might have observed machines or factories appearing one at a time over his or her lifetime. But that person's lifetime now seems short compared to the span of human history, and we consequently refer to that period as the Industrial Revolution. Still, I suspect that future historians will be looking at more than iPhones when they decide what to call the current period.

One of my students foreshadowed the taxonomic problems awaiting future historians when he observed to me that the articles he had been reading disagreed about what era of computing we currently enjoyed. I forget the exact list of candidate eras, but one might have been the "mobile era" and another the "network era", and so on. Off the cuff, I suggested two explanations: firstly, that his sources were talking crap, and, secondly, that his sources were talking about two different aspects of computing.

The two explanations might not be mutually exclusive. Perhaps the iPhone revolutionised mobile telephony/computing for some definition of "revolution", but I didn't notice this revolution because I do relatively little telephony and mobile computing. But the iPhone didn't revolutionise other aspects of computing -- let alone biotechnology or space travel or any of numerous other technologies of the modern period -- so attributing a broader revolution to it would seem to be a load of crap.

Why is it so boring to use the right tool for the job?

2012-11-18 by Nick S., tagged as buzzwords, mobile computing

In thinking about both tablet PCs and Alone Together over the last month or so, I noted the paradigm of using the right tool for the job. To recommend using the right tool for the job seems fairly banal, but I wondered if my perceived need to recommend it reflects a contrary view in which some universal tool appropriate to all uses already exists, or will shortly exist.

Henry Jenkins refers to this contrary view as "the black box fallacy" in his book Convergence Culture. I find it hard to identify any particular person who propagated the black box fallacy -- or dream, if you disagree with Jenkins and me -- and I can't imagine anyone owning up to a statement as simplistic as "device X is all we will ever need". Yet the black box idea seems implicit in utopian (and dystopian) narratives like the one implied by questions such as "Have digital tablets become essential?"

To be fair to anyone anticipating the arrival of a black box, there are presumably some limits in mind, albeit unstated and vague. Surely no one foresees a single black box performing all the functions of a computer, a vehicle, an oven, a refrigerator and a washing machine! But, even if we restrict the imagined functions of a black box to those currently performed by microelectronics, why expect a single box when there is plainly a whole host of different boxes on the market?

I suppose that the hype and excitement surrounding a new device tends to drown out news of existing devices, giving a false and unintended impression that the new device is far more important and interesting than the old ones. Presumably not even the most enthusiastic supporters of smartphones or tablet PCs believe that such devices are about to replace server farms or home theatres, for example. But the features of server farms and home theatres are likely to be far from the mind of someone enthusing over the latest mobile device.

The gradations between phones, smartphones, tablets, netbooks, laptops and desktops are more subtle, though. If desktop computers were only introduced in 2012, after we had been accustomed to mobile telephony and portable computing, could we be so amazed by their computing power, large screens and keyboards as to forget that they aren't very mobile?

Tablets and the elusive black box of computing

2012-09-30 by Nick S., tagged as buzzwords, mobile computing, prediction

The Conversation last week included Roland Sussex wondering if digital tablets have become essential. His answer seems to be "no", since he observes they aren't very good for textual input and aren't sufficiently robust for use by children. He eventually comes to the rather inane conclusion that "for what they do well they are fine." For things they do badly, we still need other devices.

For my part, the answer is obviously "no" since I don't have a tablet and have yet to drop out of society, or even be inconvenienced in any way. I have a netbook that I find quite useful for reviewing lecture material and drafts while I'm on the train. Perhaps a tablet would be better for reading books and magazines (which I also do), but I do enough typing to feel that a device with a keyboard is the most appropriate tool for the job. See my comments on the article for my full argument.

I sometimes wonder if all the fuss about tablets (and cloud computing and any number of other buzzwords that have appeared over the years) is driven by a self-fulfilling prophecy in which everyone buys tablets because everyone says tablets are the way of the future. Did Steve Jobs anticipate the market for tablets when he introduced the iPad, or did the market buy tablets because a charismatic and influential figure anticipated them? After all, tablets of various sorts existed long before the iPad: Apple itself announced the Newton in 1992, while Microsoft introduced Windows for Pen Computing in 1991 and Windows XP Tablet PC Edition in 2002.

Now, it could well be that technology just wasn't up to the task of making tablets work in the days of the Newton and Windows for Pen Computing, and advances in technology have now made it the right time to try again. Sun Microsystems and others were eager promoters of "network computing" and "thin clients" in the 1990's, for example, but the networks of the time didn't have the capacity or connectivity to support what we now call "cloud computing". With increased network capacity and connectivity, the 2010's might be a more fertile ground for similar ideas.

Whatever the case, it's hard to see us escaping from the boring old paradigm of using the right tool for the job. Henry Jenkins refers to imaginations of a single, unified computing/communications device as "the black box fallacy" in Convergence Culture. As several of the comments on Sussex's article observe, it's not that tablets lack the technological sophistication of Jenkins' black box, it's that ergonomics dictates different tools for different tasks. And even if some future tablet -- or smartphone or wearable computer or microchip implant -- could somehow meet all of one's computing and communications needs, would it make a very good refrigerator?