Having built up a collection of electronic books and magazines to be read, and wanting to save space in my pack on a recent hiking trip, I decided to load all of the books and magazines into my phone instead of taking paper reading material on the trip. This was fairly effective for its purpose, but left me feeling slightly ashamed when I found myself sitting outside my tent with my phone (reading), looking for all the world like someone who'd rather spend time with a phone than with the natural environment I'd come to see.
Now, what I was doing with the phone was essentially the same as what I would do with a book, and I bring books with me whenever I travel. I suppose that one might also argue that reading books in hotel rooms or camp sites is wasting time that could be better spent experiencing the locale that one has come to visit, but I find reading indispensable for passing the time on aeroplanes, trains and buses, and for relaxing at the end of the day. For that matter, I also check my e-mail and answer phone calls while travelling, albeit much less frequently than I normally do. So why not use the phone for the same purpose?
In my mind, at least, I guess there's a great difference in the image projected by using a mobile phone as compared to a book. Sure, I might only be reading, but with a phone I could be checking in with work or providing banal second-hand experiences to my friends — and perhaps I shortly will be if I become accustomed to using the phone. But a book can only be read, and anyone seeing me with a book knows exactly what I must be doing.
Now, why should I care what everyone thinks I'm doing anyway? Plenty of people respond with incredulity when I say I'm planning to walk or catch a bus where my audience would take a car, but I just explain to them that it's part of the adventure. Yet in doing this I guess I am trying to project an image of someone who isn't bound up with his technological devices, and enjoys spending time without them. I wouldn't like to think that I'd be doing something so crude as trying to be popular or conventional, but I am nonetheless looking after my image.
Later, one evening, I did receive a phone call from a friend. While I was a little surprised that the phone had reception at my camp site, I thought nothing of answering it until I started thinking about this blog entry. So perhaps I am just as much at the beck and call of my devices as the next person after all, at least when I'm not concentrating on resisting them.
Shortly before I hit the "publish" button on my previous entry, I read through the November 2014 issue of APC Magazine (yes, in October). The feature article, How to Hack Everything (p. 29 ff.), espoused a view aligned with mine in that it encouraged computer owners to put in the effort required to understand and customise their devices, but also quite different in that such hacking was promoted with "unlock extra features, better performance and more with these hardware secrets". Customisations of the latter sort are of great interest to computer technologists — especially when the article got to learning "how Wi-Fi and the Web are hacked" with packet sniffers, cross-site scripting and the like — but surely of little direct interest to anyone else.
This is all to be expected from a magazine whose main business is reviewing and investigating computer technology for readers with a high level of technical expertise. But it did cause me to pause before I published my entry, add "Part 1" to the title, and make a note to come back for a "Part 2" that contrasted the hacker's view of customisation with what I was imagining.
For APC and others wanting to assert the more mythologised meaning of the word, "hacking" is about understanding computer technology and bending it to one's will, more or less for its own sake. The goals of APC's hacks, for example, include overclocking CPUs, installing software on WiFi routers, and automating one's "online life" to no clear purpose.
Some of these, such as obtaining root access to smartphones, may be precursors to achieving something that is (or perhaps should be) of interest to ordinary people. I recently installed CyanogenMOD, for example, in order to remove the numerous applications that the manufacturer had pre-installed on my phone, but for which I have no use. Surely no one (apart from a phone manufacturer, I suppose) would say that such a situation is ideal: the folks who produced CyanogenMOD, and people who use it, need to employ a deep understanding of computer technology in order to achieve something that anyone can do using the standard install/de-install facility of a desktop operating system. (In fact, while searching for the reasons that phone manufacturers install these applications in the first place, I discovered that South Korea has recently issued guidelines forcing manufacturers to make almost all apps removable, which may do more for ordinary users than any amount of hacking.)
So all this may be a means to an end, even if it's an awkward one used only because better means aren't available. (This is actually the sense in which I most often use the word "hack" in describing a piece of engineering.) But what is the end? Understanding how technology works is a fine thing for engineers, and I'm sure no one would complain if others understood something of it as well. But people ultimately build technology to be used, not merely to be understood.
With this in mind, I can refine my concept of "critical computing" to be concerned primarily with the use of technology rather than its construction. Hacking of the APC sort isn't incompatible with this, and is perhaps even complementary. But I don't expect that there will ever be a day in which we all build our own hardware and software, any more than we build our own cars and bridges. We can nonetheless think about how we choose and use the technology that engineers make available to us: do we blindly pick up the latest product and join the latest web site, or do we think through what we want from our devices and how to best achieve it?
I've been reading Michael Pollan's The Omnivore's Dilemma (2006) this week, in which Pollan investigates the way in which food is produced in America. Arguing that much of this food is produced in brutal industrial settings that are good for neither farmers nor animals nor the people who eat them, he calls on his readers to take a deeper interest in the way food is produced, and to look for qualities beyond the lowest price. The fastest way to end factory farming, he suggests, might be to require that feedlots and slaughterhouses be built with transparent walls because no one would want to eat anything from a factory farm after seeing what goes on there.
This doesn't have much to do with computing, but I nonetheless saw some parallels with what I ended up calling "critical computing" around the beginning of the year. Just as Pollan calls for eaters to better understand the origins and qualities of the food they are eating, I called for computer users to better understand their relationship with the technology they use.
The obvious problem with all this, of course, is that we each have a limited amount of time and resources to apply to improving this understanding. Perhaps it's all well and good for me, an experienced software developer, to customise my computing devices to meet my exact needs, but what about someone who doesn't have a degree in computer engineering and twenty years' experience with the things? Thinking about Pollan's call for me to take a comparable interest in the food I eat put me in a better position to answer a question of this sort.
Obviously I must have some interest in the preparation of food to have picked up Pollan's book in the first place. I cook my own food and I've grown a few herbs in pots on my balcony, but I have no plans to take up farming or to slaughter my own meat. When I think about following Pollan's suggestion, I think: how on Earth am I going to find out so much about what I eat, let alone take action on it from the highly urbanised locale in which I live?
Pollan goes to quite some effort to procure the food that he does, far beyond what I think almost anyone would find practical on a day-to-day basis, and I'm sure he himself would be among the first to acknowledge that there's no immediate prospect for a food chain free of factory farms and other industrial baddies. But that doesn't mean that the whole exercise is hopeless: eaters can make a decision to choose factory-free food whenever it is available — even if it costs a bit more — and eaters can put in a modest effort to seek out food-conscious farmers instead of uncaring industrial food conglomerates. And with continual modest effort, perhaps farmers and eaters (and maybe even conglomerates) can improve the food production system over time.
A comparable pursuit of understanding of computing would probably look very different — computers can't be made other than in factories, for a start, and they have no "natural" lifestyle bequeathed to them by evolution — but perhaps it's reasonable to ask for a comparable level of effort and continual improvement.
I was recently without Internet access at home for a week, apparently due to flooding at my local telephone exchange. I've heard that some people get very upset at losing their connectivity even for periods much shorter than a week, most recently in a Conversation article from Michael Cowling claiming that "we are all connected, every minute of every day, and without your phone you are on the outskirts of everybody else’s new, more digital, world." The local newspaper also ran a suitably angry headline on a stand outside my local newsagent towards the end of the outage. (I didn't read the newspaper itself.)
Frustrating as the lack of connectivity might have been on occasions, I actually found myself enjoying the adventure of a daily trip to the local library or city mall, where I could check my e-mail using WiFi services provided by the local council. (I used to wonder what use public WiFi would be given that we all have Internet connections at home anyway, but now I know.) I was reminded of the days of dial-up modems, when connecting to the Internet was a minor treat, and I maintained a list of Internet-things-to-do to be serviced by dialling in for a couple of hours every day or two. The only really annoying thing, in fact, was that I fell behind in my Coursera studies due to an inability to download course videos over the public WiFi network. I was almost disappointed when the fault came to an end and the adventure was over (though I did catch up on my studies).
One might suppose that I'm quite a different person to the smartphone-driven folk that inhabit the world described in Cowling's article. I'm certainly older. On the other hand, I presume that the video that Cowling presents to support the quote at the beginning of this entry is staged — not even the youngest and most gadget-conscious of my acquaintances or students behaves anything like the folks shown in it, and I'm sure that most people would regard those folks' behaviour as anti-social and obnoxious.
I recently went on a camping trip during which I was told that a young camper fitting Cowling's description had, in the course of the camp, discovered that she could, in fact, enjoy time without her gadget. One can speculate that I've just had twenty years longer than her to find this out, not to mention first-hand experience of a time when everybody went without a mobile phone all the time.
Perhaps being without the Internet appeals to the same part of us that camping does. I don't suppose I'd want to be camping indefinitely, though maybe I could if I had to, given that I'm of the same species as ancestral humans who reached every scrap of land except Antarctica without motorised transport, electricity, or even agriculture. Similarly, my younger acquaintances can surely go without their phones for a bit, and might even enjoy it up to a point, given that all of us did just that only twenty years ago. We just need to remember that there's more to us than the fashion of the day.
Since writing my previous entry on positive computing, I've pondered how software might promote my well-being beyond its traditional promise to make things faster and easier. I've struggled. Perhaps I'm just not particularly creative when it comes to positive ideas, or maybe I'm not sufficiently well-versed in the theory of subjective well-being to know what might be helpful.
I've found myself thinking more about the consumer side of the question, which I left unanswered in my previous entry. Having made connections to some earlier complaints about lazy use of communication tools and e-mail, I realise that I've raised, but not answered, the question: how should we be using our computers?
On one hand, I've been critical of blind acceptance of trendy devices and services, and of lazy submission to user interfaces developed by misguided software designers. On the other hand, I don't think it's reasonable to expect every user to possess the deep technical understanding of computers required to control every detail of his or her experience. Even the most sophisticated users simply don't have the time to build every item of hardware and compose every item of software to meet their precise needs, even if they have the theoretical ability to do so.
The first approach that occurred to me would be to demand that we make our "best effort", that is, do as much as we can within the constraints of our time and technical ability, and always strive to improve. Whenever I'm particularly irritated by a feature that isn't meeting my needs, for example, I'll do a quick search for how to modify that feature. And when I've got more time, I'll invest that time in customising my computer to meet my needs.
The second approach that occurred to me was suggested by Richard Thaler and Cass Sunstein's book Nudge (2008), in which they discuss the development of "choice architectures" that encourage people to make good choices when unable to think the matter through carefully. The basic idea is to think carefully about the desired outcomes during the design phase, and design the system to make it easy to make the choices leading to those outcomes. Such thinking is (I hope) amongst the bread and butter of software designers, and Thaler and Sunstein specifically mention the example of e-mail clients that pop up warnings when the user asks to send an e-mail that contains the word "attach" but does not have any attachments. But software users can apply the same idea, as is occasionally suggested in advice columns like Cassie White's article on digital overload that happened to appear on the ABC's Health & Wellbeing site while I was working on this entry.
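For the curious, the attachment nudge that Thaler and Sunstein describe can be sketched in a few lines of Python. This is a hypothetical illustration, not the code of any real e-mail client; the function name and the particular word forms it checks are my own invention.

```python
import re

def needs_attachment_warning(body, attachments):
    """Return True if the message mentions attaching something
    but no file is actually attached.

    A sketch of the 'choice architecture' nudge: the check runs
    just before sending, prompting the user only in the case that
    suggests a mistake.
    """
    # Look for "attach", "attached", "attachment(s)", "attaching".
    mentions_attachment = re.search(
        r"\battach(ed|ment|ments|ing)?\b", body, re.IGNORECASE)
    return bool(mentions_attachment) and not attachments

# A draft that says "attached" but carries no files triggers the nudge.
draft = "Hi Sam, I've attached the minutes from Tuesday's meeting."
if needs_attachment_warning(draft, attachments=[]):
    print("Did you mean to attach a file?")
```

The design point is that the nudge costs nothing when the user has done the right thing (a file is attached, or no attachment was mentioned), and intervenes only at the moment a likely error is about to become irreversible.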
Lastly, I wonder if we need a concept of "critical consuming" analogous to the "critical reading" concept that teachers try to impart to students. In a computing context, we need more than the mere ability to move a mouse or touch a screen to get the goods on offer; we need also to think about which goods we want, why we want them, and whether or not they're really the best goods for our purposes.