A recent article in IEEE Spectrum Tech Talk claims that regulation will lag developments in self-driving cars. This is a familiar theme amongst pundits of technology of all kinds, but why would anyone expect regulation to be ahead of technology? What kind of lawmaker would bother to write laws about technology that doesn't yet exist, let alone presume to have the 20/20 foresight required to make sensible ones? And what technologist would applaud lawmakers for making law without first developing an understanding of the technology involved?
I also happened to be reading Jack Goldsmith and Tim Wu's Who Controls the Internet? (2006) this week. A section of Chapter 7 portrays the history of copyright law and attendant media industries as a series of equilibria punctuated by new technologies. The arrival of a new technology heralds a confrontation between established players and new players, frequently loud and ill-tempered. But eventually everyone settles into a new equilibrium that allows life to go on. They conclude the section with an argument that the court cases surrounding Grokster and the like did not come about because the technology had over-run the government's ability to control it, but because the government was simply taking its time to determine the best way forward, just as it had for technologies like vinyl records, radio, cable television and video recorders.
Put that way, having regulation lag technology sounds eminently sensible. Goldsmith and Wu themselves refer to it as "business as usual". Jonathan Zittrain, writing on related issues in The Future of the Internet and How to Stop It (2008), proposes that lawmakers regulate Internet technologies using more or less this strategy: spend time watching how the technology plays out, then act if some harm becomes evident. And the time spent working out what to do about video recorders, for example, now seems pretty minor compared to the three decades we've spent enjoying rental videos since.
Technologists, I suppose, might like to think that they already know all about the technology and are therefore in a position to set appropriate rules for their inventions right away — if, indeed, they feel the need for any rules at all. I suppose similar thinking underlies the calls for "self-regulation" that feature in much industry input into public policy.
Technologists may well know the most about the technology, and would surely be high on any capable lawmaker's list of people to speak to in drafting legislation. But technologists have some fairly obvious conflicts of interest in developing rules for technology that might make them wealthy or powerful, and even the most disinterested technologist is as subject to the law of unintended consequences as anyone else. As Zittrain suggests, then, perhaps the humble and wise technologist ought to embrace the lag of legislation behind technology rather than expressing constant amazement at the hopeless laggards in parliament, who might just be taking the same care in their job as we do in ours.
After getting into a discussion about the degree to which Internet service providers can or should contribute to the enforcement of copyright law, I did some reading on what means might be available to them for making such a contribution. In doing so, I discovered that deep packet inspection technology allows Internet service providers to monitor and manipulate the data that flows over their networks, and that they already do so. This seems completely at odds with the protestations of Internet advocates like Suelette Dreyfus that filtering network traffic is onerous and expensive: it turns out that network providers find it perfectly practical and affordable when they have something to gain from it. Any doubt that network providers are willing and able to monitor and manipulate traffic might be dispelled by the recent hullabaloo concerning net neutrality, which would surely be a non-issue if network providers really did find content filtering uninteresting or uneconomic.
Of course network providers aren't interested in the same things about packets as the media industry or Government censors, and implementing filters for these things presumably incurs some cost over and above what network providers have already done for their own purposes. Nor does the technical feasibility of filtering necessarily imply that it's the best way to address the issues involved. Still, allegations of technical impossibility or economic infeasibility might not be sufficient reason to oppose filters, and might be particularly disingenuous if they come from network providers who'd like to do some filtering of their own.
Quite a few legal scholars, such as Landes and Lichtman, argue that some sort of filtering might indeed be the most cost-effective method of enforcing copyright law, since it is much easier to police the relatively small number of network providers than it is to police the extremely large number of network users. If those scholars are correct, the proposals of Dreyfus and others like her would have us pay less for Internet access in return for paying even more for the things that we download from it.
Of course Internet advocates don't see it this way. Assuming that they accept that copyright law ought to be enforced at all, they see it as a question of who has the moral responsibility for enforcing it. The logical conclusion of this approach seems to me to be for the media industry to either implement digital rights management technology, or to sue individual infringers, or a combination of the two. Yet I'm hard-pressed to think of an Internet advocate applauding the media industry's past efforts in these directions.
Since writing my previous entry on positive computing, I've pondered how software might promote my well-being beyond its traditional promise to make things faster and easier. I've struggled. Perhaps I'm just not particularly creative when it comes to positive ideas, or maybe I'm not sufficiently well-versed in the theory of subjective well-being to know what might be helpful.
I've found myself thinking more about the consumer side of the question, which I left unanswered in my previous entry. Having made connections to some earlier complaints about lazy use of communication tools and e-mail, I realise that I've been skirting a more fundamental question: how should we be using our computers?
On one hand, I've been critical of blind acceptance of trendy devices and services, and of lazy submission to user interfaces developed by misguided software designers. On the other hand, I don't think it's reasonable to expect every user to possess the deep technical understanding of computers required to control every detail of his or her experience. Even the most sophisticated users simply don't have the time to build every item of hardware and compose every item of software to meet their precise needs, however theoretically capable they may be of doing so.
The first approach that occurred to me would be to demand that we make our "best effort", that is, do as much as we can within the constraints of our time and technical ability, and always strive to improve. Whenever I'm particularly irritated by a feature that isn't meeting my needs, for example, I'll do a quick search for how to modify that feature. And when I've got more time, I'll invest that time in customising my computer to meet my needs.
The second approach that occurred to me was suggested by Richard Thaler and Cass Sunstein's book Nudge (2008), in which they discuss the development of "choice architectures" that encourage people to make good choices when unable to think the matter through carefully. The basic idea is to think carefully about the desired outcomes during the design phase, and design the system to make it easy to make the choices leading to those outcomes. Such thinking is (I hope) amongst the bread and butter of software designers, and Thaler and Sunstein specifically mention the example of e-mail clients that pop up warnings when the user asks to send an e-mail that contains the word "attach" but does not have any attachments. But software users can apply the same idea, as is occasionally suggested in advice columns like Cassie White's article on digital overload that happened to appear on the ABC's Health & Wellbeing site while I was working on this entry.
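The attachment warning that Thaler and Sunstein describe can be sketched in a few lines. This is only an illustrative sketch of my own, not the implementation of any particular e-mail client; the function name and keyword list are assumptions.

```python
def mentions_missing_attachment(subject: str, body: str, attachments: list) -> bool:
    """Return True if the message talks about an attachment but has none.

    An illustrative sketch of the 'choice architecture' nudge described
    above: the client checks outgoing mail for attachment-related words
    and warns the user before sending if nothing is actually attached.
    The keyword list here is a hypothetical choice.
    """
    keywords = ("attach", "attachment", "enclosed")
    text = (subject + " " + body).lower()
    return any(word in text for word in keywords) and not attachments


# A client would call this when the user hits "send" and, if it returns
# True, pop up a "Did you mean to attach a file?" dialog.
```

The design choice worth noting is that the check doesn't block the user; it merely makes the likely mistake visible at the moment of decision, which is exactly what Thaler and Sunstein mean by a nudge.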
Lastly, I wonder if we need a concept of "critical consuming" analogous to the "critical reading" concept that teachers try to impart to students. In a computing context, we need more than the mere ability to move a mouse or touch a screen to get the goods on offer; we need also to think about which goods we want, why we want them, and whether or not they're really the best goods for our purposes.
Shortly after writing my entry on the joys of engineering and the banality of products, I found that Rafael A. Calvo and Dorian Peters had addressed much the same issue in the Winter 2013 issue of IEEE Technology and Society (pp. 19-21). In fact, they say they're soon to publish a whole book on the subject, to be called Positive Computing. I've added it to my reading list.
In the meantime, the book's title comes from a 2011 position paper written by Tomas Sander, who looks at the role of information technology in pursuing the "positive psychology" proposed by Martin Seligman. The basic idea is to create computer applications that promote what psychologists and economists call "subjective well-being", rather than applications that merely allow us to do things faster. Tibor Scitovsky might have had exactly this in mind if he were writing The Joyless Economy today.
I'm sure that plenty of applications already exist that promote well-being in one way or another. Calvo and Peters specifically mention SuperBetter, bLife and the Mindfulness App, which seem to implement ideas from the positive psychology school. The promises made by these applications might be a little saccharine for my tastes, and I have certain misgivings about aspects of Seligman's ideas, but I think there's reason to believe that great games, for example, can provide meaningful and satisfying challenges.
On the other hand, I'm sure that there is plenty of software out there that promises meaningful and satisfying experiences, but ultimately provides only superficial simulacra of such experiences. The development and use of such software might be driven, in part, by a wish for fast and easy access to desirable experiences.
Whatever the motives of producers in creating the products that they do, Scitovsky calls for consumers to become more sophisticated in their choice and use of products. In a computing context, for example, word processing software may make it fast and easy to edit and format documents, but it's still up to writers to strive for meaningful words, and up to designers to strive for attractive pages. If they don't, word processors are just a fast and easy way to produce unsatisfying junk. I've previously made similar comments about communication technology that I can now interpret as a need to be more sophisticated about the communication tools that we use.
Seeing that Scitovsky and others were writing about these notions back in the 1970s, I wonder why we still appear to be prioritising fast and easy over meaningful and satisfying. I suppose that Scitovsky's critics might argue that history has shown him wrong, and that the majority of people really do value fast and easy products over what a few elitists think are more worthy pursuits, Maslow's hierarchy of needs be damned. But when I see the degree to which Australians appear to have convinced themselves that we're "doing it tough" despite enjoying one of the highest levels of material wealth that has ever existed in the world, I suspect that the critics and their followers might just have chosen to pursue the fast and easy path because it is itself the fast and easy choice.