Jason Lodge recently asked on The Conversation: is technology making us stupid?
Of course this depends somewhat on what one considers to be "stupid". As Sue Ieraci's comment observes, "every generation appears to value its own ways of knowing and relating above those of the generations above and below." Lodge's article starts by asking whether rote learning has been displaced by ready access to sources of information such as Google. If so, we might be becoming "stupid" insofar as intelligence is measured by an ability to remember facts.
I, and probably Jason Lodge also, would be surprised if anyone still considered rote learning to be the pinnacle of "intelligence". Well before the World Wide Web even existed, there was far more information in the world than any one person could be expected to remember, and how many teachers these days would consider their students to be "intelligent" merely for copying something into an essay or computer program? Modern educators therefore prize skills like knowing how to find information, determining whether or not it is reliable, and synthesising it into a coherent response to a question.
I think that being able to recall a certain breadth of factual information is nonetheless useful: imagine that you had to resort to a dictionary to look up the spelling and meaning of every noun you came across! And imagine what a teacher I would be if I had to consult the textbook every time a student asked a question! I suppose that knowing what needs to be remembered, and what can be left for looking up, is a skill of its own. A Java programmer who can remember the difference between "int", "double" and "String" is surely going to be far more productive than one who can't, for example, but it's probably safe for the same programmer to know that he or she can look up the documentation should he or she ever need to parse hexadecimal numbers using the java.util.Scanner class.
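To illustrate the kind of detail that's safe to leave for looking up: Scanner can read integers in an arbitrary radix, but the exact method names are the sort of thing one checks in the documentation rather than memorises. A minimal sketch (the input string here is just an arbitrary example):

```java
import java.util.Scanner;

public class HexDemo {
    public static void main(String[] args) {
        // Scanner's nextInt(int radix) parses integers in the given base;
        // radix 16 reads hexadecimal tokens such as "ff".
        Scanner sc = new Scanner("ff 1a2b");
        while (sc.hasNextInt(16)) {
            System.out.println(sc.nextInt(16)); // 255, then 6699
        }
        sc.close();
    }
}
```

Knowing that such a facility exists is the memorable part; the radix-taking overloads are exactly the detail one looks up when the need arises.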
When giving advice about presentations to my research students, I often advise them that they ought to be able to talk knowledgeably about their subject without having to look everything up as they go. The title of the article aside, I guess Lodge is really asking whether or not technology has made us complacent about what constitutes "knowledgeable". Has ready access to search engines and the like, he asks, made us imagine we are experts in subjects that we can't actually talk about except insofar as we can look them up?
I've recently read a couple of smug comments from technology enthusiasts lambasting what they perceive as Luddism from sceptics of some recent technological adventure. One comment on The Conversation equated doubters of massive on-line open courses with a newspaper executive insisting that people would never want to read classifieds anywhere other than a printed newspaper. And, having worked in copyright protection for many years, I'm well-acquainted with the hacker triumphalism that follows the breaking of some rights management scheme.
The technology enthusiasts involved are, of course, cherry-picking the failed predictions of their opponents (or, in the case of the comment cited above, describing a caricature that probably doesn't represent the opinion held by any actual person). The Register, for example, recently described ten technology fails illustrating that technology enthusiasts can be just as mistaken in their views of the future as anyone else.
I guess I'm pre-disposed to doubt apocalyptic predictions like the notions that universities will be replaced by massive on-line open courses or that iPhones have already brought about a revolution. Aside from the fact that I'm yet to experience any such apocalypse despite the numerous technological changes that have occurred over the course of my life, extreme predictions of this sort are inevitably simplistic.
For one, the world is vastly bigger than any single technology or product, and the changes brought about by any one are always going to be tempered by numerous other influences. So the iPhone is a very successful mobile computing product: but what did it do for refrigeration, power generation or surgery?
For another, existing institutions don't just sit back and wait for their demise when a new technology comes along, even if technology enthusiasts would rather that they did. The music and newspaper industries, for example, may have struggled to re-organise their businesses around electronic media, but they never just packed up and walked away, and they continue to try things even now. And far from planning to either shut down their universities in the face of massive on-line open courses, or pretend that such things don't exist, vice-chancellors Ed Byrne and Margaret Gardner offer some more measured thoughts about how existing universities might work with on-line courses.
Publishing executives, vice-chancellors and others in their position probably won't be correct in every detail — but could they be as wrong as a prediction that technology X will overwhelm everything?