There’s an old saying in business: if you don’t know who the sucker in the room is, it’s probably you. A similar adage applies to technology: if you don’t know how to control the systems you’re using, those systems are probably controlling you. As John Naughton argues in his special report for this week’s New Review, Britain is in danger of producing a generation of technological suckers: people who know how to word-process a letter, buy apps for their iPhones and search on Google, but have no understanding of the inner workings of these services.
This is, above all, an issue of education and training. For more than a decade, the teaching of information technology in schools has focused on using software rather than understanding systems, and on treating computers more like magical boxes than tools to be programmed and critiqued. With the government’s recent decision to throw out this old syllabus and replace it with something better fitted to 21st-century purposes, we have an opportunity to rectify a dangerous imbalance and set a new standard. It’s an opportunity we can ill afford to miss – and one that touches on some of the most fundamental questions about what role computer technologies can, and should, play in 21st-century life.
Understanding modern computing means far more than typing at a desktop machine or picking up mail on a smartphone. Whether we’re meeting friends, reading books, checking our bank balances or going shopping, computer systems increasingly mediate every aspect of our lives – and shape the ways in which we both see and are seen by the world. Opting out is no longer a serious option, while ignorance risks simply handing over control to those, from corporations to fellow citizens, who may not have our best interests at heart.
Read the complete article at The Guardian.