From iron tools to iPads, technology changes the way we think and interact. And so it has always been.
In an early 1970s computer-science class, I learned to organize punch cards to run basic programs on an IBM computer so big and hot that it had its own supercooled building. The computer lab was one of the best places to hang out on 100-degree F. afternoons in Austin, Texas.
When Fortran and COBOL were properly coded, the behemoth’s results emerged in minutes. Although laughably primitive by today’s standards, in 1972 the IBM mainframe felt like the future. Twenty-five years earlier, cybervisionary Alan Turing had pined for such a machine: “[B]eing digital should be of more interest than … being electronic,” he said at a time when a “computer” was more commonly understood to be a gifted human with a pencil, paper, and instructions about what to calculate.
You know where this is going: Twenty-five years from now, iPads, BlackBerrys, and Android phones will be as quaint as the IBM mainframe or the defunct Macintosh 128K I found at the town dump last year. Moviemakers will use them as props to create a retro 2010 look (“It was hilarious, Martha! He was using a ‘touch screen’ computer and a ‘virtual keyboard.’ Wha?”).
Our species has always devised new tools and lamented the loss of old ways. Storytelling around a fire yielded to the written word. Laboriously illuminated manuscripts were displaced by movable type. Then, of course, came the computer. The sort of unit that newscaster Walter Cronkite (above) visited in the early 1950s revolutionized everything from vote tallying to star counting to the dissemination of news.
We layer invention atop invention, celebrate our progress, and cope with the consequences. But behind each new technological revolution is the nagging question: Have we gone too far?
On the street, in the elevator, at the coffee shop, everyone is staring into a smart phone. Surely this has altered our way of thinking and interacting with others, as a number of observers are warning. What we don’t know is whether that is bad or good.
Our tools get better. They also get worse. In the dark world of warfare, each new tool is a more effective way to harm people. The A-bomb could be the point of no return. On the streets, our gas-guzzling culture allows us unprecedented mobility but not without extracting a high cost in air quality, auto fatalities, and the sort of catastrophic pollution that has poured out of the runaway borehole in the Gulf. We can instantly tap into more knowledge than any humans in history.
That is wonderful, but it also changes us. Memory is not as important if you can always do a quick Google search. Are the people in the room with you more important than those who have just texted you?
Theodore Kaczynski, the Unabomber, thought technology’s price was so high that he resorted to violence to try to stop it. He was acting out a common trope in literature: the collapse of our technology and a return to old ways.
On the charming side is the island treehouse world of the Swiss Family Robinson. Who wouldn’t want to live there? On the harrowing side is the post-apocalyptic world of Cormac McCarthy’s “The Road.” When the power grid overloads, fiction becomes fact for a short time. We vow to buy more flashlights – and promptly forget where we put them when the lights flicker on and the air conditioner chugs back to life.
Tools are extensions of our bodies. They let us leverage our physical and mental capabilities. They undergird civilization. But they aren’t gifts outright. Each has a cost. What we buy may make life more efficient or interesting, but we give something up, too. It could be our native ability to spell or calculate instead of typing the problem into a search engine. It could be the quiet time we used to spend with the morning paper instead of scanning the Web. It could be the conversation we used to have across the backyard fence instead of on Facebook.
Alan Turing was right. Being digital is incredibly interesting. It’s your call as to whether it is also better.
John Yemma is editor of The Christian Science Monitor.