
Seven Reasons to be Critical of Technology

 

 


 

“If children are separated from their parents by hours of TV, from their playmates by videogames, and from their teachers by teaching machines, where are they supposed to learn to be human?” – Marian Kester

 

“The world has achieved brilliance without wisdom, power without conscience. Ours is a world of nuclear giants and ethical infants. If we continue to develop our technology without wisdom or prudence, our servant may prove to be our executioner.” – General Omar Bradley

 

THE ORIGINAL TITLE of this piece was Seven Reasons to Exercise Discernment and Reasoned Judgment with Regard to the Tools, Techniques, Systems, and Methods of Organization Human Beings Employ to Adapt to and Exert Control over Their Environment. Somehow, that idea did not survive the focus groups.

My aim in presenting the following points is not to argue “against” technology in general or even against specific technologies. My goal is to articulate why it is important to maintain a discerning, even skeptical, stance towards the myriad ways in which human beings seek to apply knowledge with the aim of shaping the world around them. While I often use the general term technology, this writing is primarily concerned with tools and systems that have been developed over the past few centuries. Technologies that utilize fossil fuels, nuclear power, electronics, or genetic engineering are of particular interest to me because collectively, they have profoundly altered the cultural and environmental landscape in such a short time.

There are three primary reasons why I thought it worthwhile to write this letter. First, technology represents an increasingly powerful, ubiquitous, and subtle force in shaping modern life and thus deserves the most serious and thoughtful consideration and discourse. It is almost impossible to overstate the impact of the clock, the automobile, the printing press, the atomic bomb, or the Internet on human culture and on the planet. New technologies can transform the way in which we live in the historical blink of an eye. Of course, it is impossible to foresee the ramifications of any given technology over time. Yet, it is crucial nonetheless to try to ascertain the general ways in which a technology may impact the earth and society, to determine who stands to benefit and who stands to suffer by its application.

Second, it is important to maintain a critical stance simply because new technologies have so many cheerleaders with deep pockets and broad influence. Proponents of a new technology, most notably large corporations and media conglomerates, will clearly accentuate the positive and downplay the negative. We need to look critically at technology because there is almost never a countervailing voice of equal volume to warn of potentially negative implications. The fact is, most technologies are developed and marketed with the goal of profit first and foremost. Consequently, those who have the most to gain from the introduction of a new technology typically have the most to lose from a rigorous and holistic critique of that technology.

Third, and most importantly, it is worthwhile to question some of our basic attitudes and assumptions towards technology. In Western culture, technology has long been linked to the idea of progress and has thus enjoyed something of a free pass in our public discourse. This helps explain why, even given technology’s pervasiveness and impact, our culture exhibits a curiously uncritical attitude towards it. There are precious few debates on whether a new technology is positive for society at large, how it should be used and regulated, or even whether it should be introduced at all. There are no votes, few public reports on what the long-term implications of a technology might be, and very few avenues to block the use or introduction of a new technology. Most of us are introduced to new technologies through billboards and TV commercials, long after the wheels have already been set in motion.

I think it is time to rescind the free pass we too often grant to new technologies. The long-term implications of a technology can be too profound to be considered in such a hurried, narrow-interest, and lackadaisical manner. History is riddled with technological saviors that were supposed to solve humanity’s challenges, only to create a host of unforeseen problems (DDT comes to mind). Instead of viewing technology as a potential savior, I think it should be viewed first and foremost as a Pandora’s Box. The more critically and democratically we consider technology, the more likely we are to end up with technologies that are low impact, that support a relative equalizing of the haves and the have-nots, and that contribute to the health of the whole society (as opposed to small segments of it).

I offer the following points as food for thought. I acknowledge that, at times, I level criticism or raise questions with what may seem overly broad brushstrokes, and clearly the following concerns do not apply to all technologies. What I attempt to articulate, though, are general trends, patterns, and tendencies I have observed. I also admit that I do not often mention the positive implications of technologies, which are manifold. As stated earlier, I believe most of us are quite familiar with the benefits technology can bring. We are less well versed in its liabilities, so that is where I will keep my focus.

 

I. Technology amplifies our power without amplifying our responsibility

Virtually all technologies are designed to amplify or leverage human power and impact. A hammer serves to amplify the force we exert on a nail. A telephone expands the reach of our voice. Written language increases our ability to save and transmit information across time and space. Yet, as our technologies make us increasingly powerful, the wisdom and restraint necessary to use those technologies responsibly has not kept pace.

In the digital age, we have seen the power and speed of our technologies increase at a dizzying pace. Twenty-five years ago, a home computer served as little more than a glorified typewriter. Today, an individual can run a business, hack into and cripple computer networks, reach millions of readers and viewers online, and purchase virtually any product in the world—all from a five hundred dollar laptop. Modern-day computers are more potent than their predecessors by orders of magnitude. Yet, are their users orders of magnitude more responsible than they were in 1984?

In many cases, there seems to be an inverse relationship between the impact of a technology and the responsibility it engenders in its user. Consider, for example, how advances in weapons technology increasingly distance the user from his target and can thus obscure the full ramifications of his actions. A swordsman must engage his enemy at close proximity and bears witness to pain, fear, and perhaps injury or death. A sniper sees his target, but only as a tiny, far-off figure. A bomber pilot never sees the devastation in his wake, just a pillar of smoke receding in the distance. Could Paul Tibbets, the pilot of the Enola Gay, have killed ninety thousand men, women, and children in Hiroshima with a sword or a rifle? It is hard to imagine that his conscience would have enabled him to do so. One can only conclude that the distance between agent and effect brought about by aviation, communication, and nuclear technologies made it possible for Paul Tibbets to carry out an order that, without those technologies, he would have deemed unthinkable.

Similarly, as communications and transportation technologies increase the reach of our purchasing power, our awareness and understanding of the connections to the people, ecosystems, and resources involved disappears in a shroud of distance and complexity. If I purchase a pair of Gap jeans online, for example, it takes a tremendous amount of research and imagination to piece together the ultimate impact of that click. With that single click, I may be in a small but very real way perpetuating sweatshops in Southeast Asia, environmentally devastating cotton production in California, and oil extraction in the Persian Gulf. Yet, I am shielded from these unpleasant realities by an interface of smiling models on a stylishly designed website.

Technology and marketing often conspire to keep consumers in the dark with regard to the true costs of the products that we can purchase with such convenience. If we were confronted with the horrific realities of the meat industry, for example, many of us would abstain from factory-farm meat and demand change in practices and legislation. However, the cruelty of “finishing lots” and slaughterhouses is completely hidden from view when we buy neatly pre-packaged ground beef at the supermarket.

I am not implying that technologies somehow make human beings capable of cruel or destructive behavior. Rather, I am suggesting that the very power, speed, and reach we so value in our technologies can also make it almost impossible for us to truly understand the totality of the impact we have in using them. This fosters a degree of ignorance that makes it very difficult to act in a fully ethical, responsible way. Going back to one of the themes of my writing on the underpinnings of health, the distance between agent and effect created by certain technologies makes us less able to receive and respond to feedback. Sweatshops are clearly unjust, inhumane, and unsustainable, but the billions of us who help perpetuate their existence through our purchasing are not fully aware of, or acutely subjected to, the negative feedback that should ideally serve as a corrective signal to our society to change its ways.

All of this raises an interesting question: Is there a limit to the technological power humanity can responsibly entrust itself with, a limit set by our collective responsibility and maturity? If so, what is that limit and how can it be enforced? My personal opinion is that it is beyond human capacity to responsibly bear nuclear weapons or utilize nuclear power. Given our current climate predicament, I would even question whether we are responsible enough to harness the power of fossil fuels. The question for the survival of our species may not be how much power we can obtain, but rather how quickly we can develop the maturity to use the power we have responsibly.

 

II. Technology “outsources” human skill and capacity

In high school, I was amazed to learn that the traveling bards in Homer’s day would memorize the entirety of epic tales like The Odyssey. At that very time, I was struggling to memorize a one-page speech for my oral communications class. I learned that people in pre-literate societies typically have phenomenal memories. The advent of writing, while increasing the collective memory of our species, has also served as a sort of “crutch” for our individual memories. As Emerson once wrote, “the civilized man has built a coach, but has lost the use of his feet.”

I encountered a startling example of how thoroughly human skill and ingenuity can be outsourced to technology on a recent visit to the farm where my father grew up in Odell, Illinois. I was talking with Jason Eggenberger, our friend who now farms my family’s old homestead. Jason showed me an enormous new John Deere tractor that is used to apply fertilizers, minerals, and pesticides in precise amounts through GPS technology. “Pretty soon these things will drive themselves,” I commented. Jason informed me that they already do. A friend of his had just bought a GPS-guided tractor that only required a human being to turn it on and drive it to the correct starting point. I asked Jason what his friend did once the tractor was off and running. He sits in the cab and watches baseball on TV. My friend and manager of UC Santa Cruz’s Alan Chadwick Garden, Orin Martin, is fond of quoting the Chinese proverb, “The best fertilizer is the footsteps of the farmer.” What happens when the farmer no longer sets foot on his own land?

There are countless other examples of how our skills and capacities can atrophy when we rely too heavily on technology. Computer spell-check features have eroded our ability to spell; calculators, our ability to do simple math in our heads or by hand; and GPS systems, our ability to find our way around town without an iPhone. We have outsourced music to the recording industry, storytelling to television, and basic living skills to microwaves and automated sprinklers. The implications of outsourcing to technology become truly scary, as opposed to just deeply lamentable, when we have completely ceded responsibility and control over life and death to machines. Writing during the closing days of the Cold War in his powerful book In the Absence of the Sacred, author Jerry Mander warns of a world in which control over our most devastating weapons is effectively outsourced to computers:

 

In recognizing the difficulty of human decision-making in modern warfare, we hear talk of “launch on warning” (launching missiles instantly at the first computer warning) as a viable policy. The technical capacity is already in place for people to be dropped out of the decision loop, leaving us with automatic warfare: our computer program versus theirs. So what is called nuclear war is not that at all; it is microelectronic war, software war. And the arms race has become a battle of computer programmers seeking to gain an edge in a war that, when fought, will happen automatically with no people involved—until the hardware starts dropping on them.

 

As is often the case, science fiction provides us with the most stark and haunting images of a fully outsourced world. In the movie The Matrix, the masses of humanity spend their entire lives in a vegetative state, an utterly convincing virtual reality piped into their brains so that each remains ignorant of his powerlessness. The world itself has become a nearly lifeless place, overrun with self-reproducing machines. As we spend more of our time in front of television and computer screens, as desertification swallows up the earth’s surface, as the Japanese begin to utilize robotic nurses to care for the elderly, and as drone planes drop bombs on villages in Pakistan, the disturbing picture painted by The Matrix becomes an increasingly relevant cautionary tale.

 

III. Our children’s children’s children inherit the legacy of our technologies

Let us look at three technological breakthroughs of the nineteenth and twentieth centuries that human beings will be trying to figure out how to deal with for a long, long time: fossil fuels, plastics, and atomic energy. Human dependence on the burning of fossil fuels has brought us climate change and the prospect of an impending ecological and humanitarian crisis on an unprecedented scale. Plastics, as Alan Weisman reminds us in his sobering book The World Without Us, essentially never go away; they just break down into smaller and smaller pieces and find their way into just about everything. National Geographic estimates that ingested plastics account for the deaths of over one million birds and one hundred thousand mammals on a yearly basis, and a soupy mix of broken-down plastics the size of Texas has accumulated in the Pacific Ocean. Nuclear weapons and nuclear waste, the dangers of which need no elucidation here, must be guarded with the utmost vigilance for thousands of years if we are to avoid catastrophe. Even so, it is nearly impossible to imagine a future in which mankind lives without some degree of nuclear anxiety. Every subsequent generation of human beings will be saddled with the unintended negative consequences of these technologies, a fact that must be figured into the accounting when we ask ourselves, What sort of world are we bequeathing to our children?

Of course, the first people to burn fossil fuels would not have imagined that humans would one day heat up the planet through technologies that harness that power. Alexander Parkes, the inventor of the first plastic, certainly did not picture his creation causing cancers, filling oceans, and clogging the digestive tracts of marine animals. The scientists who worked on the Manhattan Project to develop the first atomic bomb, many of whom held pacifist beliefs, saw their work as an essential effort against Nazi aggression. If people had known the severity and persistence of the downsides of these particular Faustian bargains, would these technologies have been as rigorously pursued and adopted? If so, what sort of restrictions and safeguards might have been devised to mitigate their more negative effects?

Modern society produces and adopts new technologies at breakneck speed. Yet, how much time and effort has been put into thinking about how these technologies may adversely affect future generations? If a technology truly stands to bring widespread benefit to humanity with minimal negative effects, then surely there is no harm in taking the time to thoroughly consider its potential cultural and environmental impacts. What is an extra five years of deliberation when compared to the potential of imperiling humans and the environment for generations to come?

I write this letter at a time when the human genome has recently been mapped, research on stem cells has received a federal green light, and nanotechnologies are on the cutting edge of science. Of course, there are tremendous potential benefits from this scientific frontier… but the perils of the genetic age are huge. Who can possibly guarantee a safeguard against the misuses of genetic engineering? The future misuse and proliferation of genetic technology, I am afraid, is as predictable and as dangerous as that of nuclear technology. New technologies have so thoroughly validated the law of unforeseen consequences over time that we need to actually start taking that piece of conventional wisdom seriously.

 

 

 

IV. Some technologies are potentially addictive

Around 2003, I started hearing about “Keitai Syndrome”. At the time, I was working for the non-profit organization Volunteers in Asia and was traveling to Japan two or three times per year. Keitai is the Japanese word for cell phone. Young, urban Japanese have always been at the vanguard of cell phone use. An article in the Japan Times newspaper related a study showing that young Japanese displayed an “extreme attachment” to their keitai, felt anxious without their phones, and often thought about their phones even while they were talking with other people face-to-face. Only a year or two later, I was reading essentially the same article in an American newspaper, this time about “Crackberry Syndrome”, a reference to the addictive nature of Blackberry devices.

I do not claim to understand why some technologies call to our addictive tendencies, but it is fairly clear to me that addiction, characterized by obsessive thinking, dependency, and compulsive behavior, is the appropriate term. I displayed an addiction to video games when I was younger, and I have definitely felt addicted to email at certain points of my life. When one reads in a recent Nielsen survey that the average American now watches nearly five hours of television per day, that millions of people spend over twenty hours per week playing online videogames such as Second Life, and that approximately six percent of Internet users might be considered addicts, it becomes clear that we are dealing with a technology addiction epidemic.

It is significant and important to acknowledge that certain technologies seem to be inherently addictive. If someone is truly addicted to online videogames, cell phone use, or television, then it becomes important to understand the psychology of addiction in order to help that person develop a more balanced relationship with that technology. If an entire culture finds large numbers of its population in the grip of any sort of addiction, then serious thought should be given to how to address the issue on a large scale.

 

V. Most of humanity’s problems have non-technological solutions

Many technologies enter into our lives heralded as solutions to human problems and shortcomings. The agricultural Green Revolution of chemical fertilizers and pesticides of the mid-twentieth century, for example, was presented as a crusade against global hunger. Television was first marketed as an educational tool that would contribute to the erudition and uplift of humanity. Countless green technologies are currently flooding the marketplace today, all purporting to help save the planet. All of these technologies ultimately fall short of their stated goals for one primary reason: no technology, by and large, can address the roots of human problems.

The innumerable technologies that have thus far been developed by humankind are more than sufficient to provide the essentials of food, shelter, clothing, and security to every human being. It is widely acknowledged, however, that after these basic conditions have been met, human happiness and material wealth do not correlate. Lack of technology, therefore, is not the problem. More technology is not the primary solution. Poverty, war, starvation, and environmental degradation are rooted in failures in human perception, thought, communication, and behavior. Poverty and starvation are rarely the result of lack of resources; they represent failures to distribute resources equitably. Environmental degradation is not due to a lack of clean energy technologies or eco-friendly products; it is a failure to live more simply, intelligently, and conscientiously within ecological parameters.

I am not suggesting that technological innovation has no role to play in addressing an issue like climate change, for example. We certainly need to continue developing alternatives to fossil fuels and more energy efficient products to lessen our collective carbon footprint. I believe the main thrust of our efforts should be elsewhere, however. Addressing issues like climate change and poverty can and should be decidedly low-tech, with an emphasis on education, public policy, diplomacy, community building, localization efforts, spiritual practice, ecological literacy, and so forth. All of these endeavors serve to proactively mitigate social and environmental problems by cultivating healthy, critically thinking individuals and communities. In my experience, such people and communities are inclined to consume less, are more adept at conflict resolution, distribute their resources more equitably, and develop a more harmonious relationship with the land they inhabit.

There are several other reasons why I do not think we should look primarily to technology to extricate ourselves from environmental and social problems. First, any technological tool—even a “green” technology—consumes a certain amount of energy and natural resources. We should always seek to address problems in ways that consume as few resources as possible, preferably zero. Second, the belief that technology will somehow save the day can produce a dangerous kind of complacency. If one believes that the solution to climate change lies in technological breakthroughs, those of us who are not in the technological breakthrough business are seemingly left with little to do but twiddle our thumbs and wait for either salvation or catastrophe.

Third, the introduction of a new technology into a culture is like introducing a non-native species into an ecosystem. There is simply no way of knowing how it will ultimately influence the dynamics of the culture. For example, One Laptop Per Child is an effort spearheaded by the Massachusetts Institute of Technology "to provide educational opportunities for the world's poorest children by providing each child with a rugged, low-cost, low-power, connected laptop with content and software designed for collaborative, joyful, self-empowered learning." It may seem ridiculous and futile in this global age to question whether or not this is a good idea, but One Laptop Per Child brings many troubling questions to my mind: How will this impact traditional education and culture? How will exposure to sophisticated and omnipresent online advertisements influence these children’s values and interests? Who will develop the “content and software” and what biases will they contain? Will use of laptops contribute to increased consumerism and therefore environmental exploitation? Will laptop use result in a cultural rift between computer-literate youth and their elders?

There is a deeply held and rarely examined assumption behind technological solutions in general and One Laptop Per Child in particular, and that is that progress and technology are always wedded. It is taken for granted that the world’s poorest children have more to learn from the technological marvels of the developed world than the developed world has to learn from the diverse ways of life maintained by more traditional societies.

Introducing laptops en masse into cultures that have never had access to computers is a well-intentioned but brazen attempt to bring these cultures into the global fold, to make them a bit more “like us.” As it becomes ever more evident that technologically oriented modern society is sending us careening towards environmental collapse, I am not convinced that we should so actively and hastily be exporting our technologies, which carry with them, like a Trojan horse, our way of life.

 

VI. Technology is rarely developed primarily to serve the public good, and it benefits some more than others

It is always important to ask: Who developed and marketed this technology? Who is intended to benefit? The simple fact is that most technologies that become mainstream are developed or funded either by corporations, which explicitly exist to make profit, or by the military, which exists to make war. The primary intended beneficiaries of most technologies, therefore, are governments, corporate executives, and shareholders. The extent to which the rest of us benefit is of decidedly secondary consideration, as are potentially negative societal and environmental effects.

Most research and development work is quite expensive. It is not surprising, therefore, that a huge amount of scientific funding comes from deep-pocketed corporations and government departments, such as the Department of Defense.

While collusion between the military, industry, and academia is nothing new, it is disturbing to note how thoroughly the military and corporations now bankroll major research institutions. British Petroleum recently lavished five hundred million dollars on the University of California at Berkeley to establish a biofuels institute. One of Monsanto Corporation’s many academic investments was ten million dollars to establish a scholarship “to identify and support young scientists interested in improving research and production in rice and wheat” at Texas A&M University.

An interview with a young Stanford engineering graduate student that I heard several years ago on the radio illustrates how some of our finest research minds are lending their efforts to ventures they may not totally support (or even know much about). The young man was part of the winning team of the Urban Challenge, a robotic car race sponsored by the Defense Advanced Research Projects Agency, or DARPA. The Department of Defense entices students from top universities with research funding to compete in the challenge to further develop robotic weapons technology. The young student, still aglow after his victory, was tongue-tied when asked by the interviewer how he felt about the fact that his team’s designs would be utilized to make advanced weaponry. From his perspective, the important thing was that his Stanford team now had two million dollars to continue their research on robotics technologies. The implications of what the Defense Department did with his team’s design seemed not to have occurred to him.

We should be concerned that the cauldron in which so many technologies develop is so heavily influenced by entities concerned primarily with relatively narrow interests of profit-making and national security (and we should be more concerned that these two interests are increasingly linked). Since technology often has a more profound impact on society and the environment than just about any other force, it does not seem wise that its development should rest so heavily in the hands of the few and the powerful.

 

 

VII. Technology is often a vehicle for consumer culture

While living in Costa Rica in 2002-03 and during visits to Indonesia and Sri Lanka in 2005 and 2006, I became increasingly concerned about how modern technologies like televisions, computers, and cell phones were serving as a kind of gateway drug for Western consumer culture. This concerns me not just from the standpoint of maintaining cultural and linguistic diversity, but also because of the increased environmental exploitation that inevitably comes with consumerism.

Consider the fact that the average American sees twenty-one thousand television commercials per year, and imagine the effect of such a deluge of advertising on Dayak villagers living on the island of Borneo in traditional communal longhouses. Imagine how Sinhalese farmers in a village with no roads perceive the idealized images they view on Baywatch and Friends. In the forward march of Western culture, technology serves as the advance guard.

In the late 1980s, Jerry Mander was invited by a group of Dene Indians of Canada to discuss with them the effects the recent introduction of television was having on their communities. He quotes Cindy Gilday, a member of the Native Women’s Association, in In the Absence of the Sacred:

 

"The effect has been to glamorize behaviors and values that are poisonous to life up here. Our traditions have a lot to do with survival. Cooperation, sharing, and nonmaterialism are the only ways that people can live here. TV always seems to present values opposite to those."

 

Ernie Lennie, a Dene educator, added:

 

"Children learn directly from their parents. That is the native way of teaching. Learning has to come from doing, not intellectualizing. A long time ago they only taught people by doing things, but now they just sit and watch TV. Taking away TV is like taking away a bottle of alcohol."

 

It is often argued that people in traditional societies want TVs, computers, and cell phones, and that people in developed nations have no right to preach to them what technologies they should and should not use. I agree with this, but it is also true that the benefits of technologies are aggressively peddled while the downsides are rarely communicated. A Dayak villager might think twice about purchasing a TV if the salesman told him, “Yes, I will sell you this television. But you should know that it might make you desirous of a fictitious lifestyle you will never be able to afford, eradicate your long tradition of storytelling, make your people feel embarrassed about their culture, create distance between the old and the young, and generally contribute to the crumbling of your current way of life.”

Moving towards a sustainable future requires learning from the time-honored traditions and low-impact technologies of all the world’s peoples. Modern technologies have tended to contribute to the undermining of cultural and linguistic diversity and to support an irresponsible and wasteful “mono-culture” based on consumption. To give ourselves the best chance of planetary survival, pragmatism suggests we do whatever is necessary to support the integrity of traditional cultures that remain, and humbly learn what we can from them about living lightly on the land.

 

WHETHER OR NOT you find any of the above points compelling, my main hope is that this writing will support a habit of questioning technologies and our relationship with them. Extra effort needs to be taken to understand the ramifications of any given technology because ours is such a technologically oriented culture. Humans will always devise and utilize tools and systems to help us go about life with some order and efficiency. However, if technology is to remain an asset instead of a liability, then we must heed the words of the late British economist E.F. Schumacher:

 

"Wisdom demands a new orientation of science and technology towards the organic, the gentle, the non-violent, the elegant, and the beautiful."
