
Technological Complexity and Ethical Control

Vincent di Norcia


"The western world has argued passionately about technology, whether it's good or bad for us, ...even while inventing it at a furious and accelerating rate."
- Richard Rhodes, Visions of Technology [1]

ETHICS, CONTROL, AND TECHNICAL SYSTEMS


Richard Rhodes' words touch a nerve, for the pace and complexity of technological change pose daunting challenges for our moral intelligence. The technical, social, and moral complexities of technical systems have profoundly affected their ethical control. Control is not restricted to the use or production of technical systems, but applies to a technology's full life cycle, from invention, design, and development through manufacture and use to final disposal. Ethical control means that the ends and means involved in controlling a system are guided by core ethical values such as care for life (human and natural), socio-economic welfare, and communication [2]. Notwithstanding their complexity, most of the ethical problems that controlling technical systems pose should be solvable, assuming adequate resources are available [3]. However, technologies are so diverse that we should first clarify the concept of a technical system, e.g., by dividing it into four types based on degree of complexity and scale: tools, technologies, technology networks, and technology fields. As we move from relatively simple tools and technologies to more complex technology networks and fields, we are involved, I will suggest, in an increasingly difficult and uncertain social experiment [4]. Solving the technically, socially, and morally complex problems that technical systems pose therefore demands an informed and resourceful moral intelligence. To understand what this involves, let us begin with the simplest case, tools; more specifically, the knife.

The author is Professor of Philosophy at the University of Sudbury, Sudbury, Ontario, Canada P3E 2C6. He can be reached at: 4 Irish Lane, Barrie, Ontario L4M 6H8; email: vdn@sympatico.ca.

TOOLS
Tools are simple technical systems like knives, hammers, pens, and clocks. Knives, for example, have few mechanisms, if any, and are designed for ease of individual use in solving practical problems. There are different knives for different uses: from paper, butter, and meat knives to shaving razors, axes, and swords. One can usually assess the appropriateness of a knife to a specific use [5]: one does not, for instance, use a butter knife to chop wood. Also, related tools are often linked together to make up useful sets.

Knives, forks, and dishes, for instance, are used together for eating, just as pen, ink, and paper are used together for writing. As tools, knives are familiar and easy to use, usually without formal training. Since users can usually predict the effects of normal use, outcomes tend to match user intent. Thus the ethical control of a tool presents no great difficulties. But skilled use, such as slicing meat finely and evenly or competent swordplay, has traditionally involved an apprenticeship, e.g., to a butcher or fencing master. Skillful performance also evokes the idea of virtue as excellence in a practice, for excellence, Aristotle argued, extends from technical practice to ethical practice (see [6]). Traditionally knives were highly crafted, unique tools, but today most are mass produced in standard designs, facilitating their general availability and ease of use. Tool culture connotes traditional values. The designs of knives and swords, for example, are slow to change. Older knives can even be antiques or works of art, viz., a kitchen knife found in ancient Pompeii or a Samurai sword. In addition, many tools are organically connected with their users' habitat. Hunting knives are used to kill an animal and slice the meat; axes to cut local wood for burning or building. If tools like knives represent tradition, technologies like the automobile signify change.

TECHNOLOGIES
Over the last few centuries the old tool-based farm economy has been replaced by newer, more efficient industrial technologies: trains, factories, large buildings, elevators, and automobiles. While new technologies may compete with old ones for users, they do not always replace older systems. Automobiles, for instance, have not eliminated bicycles, and we still use knives to cut food and paper.

But technologies are more complex than tools: numerically, systemically, socially, and morally. Numerically, technologies typically contain several component tools; systemically, most technologies operationally interconnect several subsystems (ultimately composed of tools) into a single integrated operating system. In operation, automobile technology, for instance, dynamically connects several subsystems: accelerating, gearing, steering, engine cooling, cabin heating, instrumentation, etc. All work together as one transportation system. Watches link springs, gears, hands, face, and case together into one mechanical system designed to tell the time. Each subsystem is itself composed of tools: hitting the brake pedal sends brake fluid through the lines to make the pads slow the wheels. Abstractly, a technology might be reduced to its separate components, but not as a dynamic system. To drive a car you do not manipulate its components separately; you do not stop it by picking up a brake pad and holding it to the wheel. On the contrary, the normal operation of a technology like an automobile involves numerous high-speed interactions among its subsystems and component tools. But those subsystems are loosely coupled: brake failure affects, but does not destroy, the steering, so a driver can often avoid a serious accident. Technologies whose subsystems are tightly coupled, however, are prone to "normal accidents" [7]. In such systems, problems in one subsystem can affect others and quickly threaten total system breakdown, with attendant risks to human health and the environment. Thus problems in the cooling system of a nuclear reactor can rapidly lead to a meltdown, and a short circuit in a plane's wiring can cause it to crash. One way of reducing such risks is to favor resilience in design, through loosely coupled subsystems and redundancy.
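The coupling distinction lends itself to a short illustration. The following is a minimal sketch in Python, with hypothetical class and subsystem names of my own invention rather than anything from the article, of loose coupling plus redundancy: a fault in one subsystem is contained and service degrades gracefully instead of cascading into total failure.

```python
# A sketch of loose coupling with redundancy (hypothetical design):
# subsystems hold no references to one another, so a fault in one
# cannot propagate into the others, and a backup provides redundancy.

class Subsystem:
    def __init__(self, name, backup=None):
        self.name = name
        self.failed = False
        self.backup = backup  # redundancy: an independent fallback

    def operate(self):
        if not self.failed:
            return f"{self.name}: ok"
        if self.backup and not self.backup.failed:
            return f"{self.name}: degraded, using {self.backup.name}"
        return f"{self.name}: FAILED"

class LooselyCoupledCar:
    def __init__(self):
        self.brakes = Subsystem("hydraulic brakes",
                                backup=Subsystem("parking brake"))
        self.steering = Subsystem("steering")

    def status(self):
        return [s.operate() for s in (self.brakes, self.steering)]

car = LooselyCoupledCar()
car.brakes.failed = True
print(car.status())
# ['hydraulic brakes: degraded, using parking brake', 'steering: ok']
```

In a tightly coupled design, by contrast, the brake and steering objects would share state, so a single fault could spread into both: the "normal accident" pattern.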


Despite its systemic complexity, most people today can operate a car. Drivers must of course comply with the automobile's prescribed control techniques, viz., manipulating the gear shift, turning the steering wheel, and staying on the right or left side of the road. The need for compliance typifies what Ursula Franklin terms "prescriptive" technologies, in contrast to the greater freedom users had with older "holistic" tools, like knives [8]. But prescriptive compliance and standardization are not necessarily unethical: standardizing operational controls has facilitated safer driving. Indeed, one of the most important achievements of industrial society was the development of mass production manufacturing technology, as with Ford's Model T early in the 20th century. While many technological innovations solved user problems and enhanced performance, they also increased system complexity. The simple, user-friendly controls of the automobile mask significant system complexity under the hood. Today many drivers do not understand how their automobile's mechanisms function or know how to repair their car. This often means that specialized technical expertise and training are needed for its operation and maintenance [9]. The same can be said for VCRs, computers, and other common technologies. The complexity of such systems reflects the growth of the modern sciences and technical professions, or what John Kenneth Galbraith termed the "technostructure" of modern societies [10]. It has in consequence evoked social complexity. Contemporary automobile manufacturing demands far more knowledge and resources than making a two-wheel buggy or a knife. The design and development of an automobile now involves numerous scientists, engineers, and technicians, usually working together in the R&D divisions of large, international corporations [11], [12].

No wonder the growth of the automobile industry reinforced that of the technically complex petrochemical industry and of firms like Siemens and Daimler Benz. The technology production phase, from innovation and design to development and manufacture, is usually beyond the capacity and resources of most individuals; the era of the lone inventor creating a new tool is long gone. And old automobiles usually end up in junk yards. The growing volume and toxicity of the wastes automobiles produce constitute a major environmental problem. In response, the German government has required auto parts to be recyclable, which in turn has led to simpler designs and fewer, more standardized parts. The control of automobile technology throughout its life cycle, one can see, is also socially complex. Different groups compete to influence each phase of the cycle, from automobile production through traffic and use to disposal. Manufacturers, technicians, workers, distributors, owner/drivers, mechanics, and communities, including non-drivers whose health is at risk from traffic smog, are all stakeholders in automobile technology. In his classic study of Nuclear Regulatory Commission hearings in California, Richard Meehan showed how competing stakeholder interests clashed sharply in seeking to influence public decisions about reactor sites [13]. The ethical control of complex modern technologies is becoming a socially complex and even political process, one that should be open and democratic [14], [15]. Social complexity in turn evokes moral complexity, for diverse values (technical, economic, social, and ethical) compete to guide the control of technologies throughout their life cycle.

In the industrial era, technological innovations were increasingly developed in business contexts. But market values often clash with other values, such as technical quality and efficiency, social and economic welfare, and environmental protection [16]. Some have argued that the very possibility of ethical control is blocked by a "technological imperative" [11]: anything that is technologically possible, it dictates, must be done, usually by experts.

But one has doubts about the imperative's force. Automobile firms, for instance, have been able to resist technological innovations in fuel efficiency, pollution reduction, and safety, alleging financial or competitive concerns; and other firms have suppressed new technologies, notably for competitive reasons [17]. So ethical values, too, can override any so-called imperative and influence the control of a technology. Ethical control is itself morally complex [18]. The problems that arise in the development of automobiles typically involve what Caroline Whitbeck terms "multiple constraints" (cost, comfort, fuel efficiency, safety, and environmental protection), and these may not be simultaneously satisfiable [19].


Accordingly, one should not expect a uniquely correct solution, but a range of acceptable solutions. Balancing these diverse social, economic, technical, and environmental values presents a significant challenge to many industry stakeholders: owners, management, technical professionals, mechanics, car owners, junk yard operators, and public authorities. Notably, scientific professions like engineering have all developed multi-valued ethical codes. Those codes typically stipulate that professionals

should use their technical knowledge for the benefit of clients, the profession, the community, and the natural environment [20], [21]. In this framework, professionalism itself involves a morally complex mandate: to balance public welfare, economic growth, scientific knowledge, and technical efficiency. Living up to these competing obligations may be difficult, but they do imply that professional expertise, and the technostructure, is not morally neutral. Engineers, indeed, contributed to the progressive social movements of the 1920s.

It is risky to downplay expert knowledge in favor of narrow business interests, ideologies, or social factions, for technical knowledge is not merely a social construct. A respect for professional knowledge has guided several notable scientists and engineers, from Rachel Carson to Jeff Wigand, Roger Boisjoly, and Nancy Olivieri. Each took a leading role in publicly advocating the ethical control of a technology: pesticides, cigarettes, manned rockets, and pharmaceuticals, respectively. The ethical control of a technology, professionalism suggests, requires one to minimize a technology's socio-economic and environmental risks to acceptable levels while realizing its technical and socio-economic benefits. Technological innovation, however, is frequently unpredictable. The newer a technology, and the faster the pace of innovation, the less anyone can foresee or control its development and impacts as it moves through its life cycle [22]. In 1900, for example, few if any foresaw the contraceptive pill, penicillin, TV, nuclear energy, genetics, or the computer [23]. A century ago one might have predicted that society would need to build more roads as the number of automobiles increased, but few foresaw the automobile's contribution to urban sprawl, adolescent sex, and global warming. Indeed, the more radical an innovation, the less predictable its life cycle or its socio-economic and environmental impacts: what was originally a fertility pill, for example, became a contraceptive and advanced the emancipation of women. Technology life cycles, furthermore, are becoming shorter as the pace of technological change speeds up.

As the complexity, unpredictability, and pace of events, and the severity of global environmental stress, all soar, Thomas Homer-Dixon claims, modern societies face an "ingenuity gap": a shortfall between the rapidly rising need for ingenuity and its inadequate supply [24]. And that ingenuity gap grows still wider as technologies themselves interact in ever larger technology networks.

TECHNOLOGY NETWORKS
Technology networks link different technologies serving a common function, such as transportation, communication, or energy. Networks are usually more complex and larger in scale than their component technologies. Specific tools and technologies may come and go, but the underlying networks persist: networks have longer life cycles than their component technologies and tools. This is partly for an ethical reason. Network functions satisfy fundamental human needs, e.g., for communication, energy, and transportation. Large-scale communication, transportation, and energy networks emerged in modern societies over the last century to serve the needs of their growing populations. In a technology network one finds diverse technologies at different stages in their life cycles, all competing for users with different needs and preferences. In communications networks, users can choose among phones, regular mail, email, radio, TV, and computers to fulfill their communication needs. In energy networks, user options include electrical, natural gas, oil, coal, nuclear, wind, solar, and passive energy technologies. The competition among old and new technologies to serve a function has ethical implications. Intermediate technologies and simple, user-friendly tools such as a village phone or a small diesel engine, E. F. Schumacher claimed, may be more appropriate to the needs of communities in less developed societies than newer, more complex high-tech systems such as the latest Internet-based email system or a nuclear reactor [5].


Similarly, public energy policy in modern societies should explore all energy technology options, including conservation, rather than restrict itself to fossil fuels and nuclear power. Given their technical complexity, network operators need to be well educated and technically trained. The need for a technically informed moral intelligence was learned early in the history of engineering [21], [25]. That lesson was tragically reinforced in spring 2000, when the technically untrained manager of the Walkerton, Ontario, public utility ignored test lab data showing fatal E. coli bacteria in the town's water supply and did not respond to related inquiries from the area Medical Officer of Health. As a result, seven townspeople died and hundreds more became ill, many seriously [26]. Networks are socially complex in their organization. Only large public and private organizations like states and corporations tend to have the social resources and technical expertise to operate large communication and energy networks [27]. Thus the BBC and CBC are government-owned broadcasting media, while ABC, Fox, CBS, NBC, and cable TV are private businesses. Bell Canada runs the central Canadian phone system, and private firms generate electricity, while governments own the distribution grids. Radio, TV, and energy networks are all regulated by government agencies, such as the FCC in the U.S. and the CRTC in Canada, and by numerous state and provincial energy commissions. Network boundaries, moreover, are often vague and ill-defined, extending beyond buildings and towns to whole regions and continents. Both international communications and financial markets flow across organizational and political borders. Such networks call for decentralized organization and widely distributed control systems.

Centralized control seems inappropriate for very high-speed, high-volume electronic communication networks. Efficient functioning also depends on how networks integrate channel bandwidth, routing, and nodes, and on how well they filter signals from noise. To ensure network resilience, components should be loosely coupled and pathways designed to let messages flow around bottlenecks, as in the Internet and transportation networks. Network administration and transaction costs should be minimal, so that message transmission, energy, and traffic flows are close to frictionless. Easy access, low transaction costs, and minimal administrative oversight are required to enable these dynamic networks to continue operating efficiently. Nearly free access and low-fee, subscription- or rental-based user rights seem to fit electronic communications networks such as the phone, email, and the Internet [28], [29].
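The point about pathways that flow around bottlenecks can be illustrated with a toy sketch. The topology below is a hypothetical example of my own, not any real routing protocol: with redundant, loosely meshed pathways, traffic simply reroutes when a link fails.

```python
# A toy illustration of routing around a bottleneck (hypothetical
# topology): redundant pathways keep traffic flowing after a link fails.

from collections import deque

def find_route(links, src, dst):
    """Breadth-first search for any open path from src to dst."""
    frontier, seen = deque([[src]]), {src}
    while frontier:
        path = frontier.popleft()
        if path[-1] == dst:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None  # no surviving route

# A loosely meshed network: two independent routes from A to D.
links = {"A": ["B", "C"], "B": ["D"], "C": ["D"]}
print(find_route(links, "A", "D"))   # ['A', 'B', 'D']

links["B"].remove("D")               # the B-D link fails...
print(find_route(links, "A", "D"))   # ['A', 'C', 'D']: traffic reroutes
```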

Access arrangements also affect network ownership. Forms of shared network ownership and joint control are accepted even in private broadcasting media. Where resources are mobile and flow through networks, as with information, electricity, oil, gas, birds, and fish, ownership should be common and public rather than private [30]. The bundle of ownership rights, that is, should be distributed among network producers, distributors (or communicators), system operators, and service providers, both public and private. Contemporary electronic media process information in nanosecond time slices. If we think of our high-speed communication, energy, and transportation networks as rivers flowing at high speed and volume, then, as Heraclitus observed 2500 years ago, you may not be able to step into the same river twice. And he might have added that the high speeds and volumes have affected our values [31]. Instant service and easy access are now expected in our social life. Many wristwatches have split-second stopwatch functions. Millions of people daily drive autos and fly in planes, where they are increasingly treated in high-volume terms, not unlike human freight. And ease of access and openness in communication networks carry risks to privacy and data security. So our fast-flowing, open-access information networks should be guided by a communication ethic of respect for the integrity of information flows, and for data integrity, privacy, and copyright. The great rivers of information and traffic flowing through today's communication, energy, and road traffic networks are, in addition, prone to sudden flooding. The high-speed flows in communications networks and electricity grids leave little time to respond to emergencies. Small problems in one part of a network can rapidly multiply and spread, especially if its subsystems are tightly interconnected. Gridlock can develop very quickly and threaten to shut down whole regions. The failure of electricity, communications, or air traffic networks can, moreover, pose serious risks to welfare and public safety. Networks whose flows are reinforced by positive feedback loops are especially crisis-prone, as the 1995, 1997, and 2000 crises in electronic financial markets in Mexico, Asia, and Silicon Valley showed. One way to reduce the risk of breakdown may be to become more responsive to negative feedback, so that early warnings of emerging problems receive a timely response. Keeping such networks distributed, loosely coupled, and resilient can reduce the ingenuity gap between the need to minimize risks of breakdown and the available supply of relevant technical and moral intelligence [24]. Computerized high-speed sensors and flexible response technologies might be designed to facilitate rapid response to early warnings of problems, for flows in some networks move near the speed of light, e.g., in financial trading.
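As a rough sketch of acting on negative feedback, the following Python fragment raises an early warning while a flow is merely congested, well before it crosses a critical failure level. The thresholds and names are illustrative assumptions of mine, not anything from the article.

```python
# A sketch of early-warning monitoring on a high-volume flow
# (hypothetical thresholds): respond at the warning level, not only
# at the failure level.

def check_flow(load, capacity, warn_frac=0.7, fail_frac=0.95):
    """Classify a network flow by how close it is to capacity."""
    ratio = load / capacity
    if ratio >= fail_frac:
        return "CRITICAL: shed load or reroute immediately"
    if ratio >= warn_frac:
        return "WARNING: early congestion, respond now"
    return "normal"

# Rising load on a link with capacity 100: the warning fires at 72,
# leaving time to act before the critical point at 96.
for load in (50, 72, 96):
    print(load, "->", check_flow(load, 100))
```

The design choice mirrors the argument in the text: a system tuned only to hard failures has no time to respond, whereas one that listens to negative feedback buys itself time.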


Communication and collaboration among network users, operators, experts, and other stakeholders should also be enhanced, and supported by democratic, participatory approaches to control [8], [13], [14]. Technology fields, however, offer an even greater ethical challenge.

TECHNOLOGY FIELDS
A technology field is all the tools, technologies, and networks present in a defined space and time, e.g., in a room over a day. Technology networks commonly cut across field boundaries. While a technology field's boundaries are arbitrary, they are well defined, and distinct borders constitute the infrastructure of the sovereign state. They also open technology fields to regulation by that area's political authorities. Technology fields range widely in size, from mini-fields like rooms to regional and planetary mega-fields; they are even found in space, as the growing pile of space debris indicates. But few governments, corporations, or international organizations have adequate resources even to inventory a regional technology field, much less control it. Within a technology field, different technology networks compete for dominance. Success in that competition defined the Stone, Bronze, and Iron Ages, and more recently the Industrial Age, in which mechanical and chemical energy, production, and transportation networks prevailed over the organic networks based on agriculture and village life; but the older technologies, e.g., in agriculture, did not completely disappear. The pace of technological change has also sped up, as technology field life cycles have shortened. The Stone Age endured for millennia, but the industrial era lasted only about 250 years, and is now giving way to a new communications age.

The emergence of regional and global technology mega-fields over the last 50 years reflects an evolutionarily unprecedented level of competition between human populations and other species for ecosystem resources. The pace of species extinctions has ratcheted up across the globe. Unlike previous eras, Richard Rhodes comments, today "we swim in technology as fish swim in the sea" [1]. Technology fields shift the focus of ethical control to the sea more than the fish, that is, to our technological environments and their impacts on natural ecosystems [32]. The ambivalence of the international political community's response to the environmental crisis, like the timidity of the Kyoto treaty's requirements, suggests that few political leaders grasp the epochal significance of technology mega-fields. This reflects the little-noted fact that the density of technology fields, or the number of technologies in relation to population, has increased significantly over the last two centuries: the greater the technology-to-population ratio, the denser the technology field. Contrast, for example, the U.S., France, or Japan with Kazakhstan, Romania, or an Aboriginal community. In these societies both economic development and environmental impact levels correlate with the level of technology density. The rise in species extinctions, global warming, acid rain, and ozone layer depletion all reflect the emergence of highly dense technology mega-fields. The environmental crisis may represent ecological "revenge effects," as nature responds to technology field size and density [33]. The radical mitigation of these environmental impacts may in fact represent the major ethical challenge facing civilization in the coming century.
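The density notion can be stated as a simple ratio. The notation below is my own gloss, not the author's:

$$ D = \frac{N_{\mathrm{tech}}}{P} $$

where $D$ is the density of a technology field, $N_{\mathrm{tech}}$ the number of technologies deployed in it, and $P$ the human population sharing the field. On this reading, density rises as technologies multiply relative to the field's population.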

Density, furthermore, affects the ethical control of a technology field. The thinner a field, the more easily one can identify its technologies; but the impacts of new technologies on a thin field are at times as extensive as they are unpredictable: the invention of the spur, for example, transformed medieval society. The denser a field, the less observable and predictable it may be. At the field level of analysis, then, one discerns a kernel of truth in talk of technology "out of control" [34]. On the other hand, fields are mere aggregates of disparate technologies, not autonomous technical systems. Which technologies in a field interact and which don't, which are loosely coupled and which tightly coupled, are often unclear. Which interactions in a technology field produce which social and environmental impacts is largely unknown. Technology fields thus contain what Homer-Dixon calls "unknown unknowns," situations where we don't know what we don't know [24]. In them, 21st-century civilization faces a critical ingenuity gap: while the supply of knowledge increases, so do the environmental and social problems created by the interactions between technology fields and human populations, locally and globally. The ratio of knowns to unknowns may not be changing in our favor. Since humans have never been able to completely control their society or their future, attempts to control technology mega-fields represent daunting regulatory challenges for political and international authorities. The need for an informed and effective moral intelligence to control technical systems, it seems, is constantly growing.

COMPLEX CONCLUSIONS
The ethical control of technologies, we have seen, varies with the scale and complexity of the technical system; and those systems in turn vary in their systemic, social, and moral complexity.


Individuals may control tools like knives and technologies like automobiles, and even take a tentative inventory of the technology mini-field in their residences; but the control of communication and energy networks requires large organizations. As technologies themselves become more complex, control requires additional knowledge, know-how, and, usually, formal education and training. And as the pace of technological change increases, old knowledge and norms become outdated and ineffective. Ethical control starts with small steps, e.g., in handling tools and technologies, and small-scale experiments can help us climb the relevant learning curve. In her critique of the unthinking overuse of highly toxic pesticides, for example, Rachel Carson argued for an ethical control strategy: pretesting pesticides, running small pilot projects, and favoring biological over technological controls [35]. The project of ethical control, especially of technology networks and fields, is complicated by the interplay of diverse norms: technical, economic, social, and environmental. And as we move from tools to technologies, networks, and fields, diverse stakeholders with often conflicting interests compete to affect their control. The result is calls for open but slow democratic control processes, and for regional and global collaboration. But not all claims are equally credible, nor are all interests equally at risk. So we always need to exercise our moral intelligence: to solve the ethical problems at issue, to learn from our failures as well as build on our successes, and, in many cases, to minimize the risk of disaster [6]. To that end we need to become more responsive to early warnings of problems, while they are still solvable, long before we face a full-blown crisis.

Technology can help here: electronic communications networks can facilitate better detection and more rapid emergency response. The ethical control of increasingly complex technologies, networks, and fields may affect the future path of civilization and, as the environmental crisis suggests, the planetary ecosystem itself. One may hope that the challenges it involves do not transcend the limits of our collective moral intelligence.

REFERENCES
[1] R. Rhodes, Ed., Visions of Technology. New York, NY: Simon and Schuster, 1998, pp. 21, 22, 329.
[2] V. di Norcia, Hard Like Water: Ethics in Business. Toronto, Ont.: Oxford Univ. Press, 1998.
[3] C. Whitbeck, Ethics in Engineering Practice and Research. Cambridge, U.K.: Cambridge Univ. Press, 1998, ch. 1 and 2.
[4] M. Martin and R. Schinzinger, Introduction to Engineering Ethics. New York, NY: McGraw-Hill, 1999.
[5] E. F. Schumacher, Small is Beautiful, 2nd ed. Vancouver, B.C.: Hartley and Marks, 1999, pp. 146f.
[6] Aristotle, Nicomachean Ethics, bk. I, sect. 7.
[7] C. Perrow, Normal Accidents: Living with High Risk Technologies. New York, NY: Harper, 1984.
[8] U. Franklin, The Real World of Technology, 2nd ed. Toronto, Ont.: Anansi, 1999, ch. 4, pp. 10f.
[9] E. J. Woodhouse and D. Nieusma, "When expert advice works, and when it does not," Technology and Society Mag., pp. 23-29, Spring 1997.
[10] J. K. Galbraith, The New Industrial State. New York, NY: New American, 1967, ch. VI.
[11] C. Freeman and L. Soete, The Economics of Industrial Innovation, 3rd ed. London, U.K.: Pinter, 1997.
[12] R. Buderi, Engines of Tomorrow. New York, NY: Simon and Schuster, 2000.
[13] R. Meehan, The Atom and the Fault. Cambridge, MA: M.I.T. Press, 1984, ch. 8.
[14] C. Mitcham, "Justifying public participation in technical decision making," Technology and Society Mag., pp. 40-46, Spring 1997.
[15] J. Herkert, "Ethical risk assessment: valuing public perceptions," Technology and Society Mag., pp. 4-10, Spring 1994.
[16] N. Balabanian, "Controlling technology: Should we rely on the marketplace?," Technology and Society Mag., pp. 23-30, Summer 2000.
[17] R. Dunford, "Suppressing technology," Administrative Science Quart., pp. 512-525, 1987.
[18] G. F. McLean, "Integrating ethics and design," Technology and Society Mag., pp. 19-30, Fall 1993.
[19] C. Whitbeck, "The trouble with dilemmas," Business and Professional Ethics, vol. 1, nos. 1 and 2, pp. 119-141, 1992.
[20] C. Morison and P. Hughes, Professional Engineering Practice: Ethical Aspects, 2nd ed. Toronto, Ont.: McGraw-Hill Ryerson, 1988.
[21] E. Layton, The Revolt of the Engineers. Baltimore, MD: Johns Hopkins Univ. Press, 1986, chs. 2 and 3.
[22] H. A. Linstone, "Technological slowdown or societal speedup: the price of system complexity?," Technological Forecasting and Social Change, vol. 51, pp. 195-205, 1996.
[23] M. Sullivan, "America in 1900," in Visions of Technology, R. Rhodes, Ed. New York, NY: Simon and Schuster, 1998, pp. 29f.
[24] T. Homer-Dixon, The Ingenuity Gap. New York, NY: Knopf, 2000, pp. 1f, 26f, ch. 1 and 7.
[25] W. D. Rifkin and B. Martin, "Negotiating expert status," Technology and Society Mag., pp. 30-39, Spring 1997.
[26] S. Oziewicz and P. Cheney, "This could have been prevented," The Globe and Mail, May 26, 2000.
[27] W. Rowland, The Spirit of the Web: The Age of Information from Telegraph to Internet. Toronto, Ont.: Key Porter, 1999.
[28] T. Berners-Lee, Weaving the Web. New York, NY: Harper, 1999.
[29] M. Stefik, Ed., The Internet Edge. Cambridge, MA: M.I.T. Press, 1999.
[30] E. Ostrom, Governing the Commons. Cambridge, U.K.: Cambridge Univ. Press, 1990.
[31] J. Gleick, Faster. New York, NY: Vintage, 2000.
[32] P. A. Vesilind and A. S. Gunn, Engineering Ethics and the Environment. Cambridge, U.K.: Cambridge Univ. Press, 1998.
[33] E. Tenner, Why Things Bite Back. New York, NY: Vintage, 1996, pp. 5f.
[34] L. Winner, Autonomous Technology. Cambridge, MA: M.I.T. Press, 1977, ch. 5, pp. 197f.
[35] R. Carson, Silent Spring. New York, NY: Houghton Mifflin, 1994.

