
Ethics and Social Responsibility for Scientists and Engineers

Michael C. Loui
Professor of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign
Friday Forum, University YMCA, Champaign, Illinois, February 6, 2009
Abstract. Too often instruction in professional responsibility emphasizes only
compliance with regulations, standards of conduct, and codes of ethics. As their primary
professional responsibilities, scientists strive to ensure the integrity of the research
record, and engineers strive to ensure the safety of the public. After analyzing different
kinds of responsibilities, I will advance a broader conception of professional
responsibility for scientists and engineers, a conception that encompasses stewardship
for society and the environment. Finally, I will discuss whether ethics instruction can or
should cultivate a commitment to this broader conception.
Introduction
Wernher von Braun was a German physicist who led the development of rocket
technology. During the Second World War, he helped develop the V-2 rocket that devastated London. Toward the end of the war, he was captured by the
United States Army, and he later worked on the rockets used for the American manned
spaceflight program. In 1965, von Braun was the subject of a song written by satirist
Tom Lehrer:
Gather round while I sing you of Wernher von Braun,
A man whose allegiance is ruled by expedience
Once the rockets are up, who cares where they come down?
That's not my department, says Wernher von Braun.
Despite the light-hearted tone, this song raises two profound questions. First, are
scientists and engineers merely hired guns, who provide technical services to whoever
employs them? In von Braun's case, the contrast is stark, as he worked for both Nazi
Germany and the United States. Second, are scientists and engineers responsible for the
uses of their creations? Or, as the song goes, is that "not my department"?
In this talk, I will first review our current understandings of professional and ethical
responsibility for scientists and engineers. Second, I will analyze and distinguish
different kinds of responsibility. I will describe a broader conception of professional
responsibility for scientists and engineers, a conception that encompasses stewardship
for society and the environment. Third and finally, I will discuss whether ethics instruction
can or should cultivate a commitment to this broader conception.
First, the Current State of Professional Ethics for Scientists and Engineers
Within the discipline of philosophy, ethics encompasses the study of the actions that a
responsible individual ought to choose, the values that an honorable individual ought to
espouse, and the character that a virtuous individual ought to have. For example,
everyone ought to be honest, fair, kind, civil, respectful, and trustworthy. Besides these
general obligations that everyone shares, professionals have additional obligations that
arise from the responsibilities of their professional work and their relationships with
clients, employers, other professionals, and the public. For example, doctors and nurses
must keep information about patients confidential. Lawyers must avoid conflicts of
interest: a lawyer must not simultaneously represent both husband and wife in a divorce
case.
Professionals have special obligations because they have specialized knowledge and
skills, and they are granted licenses by the public to use their knowledge and skills to
significantly affect the lives of their clients. For example, physicians can legitimately
write prescriptions and perform surgery: privileges that most of us do not have. A
dentist can legitimately apply a drill to your teeth, but you would not let me do the
same. I like to say that professionalism makes all the difference between oral surgery
and assault. Because professionals have special powers, they have special
responsibilities to use those powers wisely for the benefit of their clients.
Many professional obligations are not required by laws or regulations. In this sense,
professional ethics goes beyond laws. That is, professional standards may require or
forbid certain actions even when no laws apply. For example, doctors and nurses should
render assistance in cases of medical emergency, for instance, when a train passenger
suffers a heart attack. Conversely, an action can be wrong even if it is allowed by law.
So when an Enron executive (or in a hypothetical case, an Illinois governor) claims he
has done nothing wrong, he might mean that he has done nothing illegal. But what may
be legal, or allowed by law, may still be wrong.
The primary professional responsibility of the scientist is to ensure the integrity of the
research record. Together, the responsibilities of scientists are called the Responsible
Conduct of Research. Scientists must not fabricate experimental data. They must not falsify data, that is, alter the data in unacceptable ways. They must not plagiarize, that is, represent the ideas of other scientists as their own without proper attribution. When
scientists collaborate to write a manuscript, the authors should include only those who
made substantial intellectual contributions to the reported research. When scientists
review a manuscript submitted to a journal, they should evaluate the strengths and
weaknesses of the manuscript fairly, they should keep the ideas confidential, and they
should return a report promptly. Scientists should comply with regulations regarding the
treatment of human and animal subjects, and the handling of hazardous substances.
All of these practices in the responsible conduct of research aim to increase the
trustworthiness of research results. While few of these practices are expressed in laws or
regulations, most are stated in standards of professional practice and codes of ethics.
None of these professional ethical concerns of scientists, however, has to do with the social consequences of scientific research. As we know, the research of some
scientists migrates quickly from findings in the laboratory to commercialization in the
marketplace. In a sense, these scientists are becoming engineers, with the attendant
professional obligations of engineers.
Engineers understand professional responsibilities primarily at the level of individuals
and organizations, as specified in the codes of ethics of the engineering societies.
Though the codes differ in emphases and details, all engineering codes say that
engineers should be technically competent and should continue to improve their
competence. In their technical work, engineers should report laboratory data honestly,
and they should affix their signatures only to plans that they have prepared or
supervised. In their relationships with other professionals, engineers should subject their
work to peer review, and experienced engineers should mentor junior engineers. In their
relationships with employers, engineers should honor the confidentiality of trade secrets and other proprietary information, and engineers should avoid conflicts of interest. For example, an engineer should not specify the use of parts manufactured by a firm owned by the engineer's spouse. Avoiding conflicts of interest contrasts with contracting habits in the City of Chicago, where a former mayor said, "There's nothing wrong with nepotism, as long as you keep it within the family."
Engineering codes of ethics state that engineers should hold paramount the safety,
health, and welfare of the public, above other obligations to clients and employers. This
commitment to the public good echoes the public commitments of other professions:
medicine is committed to human health, law to justice. But engineering codes and engineering
societies have provided little guidance on how to prioritize the public good, except in
extreme cases such as whistle-blowing. When an engineer finds an immediate threat to
the safety of the public, the engineer must notify appropriate authorities. If the threat is
not handled within the organization, the engineer is expected to blow the whistle. For
example, Michael DeKort was employed by Lockheed Martin as the lead engineer on
the Deepwater project to modernize the Coast Guard's patrol boats. In 2005, DeKort
found that the new camera surveillance systems had large blind spots, much equipment
would not withstand bad weather, and the communication system used unshielded cable
instead of the specified, more secure shielded cable. He alerted officials at Lockheed
Martin about these problems, without success. In fact, he was dismissed from the
project. Finally, in August 2006, in desperation, DeKort blew the whistle by posting a
YouTube video, which attracted attention from the news media and led to investigations
by the U.S. Department of Homeland Security and by the Transportation and Infrastructure
Committee of the U.S. Congress. These investigations confirmed his allegations of
wrong-doing.
While stories of whistle-blowing are dramatic, how can or should an engineer serve the
public good in everyday cases, which do not require whistle-blowing? Engineering
codes of ethics are largely silent. As a consequence, instruction in everyday engineering
ethics, like instruction in scientific research ethics, emphasizes professional
responsibilities to individuals and organizations, instead of professional responsibilities
to the public.
Second, Different Kinds of Professional Responsibility
The culture of science and engineering also encourages a conception of professional
responsibility limited to fulfilling assignments competently and conscientiously. As the
noted civil engineer Samuel Florman wrote in his book The Civilized Engineer:
Engineers do not have the responsibility, much less the right, to establish goals
for society. Although they have an obligation to lead, like most professionals,
they have an even greater obligation to serve. [p. 95] ... I propose that the essence of engineering ethics be recognized as something different and apart from white-hat heroics. Conscientiousness is fairly close to what I have in
mind. Engineers are accustomed to having their fellow citizens rely on them.
As a consequence, if engineers do their jobs well they are being more than
competent; in serving fellow humans, and serving them well, they are being
good. Reliability is a virtue. Therefore, the conscientious, effective engineer is a
virtuous engineer. Practically all disasters (think of Bhopal and Chernobyl) are attributable to one or another type of ineptitude. One can only conclude that the great need in engineering ethics is an increased stress on competence. [pp. 101–103]
Of course, like Florman, I am all in favor of competence and conscientiousness. But I
think we can expect more.
In October 2008, I attended a conference on engineering, social justice, and sustainable
development at the National Academy of Engineering. At this conference, I heard an
distinguished elderly engineer state categorically that engineers are not asked to consider the wider social implications of their work, because those implications are not their responsibility. In essence, he thought of an engineer as merely a
hired gun, someone who serves a client by solving technical problems, without
considering the social consequences.
It seems to me that this conception of professional responsibility is quite limited. To
explore the concept of responsibility further, let me tell you a tragic story, one of the
many tragedies in the history of technology.
In the early 1980s, Atomic Energy of Canada Limited manufactured and sold a cancer
radiation treatment machine called the Therac-25, which relied on computer software to
control its operation. Between 1985 and 1987, the Therac-25 caused the deaths of three
patients and serious injuries to three others. Who was responsible for the accidents? The
operator who administered the massive radiation overdoses, which produced severe
burns? The software developers who wrote and tested the control software, which
contained several serious errors? The system engineers who neglected to install the
backup hardware safety mechanisms that had been used in previous versions of the
machine? The manufacturer? Government agencies? We can use the Therac-25 case to
distinguish among four different kinds of responsibility.
First, causal responsibility. Responsibility can be attributed to causes: for example,
the tornado was responsible for damaging the house. In the Therac-25 case, the
proximate cause of each accident was the operator, who started the radiation treatment.
But just as the weather cannot be blamed for a moral failing, the Therac-25 operators
cannot be blamed because they followed standard procedures, and the information
displayed on the computer monitors was cryptic and misleading.
Second, role responsibility. An individual who is assigned a task or function is
considered the responsible person for that role. In the Therac-25 case, the software
developers and system engineers were assigned the responsibility of designing the
software and hardware of the machine. Insofar as their designs were deficient, they were
responsible for those deficiencies because of their roles. Even if they had completed
their assigned tasks as specified, however, their role responsibility may not encompass
the full extent of their professional responsibilities.
Third, legal responsibility. An individual or an organization can be legally
responsible, or liable, for a problem. That is, the individual could be charged with a
crime, or the organization could be liable for damages in a civil lawsuit. Similarly, a
physician can be sued for malpractice. In the Therac-25 case, the manufacturer could
have been sued.
Fourth and finally, moral responsibility. Causal, role, and legal responsibilities are
exclusive: if one individual is responsible, then another is not. In contrast, moral
responsibility is shared: many engineers are responsible for the safety of the products
that they design, not just a designated safety engineer. Furthermore, rather than assign
blame for a past event, moral responsibility focuses on what individuals should do in the
future. In the moral sense, responsibility is a virtue: a responsible person is careful,
considerate, and trustworthy; an irresponsible person is reckless, inconsiderate, and
untrustworthy.
Responsibility is shared whenever multiple individuals collaborate, as in most scientific
and engineering teams. Teamwork is not a way to evade responsibility, although I once read in the preface of a co-authored book that "each of the four authors would like to say that any mistakes are the fault of the other three." Or, as an alumnus once told me, "Always work in teams: if something goes wrong, you can blame someone else." On the
contrary, teamwork should not be used to shift blame or responsibility to someone else.
When moral responsibility is shared, responsibility is not atomized to the point at which
no one in the group is responsible. Rather, each member of the group is accountable to
the other members of the group and to those whom the group's work might affect, both for the individual's own actions and for the effects of their collective effort. For
example, suppose a computer network monitoring team has made mistakes in a
complicated statistical analysis of network traffic data, and these mistakes have changed
the interpretation of the reported results. If the team members do not reanalyze the data
themselves, they have an obligation to seek the assistance of a statistician who can
analyze the data correctly. Different team members might work with the statistician in
different ways, but they should hold each other accountable for their individual roles in
correcting the mistakes. Finally, the team has a collective moral responsibility to inform
readers of the team's initial report about the mistakes and the correction.
I wish to argue for a broad notion of professional moral responsibility that obligates
scientists and engineers to act even when they do not have an assigned role
responsibility for a task. For example, even when the specifications for a new product
ignore issues of economic sustainability and environmental impact, the product engineer
is professionally responsible for considering those issues. In another case, suppose an
electrical engineer notices a design flaw that could result in a severe electrical shock to
someone who opens a personal computer system unit to replace a memory chip. Even if
the engineer is not specifically assigned to check the electrical safety of the system unit,
the engineer is professionally responsible for calling attention to the design flaw.
This broad notion of professional responsibility comports with our understanding that
professional responsibilities go beyond complying with laws or regulations, because
laws and regulations often lag behind advances in technology. For example, in 1989,
Hasbro introduced the Playskool Travel-Lite portable crib. This lightweight crib folded
up for easy transportation. The crib met all existing safety standards, because there were
no government or industry test standards at the time for portable cribs, which were a new kind of product. Quite likely, the new crib was not adequately tested. Eventually
three children were killed, in separate incidents, when the top rails collapsed and
strangled them.
Today we have a dazzling variety of new products based on new nanotechnologies and
biotechnologies, whose long-term consequences are difficult to foresee, and which are
thus largely unregulated. Again, scientists and engineers have a professional
responsibility to test these new products sufficiently, to monitor these products as they
are used, and to mitigate their harmful effects. In the absence of technical standards,
they must use their professional judgment.
This broader notion of professional responsibility can even motivate exemplary
behavior: an engineer might go beyond a job description or a project assignment to make products safer, more reliable, less expensive, or more efficient, and thereby
improve the lives of users and the public. In the 1930s, Val Roper and Daniel Wright
were engineers at General Electric who worked on headlights for automobiles.
Persisting on their own for several years, they found new materials to make the
headlights brighter so that motorists could drive more safely at night, and their designs
became the industry standards. [Fleddermann, pp. 112–114]
Conclusions
Let me summarize. As their primary professional responsibilities, scientists strive to
ensure the integrity of the research record, and engineers strive to ensure the safety of
the public. Too often, however, scientists and engineers receive instruction in
professional responsibility, in a few brief sessions, that emphasizes only compliance
with regulations, standards of conduct, and codes of ethics: minimal responsibilities
that encompass only the conscientious completion of assigned tasks, and the competent
execution of technical work. From this kind of ethics instruction, scientists and
engineers are likely to adopt a narrow conception of professional responsibility similar
to Tom Lehrer's fanciful Wernher von Braun:
Once the rockets are up, who cares where they come down?
That's not my department, says Wernher von Braun.
In his inaugural address in January 2009, President Barack Obama said, "What is required of us now is a new era of responsibility: a recognition, on the part of every American, that we have duties to ourselves, our nation, and the world." What if, indeed,
scientists and engineers construed their own professional responsibilities to encompass a
broad set of such duties, including stewardship for our society and our planet? As a
group of engineering students told Lee Shulman, president of the Carnegie Foundation
for the Advancement of Teaching,
An engineer is someone who uses math and the sciences to mess with the world, by designing and making things that people will buy and use; and once you mess with the world, you are responsible for the mess you've made.
May we all feel so responsible for the messes that we have made.
References
C. Fleddermann (1999). Engineering ethics. Upper Saddle River, N.J.: Prentice Hall.
S. Florman (1987). The civilized engineer. New York: St. Martin's Press.
G. Hashemian and M. C. Loui (submitted). Can instruction in engineering ethics change students' feelings about professional responsibility?
T. Lehrer (1981). Too many songs by Tom Lehrer. New York: Pantheon Books.
N. G. Leveson and C. S. Turner (1993). An investigation of the Therac-25 accidents. Computer, 26 (7), 18–41.
M. C. Loui (2005). Ethics and the development of professional identities of engineering students. Journal of Engineering Education, 94 (4), 383–390.
M. C. Loui and K. W. Miller (2008). Ethics and professional responsibility in computing. In Wiley Encyclopedia of Computer Science and Engineering, ed. B. W. Wah. New York: Wiley. dx.doi.org/10.1002/9780470050118.ecse909.
S. K. Pfatteicher (2001). Teaching vs. preaching: EC2000 and the engineering ethics dilemma. Journal of Engineering Education, 90 (1), 137–142.
S. Sheppard, K. Macatangay, A. Colby, and W. Sullivan (2008). Educating engineers: designing for the future of the field. New York: Wiley/Jossey-Bass.
