
CHAPTER 1

Introduction: Why Computer Ethics?

Computer Ethics, Third Edition, by Deborah G. Johnson. Published by Prentice Hall. Copyright © 2001 by Prentice-Hall, Inc. ISBN: 0-558-13856-X

SCENARIO 1.1 Should I copy proprietary software?


Since John graduated from college five years ago, he has been investing small
amounts of money in the stock market. A year ago, he discovered an extremely
useful software package that helps individual investors choose penny stocks.
(Penny stocks are stocks of small companies that sell for a few dollars or less
per share.) The software requires users to input information about their atti-
tudes toward risk as well as the names of penny stock companies in which they
are interested. The software provides a wide range of information and allows
the user to analyze stocks in many different ways. It also recommends strate-
gies given the user’s attitudes toward risk, age, size of investment, and so on.
John has several friends who invest in stocks, and one of his friends, Mary,
has been getting more and more interested in penny stocks. At a party, they
begin talking about investing in penny stocks and John tells Mary about the
software package he uses. Mary asks if she can borrow the package to see what
it is like.
John gives his disks and documentation to Mary. Mary finds the software
extremely useful. She copies the software and documentation onto her com-
puter. Then she gives the package back to John.
John and Mary were both vaguely aware that software is proprietary, but
neither read the licensing agreement very carefully. Did John do anything
wrong? If so, what? Why is it wrong? Did Mary do anything wrong? If so, what?
Why is it wrong?

SCENARIO 1.2 Should my company make use of data mining technology?

Inga has worked hard all her life. Ten years ago, she started her own business
selling computer software and hardware. In any given year now, 100,000 to
200,000 customers purchase things in her store. These purchases range from
a $5 item to a $10,000 item. As part of doing business, the company gathers
information on customers. Sometimes information is gathered intentionally
(e.g., when they distribute customer surveys to evaluate the service they are
providing and to find out customer preferences). Other times, they gather
information embedded in the purchase transaction (e.g., when they record
name, address, what is purchased, date purchased).
Recently Inga has been reading about data mining tools. Data mining
tools allow the user to input large quantities of information about individuals
and then search for correlations and patterns. Inga realizes that she might be
able to derive useful information about her customers. The records contain
credit card numbers, checking account numbers, driver’s license numbers, and
so on, but to make use of this information, it would have to be “mined.” The
zip code alone is extremely valuable in that data mining tools might reveal a
correlation between purchasing habits and zip code, and would allow Inga to
target advertising more effectively. The correlation between zip codes and pur-
chasing pattern might then be correlated with public records on voting pat-
terns to identify what political sympathies customers in various zip codes have
and to see how political affiliation is correlated with size of purchase. This
could also be useful in targeting advertising.
Inga is conflicted about using data mining tools. On the one hand, her
customers have given information in order to make a purchase and data min-
ing would be using this information in a way that the customers had not antic-
ipated. On the other hand, for the most part, the information would not
identify individuals but rather groups of individuals with financial or attitudinal patterns.
Should Inga use data mining tools?
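The kind of pattern-finding Inga is contemplating can be sketched in a few lines of Python. This is a hypothetical illustration only: the records, zip codes, and amounts are invented, and real data mining tools search for far richer correlations than a simple average.

```python
# Hypothetical sketch of the zip-code analysis Inga is considering.
# All records below are invented for illustration.
from collections import defaultdict

purchases = [
    {"zip": "48109", "amount": 250.0},
    {"zip": "48109", "amount": 40.0},
    {"zip": "10001", "amount": 9500.0},
    {"zip": "10001", "amount": 4200.0},
]

def average_purchase_by_zip(records):
    """Group purchase amounts by zip code and return the mean per zip."""
    totals = defaultdict(lambda: [0.0, 0])  # zip -> [sum, count]
    for r in records:
        totals[r["zip"]][0] += r["amount"]
        totals[r["zip"]][1] += 1
    return {z: s / n for z, (s, n) in totals.items()}

print(average_purchase_by_zip(purchases))
# {'48109': 145.0, '10001': 6850.0}
```

Even this trivial aggregation shows how purchasing patterns attach to neighborhoods rather than named individuals, which is exactly the feature that makes the practice both commercially attractive and ethically unsettling to Inga.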

SCENARIO 1.3 Freedom of expression.


In December 1994, Jake Baker, a sophomore at the University of Michigan,
posted three sexual fantasies on an Internet newsgroup “alt.sex.stories.” The
newsgroup was an electronic bulletin board whose contents were publicly avail-
able through the Internet. In one of these stories entitled “Pamela’s Ordeal,”
Baker gave his fictional victim the name of a real student in one of his classes.
The story describes graphically the torture, rape, and murder of Pamela, and
ends with the woman being doused in gasoline and set afire while tied to a
chair. In addition to publishing the fantasies on the newsgroup, Baker also ex-
changed e-mails with another man from Ontario, Arthur Gonda, discussing
the sexual acts. In one of these e-mails, Baker said that "[j]ust thinking about
it anymore doesn’t do the trick . . . I need to do it.” It should be noted that
Gonda’s true identity and whereabouts are unknown. The e-mails were private,
and not available in any publicly accessible portion of the Internet.
A University of Michigan alumnus in Moscow spotted Baker's stories while
browsing the newsgroup and alerted university officials. The campus police
and the Federal Bureau of Investigation were then brought in to investigate the
case. On February 9, 1995, Baker was arrested and was held in custody for 29
days. A month later, he was charged in a superseding indictment with five
counts of transmitting interstate communication of a threat to injure another.
The story on which the initial complaint was partially based, however, was not
mentioned in the superseding indictment, which referred only to the e-mail
exchanges between Gonda and Baker. The charges were dropped in June 1995,
on grounds that Baker expressed no true threat of carrying out the acts.
Did Jake Baker do anything wrong? Should the police have arrested him?
This case was written by Marc Quek Pang based on the following sources: United States v. Baker,
Criminal No. 95-80106, United States District Court for the Eastern District of Michigan, Southern
Division, 890 F.Supp. 1375; U.S. Dist. (1995) (LEXIS 8977); 23 Media L. Rep. 2025 (June 21,
1995) decided (June 21, 1995) filed; Philip Elmer-DeWitt, "Snuff Porn on the Net," Time Magazine,
February 20, 1995, p. 69; Peter H. Lewis, "An Internet Author of Sexually Violent Fiction Faces
Charges," New York Times (February 11, 1995), p. 7; other sources include local Michigan newspaper
articles.

SCENARIO 1.4 What is my professional responsibility?


Milo Stein supervises new projects for a large software development firm. One
of the teams he manages has been working on a new computer game for chil-
dren in the 8 to 14 age group. It is an educational game that involves working
through a maze of challenges and solving inferential reasoning problems. Play-
ers of the game get to choose which character they want to be and other char-
acters appear throughout the game. The characters are primarily exaggerated
macho guys and sexy women.
While Milo is attending a conference of computer professionals, he de-
cides to attend a session focused on gender and minorities in computing. He
listens to several papers focused on various aspects of this matter. One paper
discusses the bias in software, especially in the design of children’s software.
Apparently, when software designers are asked to design games for children,
they design games for boys (Huff and Cooper, 1987). The games, then, are not
comfortable for female users. Milo also hears another paper about the small
number of women and minorities who are majoring in computing in college
despite there being a national crisis due to the shortage of technically trained
people. The session ends with a panel discussion about what computer profes-
sionals can do to make computing more attractive to women and minorities.
When Milo returns to work after the conference, the leader of the team
working on the new computer game reports that the game is ready for final
testing before being released for marketing. Milo has never thought much
about the composition of the team before, but he now realizes that the team
consists only of men. Milo wonders if he should ask the team to rethink the
game and have it reviewed for gender and/or racial bias. What should he do?
Even if the game sells well, should a different message be sent with the game?
What is his responsibility in this regard?

These scenarios pose a variety of types of ethical questions. The first raises a
question for personal decision making and is inextricably tied to the law. Is it
morally permissible for an individual to break the law by making a copy of proprietary
software? If so, when is law breaking justified? When it's a bad law?
When the law is easy to break? The second scenario also raises a question for
individual decision making, but here the decision has to do with establishing a
policy for a company. Inga has to decide what her company should do and this
means taking into account what is good for the company—its bottom line, its
employees, as well as what her responsibilities are to her customers. The third
scenario poses an issue that could be addressed either as an individual matter
(should I censor myself when I do things on the Internet?) or as a public policy
matter (should there be free expression online?). Finally, the fourth scenario
raises a question of professional ethics. What Milo should do in the situation
described is not just a matter of his individual values but has much to do with
the profession of computing. That is, computer professionals have a collective
responsibility to ensure that computing serves humanity well. Moreover, Milo’s
behavior will impact the reputation of computer professionals as well as his
own and his employer’s.
Taken together, the four scenarios illustrate the diverse character of ethi-
cal issues surrounding computer and information technology. Among other
things, the ethical issues involve property rights, privacy, free speech, and pro-
fessional ethics. The development and continuing evolution of computer and
information technology have led to an endless sequence of ethical questions: Is
personal privacy being eroded by the use of computer and information tech-
nology? Should computers be used to do anything they can? What aspects of
information technology should be owned? Who is morally responsible for er-
rors in software, especially those that have catastrophic effects? Will encryp-
tion technology make it impossible to detect criminal behavior? Will virtual
reality technology lead to a populace addicted to fantasy worlds? These ques-
tions ultimately lead to deeper moral questions about what is good for human
beings, what makes an action right and wrong, what is a just distribution of
benefits and burdens, and so on.
While the scenarios at the beginning of the chapter illustrate the diversity
of ethical issues surrounding computer and information technology, it should
be noted that there is still a puzzle about why computer and information tech-
nology give rise to ethical questions. What is it about computer and informa-
tion technology, and not bicycles, toasters, and light bulbs, that creates ethical
issues and uncertainty about right and wrong, good and bad? This question
and a set of related questions are contentious among computer ethicists. The
controversy has focused especially on whether the ethical issues surrounding
computer and information technology are unique. Are the issues so different
from other ethical issues that they require a "new ethics," or are the ethical issues
associated with computer and information technology simply old ethical
issues in a new guise?

The uniqueness issue is intertwined with several other important and per-
sistent questions. Why or how does computer and information technology give
rise to ethical issues? Is a new field of study and/or separate academic courses
needed to address the ethical issues surrounding computer and information
technology? What does one “do” when one does computer ethics? That is, is
there a special methodology required? The uniqueness issue seems to be at the
core of all of these questions. Identification of something unique about computer
technology holds the promise of explaining why computer technology,
unlike other technologies, gives rise to ethical issues and why a special field of
study and/or a special methodology may be needed. Of course, if computer
and information technology is not unique, these issues will have to be resolved
in some other way. I begin with the question of why computer and information
technology gives rise to ethical issues and proceed from there to a more de-
tailed account of the uniqueness issue.

NEW POSSIBILITIES AND A VACUUM OF POLICIES

Computer and information technology is not the first (nor will it be the last)
technology to raise moral concerns. Think of nuclear power and the atom
bomb. New technologies seem to pose ethical issues when they create new
possibilities for human action, both individual action and collective or insti-
tutional action. Should I donate my organs for transplantation? Should em-
ployers be allowed to use urine or blood tests to determine if employees are
using drugs? Should we build intelligent highways that record automobile li-
cense plates and document when cars enter and leave the highway and how
fast they go?
As these questions suggest, the new possibilities created by technology are
not always good. Often they have a mixed value. New technologies bring benefits
as well as new problems, as in the case of nuclear power and nuclear waste,
automobiles and air pollution, aerosol cans and ozone depletion.
Because new technological possibilities are not always good or purely
good, they need to be evaluated—morally as well as in other ways (e.g., eco-
nomically, environmentally). Evaluation can and should take place at each
stage of a technology’s development, and can and should result in shaping the
technology so that its potential for good is better realized and negative effects
are eliminated or minimized. Technical possibilities are sometimes rejected
after evaluation, as in the case of biological weapons, nuclear power (no new
nuclear power plant has been built in the United States for several decades),
and various chemicals that deplete the amount of ozone in the atmosphere or
cause other environmental problems.
So it is with computer and information technology. Enormous possibilities
for individual and institutional behavior have been created. We could not have
reached the moon without computers, nor could we have the kind of global
communication systems we now have. Information technology used in medicine
has enormously enhanced our ability to detect, diagnose, and treat illness. Information
technology has created thriving new industries and facilitated a growing
global economy. Nevertheless, computer and information technology creates
potentially detrimental as well as beneficial possibilities. We now have a greater
capacity to track and monitor individuals without their knowledge, to develop
more heinous weapon systems, and to eliminate the need for human contact in
many activities. The possibilities created by computer and information technol-
ogy, like other technologies, need to be evaluated—morally and in other ways.
Extending the idea that computer and information technology creates new
possibilities, James Moor (1985) has suggested that we think of the ethical
questions surrounding computer and information technology as policy vacu-
ums. Computer and information technology creates innumerable opportuni-
ties. This means that we are confronted with choices about whether and how to
pursue these opportunities, and we find a vacuum of policies on how to make
these choices. The central task of computer ethics, Moor argues, is to deter-
mine what we should do and what our policies should be. This includes consid-
eration of both personal and social policies.
The sense in which there is a vacuum of policies surrounding computer and
information technology can be illustrated, first, with examples from the early
days of the technology. Consider the lack of rules regarding access to electronically
stored data when computers were first developed. Initially there were no
formal policies or laws prohibiting access to information stored on a mainframe
computer. From our perspective today, it may seem obvious that computer files
should be treated as private; however, since most early computing took place in
business, government, and educational institutions, the privacy of files was not
so obvious. That is, most paper files in these institutions were not considered the
personal property of individual employees. Or, consider the lack of policies
about the ownership of software when the first software was being written. It
wasn’t clear whether software should be considered private property at all. It was
understood simply to be instructions for a machine.
Since the early days, computer technology has been far from stagnant, and
with each new innovation or application, new policy vacuums have been created.
Is it ethical for a company with a Web site to place a cookie on the hard
drive of those who visit its site?1 Is data mining morally acceptable? Are In-
ternet domain names being distributed in a fair way? Should surgery be per-
formed remotely with medical imaging technology? Who should be liable for
inaccurate or slanderous information that appears on electronic bulletin
boards? Should computer graphical recreations of incidents, such as automobile
accidents, be allowed to be used in courtrooms? Is it right for an individual
to electronically reproduce and alter an artistic image that was originally created
by someone else? New innovations, and the ethical questions surrounding
them, continue to arise at an awe-inspiring pace. Policy vacuums continue to
arise and are not always easy to fill.

1 A cookie is a mechanism that allows a Web site to record your comings and goings, usually
without your knowledge or consent. See www.epic.org/privacy/internet/cookies/ and
www.cookiecentral.com.

FILLING THE VACUUM, CLARIFYING CONCEPTUAL MUDDLES

The fact that computer and information technology creates policy vacuums may
make the task of computer ethics seem, at first glance, easy. All we have to do is
develop and promulgate policies—fill the vacuums. If only it were so simple!
When it comes to figuring out what the policies should be, we find ourselves
confronted with complex issues. We find conceptual muddles that make
it difficult to figure out which way to go. And, as we begin to sort out the conceptual
muddles, often we find a moral landscape that is fluid and sometimes
politically controversial. Consider the case of free speech and the Internet. On
the one hand, it takes some conceptual work to understand what the Internet is
and it takes even more conceptual work to figure out whether it is an appropriate
domain for free or controlled expression. Even if information on the Internet
is recognized as a form of speech (expression), we are thrust into a
complex legal and political environment in which speech is protected by the
First Amendment but a variety of exceptions are made depending on content,
when and where the words are expressed, and so on. So, figuring out what
norms or laws apply or should apply is not a simple matter. Can we distinguish
different types of expression and protect them in different ways? Can we pro-
tect children while not diminishing adult expression and access?

The Traditionalist Account


How policy vacuums are filled is, in part at least, a matter of methodology.
How can or should computer-ethical issues be resolved? On one account—call
this the traditionalist account—all that is necessary is to take traditional moral
norms and the principles on which they are based, and apply them to the new sit-
uations created by computer and information technology. For example, when it
came to filling the policy vacuum with regard to ownership of computer software,
lawyers and judges extended existent property law—copyright, patent, and
trade secrecy—to the new "thing," computer software (more on this in Chapter
6). To use a more current example, when it comes to online communication, the
traditionalist account suggests that we should look at the conventions that are al-
ready followed in face-to-face, telephone, and written communication, and
“map” these conventions onto computer-mediated communication. Certain
words and questions are considered impolite; certain kinds of conversations are
considered confidential; and so on. According to the traditionalist account, we
should take these conventions from precomputer communication and create
similar, parallel conventions regarding computer-mediated communication.
The traditionalist account is important both as a descriptive and as a normative
account. That is, it both describes how policy vacuums are often filled
and recommends how they ought to be filled. Descriptively, the account
captures what people often do when they are first introduced to computer
and information technology. For example, when individuals first begin using
e-mail, they probably imagine themselves writing letters or talking on the phone.
Hence, they identify themselves when they make contact, without considering
the possibilities of being anonymous or pseudonymous. The new communication
device is treated as prior communication devices were treated, with norms
being carried over from the old to the new. When you hear of someone accessing
your computer files, you may think of the parallel with someone breaking into
your house or office, and it seems clear that they have violated your property
rights. So, the traditionalist account captures the idea that when we develop
policies with regard to computer and information technology, we tend to draw
on familiar social and moral norms, extending them to fit the new situation.
The traditionalist account is also normative in that it recommends how we
should proceed in filling policy vacuums. It recommends that we make use of
past experience. For example, we already know a good deal about property and
the sorts of situations that are likely to arise when property claims come into
conflict. Similarly, when it comes to communication, we already know that words
can be harmful and can offend, and we know that individuals have an interest in
some conversations being confidential. It makes good sense to draw on these ex-
periences when it comes to a new situation, whether it is one created by com-
puter and information technology or something else. So, the normative thrust of
the traditionalist account seems important and valuable. We should take norms
and principles from precomputer situations and see how they extend to the cir-
cumstances of a computerized environment.
Nevertheless, the traditionalist account has two serious problems. As a descriptive
account of how the ethical issues associated with computer and information
technology are addressed and resolved, it oversimplifies. And, as a
normative account of how we should resolve these ethical issues, it has serious
dangers. The traditionalist account oversimplifies the task of computer ethics
insofar as it suggests that extending old norms to new situations is a somewhat
mechanical or routine process. This hides the fact that the process is fluid and
synthetic. When it comes to resolving the ethical issues surrounding computer
and information technology, often the technology is not a fixed and determinate
entity. That is, the technology is itself still "in the making," so to speak. So in trying
to resolve an ethical issue arising from the use of computer and information
technology, the first step is to clear up the conceptual muddles and uncertainties
that are found. These conceptual muddles have to do with understanding
what the technology is or should be and what sort of situations it creates.

I mentioned earlier the uncertainty surrounding how to think about the
Internet as a forum for communication. This is a good example of the fluidity
and uncertainty of technology. If we don’t know what the Internet is exactly, we
can’t know which rules or principles should be applied.
Another example is computer software. A complex body of law regarding
ownership of new inventions existed long before the invention of computers, in-
cluding patent law, copyright, and trade secrecy. Applying this law to computer
software, however, was enormously difficult because nothing with all the charac-
teristics of software programs had existed before; so it was unclear how software
programs should be conceptualized or categorized. Is a program the kind of
thing that should be treated as property? Is a program the expression of an
idea? If so, is it a form of intellectual property for which copyright law is appro-
priate? Or, is it (should it be seen as) a process for changing the internal struc-
ture of a computer? Or perhaps a program should be seen as a series of “mental
steps,” capable, in principle, of being thought through by a human, and not,
thereby, appropriate for ownership. Before existent law or norms could be applied,
a concept had to be fixed.
This is not to say that traditional legal or moral norms were irrelevant to
the policy vacuum surrounding computer programs. On the contrary, there
was a need to clear up the conceptual muddle so that the new entity could be
seen in relation to familiar legal and moral norms. It is important to keep in
mind that deciding whether computer programs are expressions of ideas or
mental steps or design specifications for machines is not an issue with a predetermined
right answer. Lawyers, judges, and policy makers had to decide what
computer programs should be treated as, and in doing this, they, in a sense,
made computer software what it is. Deciding that copyright law applied to software
defined what software is. Later, deciding that patent law applied to certain
types of software also defined it. In Chapter 6, we will see that various
aspects of new software creations persist in challenging traditional property
norms. Filling policy vacuums is not a simple process of applying known laws
and principles to entities that can be subsumed under them. A good deal of negotiation
is required to get the technology and the law or principle to fit.
Our understanding of the Internet also illustrates the fluid rather than
mechanical way that traditional norms and laws are extended to computer and
information technology. Writers have had a good deal of fun trying to concep-
tualize the Internet. Some have conceptualized it as a network of highways, the
superhighways of the future. Others have thought of it as a huge shopping mall
with an almost infinite number of possible stores, and you navigate your way
through the mall, perhaps discovering some places you do not want to go. Yet
others have likened the Internet to Disneyland, suggesting that what you find
on the Internet should always be treated as a fantasy world. These metaphorical
renderings of the Internet are attempts to conceptualize the Internet in a
way that will help us fill policy vacuums. Another good illustration of this fluid
conceptual activity is the process of trying to understand the act of placing a
cookie on the computer of a visitor to a Web site. Is it intrusive surveillance or
business as usual? Are cookies comparable to a store asking for your zip code
when you buy something, so that it can do marketing analysis and determine in
what neighborhoods to advertise? Or is it more like installing a camera in a
store to watch and see every customer that enters? Or is going from one Web
site to the next more like traveling on a highway, in which case cookies seem
more like a surveillance technology? Or, since you may be sitting at home when
you navigate on the Web, are cookies comparable to cameras in your home
watching what you are reading? Needless to say, how we understand the activity
makes all the difference in our evaluation of it and in determining what poli-
cies seem appropriate. Deciding how to conceptualize the activity and decid-
ing which norms apply go hand in hand.
The traditionalist account is correct insofar as it suggests that in resolving
the ethical issues surrounding computer and information technology, we often
try to extend norms and principles from familiar situations to new situations.
The account goes wrong, however, when it suggests that this process is simple,
routine, or mechanical.
A second problem with the traditionalist account arises from its recommen-
dation that we resolve the ethical issues involving computer and information
technology by extending norms and laws from situations in which there is no
technology or old technology. As already suggested, this is a worthy recommen-
dation insofar as it recommends drawing on experience. The norms that are fol-
lowed in many prevailing practices have survived the test of time. They often
embody important social values such as respect for persons, fairness, and so on.
Nevertheless, the recommendation has a danger that should be kept in mind.
New technologies, as already mentioned, create new opportunities. If we simply
extend old norms to new situations, we run the risk of missing the new opportu-
nities. In other words, if we treat new situations as if they are comparable to
known and familiar situations, we may fail to take advantage of the novel fea-
tures of the new technology. To f ill policy vacuums created by computer and in-
formation technology with traditional norms may prevent the creation of new
ways of doing things.
Since we do not live in a perfect world, the opportunities created by com-
puter and information technology are opportunities to change the way we do
things for the better. Computer and information technology creates opportuni-
ties for new kinds of practices—new kinds of social arrangements, relation-
ships, and institutions. Extending traditional norms and principles to the new
possibilities runs the risk of reproducing undesirable practices or not improv-
ing on acceptable practices.
When computer programs were first being developed, many in the computing
community saw the potential for software to be readily available to everyone,
since programs could be copied without loss to the original developer. They also
saw the potential for all kinds of information to be distributed cheaply and
easily in the electronic medium. This was recognized to be an invention on the
order of the printing press in importance, but on an even grander scale. The
debate about property rights and how to interpret and apply them to software is,
in a sense, a debate about taking advantage of the special features of software
to create a system of distribution that has never been possible before. Most
recently, this debate has been taking place around the distribution of music on
the Internet.

ISBN: 0-558-13856-X
Computer Ethics, Third Edition, by Deborah G. Johnson. Published by Prentice Hall. Copyright © 2001 by Prentice-Hall, Inc.
To be fair to the traditionalist account, it need not be committed to adopt-
ing norms and policies that are identical to those that prevailed before com-
puter and information technology. A traditionalist could take the position that
traditional norms and principles must be modified when they are extended to
new situations. In modifying their position in this way, the traditionalist moves
somewhat away from recommending simply that we extend the old to the new.
In this weaker version, there is the suggestion of something new being created
in the process of extending old norms and principles.
The traditionalist account is a good starting place for understanding how
the ethical issues surrounding computer and information technology are and
should be resolved and how policy vacuums are and should be filled, but it has
serious limitations. As a descriptive account, it does not capture all that is in-
volved. Filling policy vacuums is not only a matter of mechanically applying
traditional norms and principles. Conceptual muddles have to be cleared up,
often through a synthetic process in which normative decisions are invisibly made.
Moreover, as a normative account, the traditionalist position runs the risk of
not taking advantage of the new features of, and new opportunities created by,
computer and information technology. Hence, we need to move beyond the
traditionalist account.

COMPUTERS USED IN A SOCIAL CONTEXT

Clearing up the conceptual muddles and filling policy vacuums involves under-
standing the social context in which the technology is embedded. Computer and
information technology is developed and used in a social context rich with
moral, cultural, and political ideas. The technology is used in businesses, homes,
criminal justice systems, educational institutions, medicine, science, govern-
ment, and so on. In each one of these environments, there are human purposes
and interests, institutional goals, social relationships, traditions, social conven-
tions, regulations, and so on. All of these have an influence on how a new tech-
nology is understood and how policy vacuums are filled.
For example, by some measure of efficiency, it might be best for the United
States, as a whole, to create one master database of information on individual
citizens, with private and public agencies having access to appropriate seg-
ments of the database. There are, however, a variety of reasons why such an
arrangement has not yet come about and is not likely to come about in the near
future. These reasons include historically shaped social fears of powerful cen-
tralized government, beliefs about the inefficiency of centralized control, an
already established information industry, a political environment favoring pri-
vatization, and so on.
Social context shapes the very character and direction of technological de-
velopment. This is true at the macro level when we think about the development
of computer and information technology over time. It is also true at the micro
level when we focus on how specific applications are adopted and used at partic-
ular sites such as small businesses, college campuses, or government agencies.
Imagine, for example, the process of automating criminal justice records in a
local police station. The specifications of the system—who will have access to
what, the kind of information that is stored and processed, the type of security,
and so on—are likely to be determined by a wide variety of factors, including the
unit’s understanding of its mission and priorities, the existence of laws specify-
ing the legal rights of citizens who are arrested and accused, the agency’s
budget, and the relationships the unit has with other criminal justice agencies.
One of the reasons the study of ethical issues surrounding computer and in-
formation technology is so fascinating is that in order to understand these is-
sues, one has to understand the environments in which it is being used. In this
respect, the study of computer ethics turns out to be the study of human beings
and society—our goals and values, our norms of behavior, the way we organize
ourselves and assign rights and responsibilities. To understand the impact of
computer and information technology in education or government, for example,
we have to learn a good deal about what goes on and is intended to go on in
these sectors. To f igure out what the rules governing electronic communication
should be in a particular environment, we have to explore the role of communi-
cation in whatever sector we are addressing. For example, because universities
are educational institutions, they tend to promote free speech much more than
would be tolerated in a business environment. And even in a university environ-
ment, attitudes toward free speech will vary from country to country.
The study of computer ethics may be seen as a window through which we
view a society—its activities and ideals, the social, political, and economic forces
at work. Perhaps the most important thing about computer and information
technology is its malleability. It can be used to do almost anything that can be
thought of in terms of a series of logical steps or operations, with input and out-
put. Because of this malleability, computer and information technology can be
used in a wide range of activities touching every aspect of human endeavor.
Computer and information technology can be used as much to keep things
the same as to cause change. Indeed, as the traditionalist account suggests, when
this technology enters a new environment, we tend, initially at least, to map the
way we had been doing things onto the new computer system. The process of
computerization often involves looking at the way people have been doing a par-
ISBN: 0-558-13856-X

ticular task or set of tasks—bookkeeping, educating, manufacturing, communi-


cating, and then computerizing those activities. Nevertheless, over time, these

