Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics


Page 1: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Ethics, Values & Issues in Cybertechnology >>> CS222.01

Concepts, methodologies and Codes of Cyberethics

Page 2: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

What Is Cyberethics?

Cyberethics is the study of moral, legal, and social issues involving cybertechnology.

It examines the impact that cybertechnology has on our social, legal, and moral systems.

It also evaluates the social policies and laws that have been framed in response to issues generated by the development and use of cybertechnology.

Hence, there is a reciprocal relationship here.

Page 3: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

What Is Cybertechnology?

Cybertechnology refers to a wide range of computing and communications devices – from standalone computers, to "connected" or networked computing and communications technologies, to the Internet itself.

Cybertechnologies include: hand-held devices (such as iPhones), personal computers (desktops and laptops), mainframe computers, and so forth.

Page 4: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Cybertechnology (Continued)

Networked devices can be connected directly to the Internet.

They also can be connected to other devices through one or more privately owned computer networks.

Privately owned networks include both Local Area Networks (LANs) and Wide Area Networks (WANs).

Page 5: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Why the term cyberethics?

Cyberethics is a more accurate label than computer ethics, which might suggest the study of ethical issues limited to computing machines, or to computing professionals.

It is also more accurate than Internet ethics, which covers only ethical issues involving computer networks and the Internet.

Page 6: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Table 1-1: Summary of Four Phases of Cyberethics

Phase 1 (1950s-1960s). Technological features: stand-alone machines (large mainframe computers). Associated issues: artificial intelligence (AI), database privacy ("Big Brother").

Phase 2 (1970s-1980s). Technological features: minicomputers and PCs interconnected via privately owned networks. Associated issues: issues from Phase 1 plus concerns involving intellectual property and software piracy, computer crime, and privacy and the exchange of records.

Phase 3 (1990s-present). Technological features: Internet and World Wide Web. Associated issues: issues from Phases 1 and 2 plus concerns about free speech, anonymity, legal jurisdiction, virtual communities, etc.

Phase 4 (present to near future). Technological features: convergence of information and communication technologies with nanotechnology research and genetic and genomic research, etc. Associated issues: issues from Phases 1-3 plus concerns about artificial electronic agents ("bots") with decision-making capabilities, bionic chip implants, nanocomputing research, etc.

Page 7: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Are Cyberethics issues unique?


Amy Boyer, 20, from NH, was shot and killed outside her car in 1999.

The killer, who had seen her once in middle school and became infatuated, got her SS#, license plate, and place of employment from the Internet. He ambushed her as she left work.

An early instance of cyberstalking, Boyer’s case led to new criminal laws.

Page 8: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Uniqueness Issue (cont.)

Is there anything new or unique about Boyer’s case from an ethical point of view?

Boyer was stalked in ways that were not possible before cybertechnology.

But do new ethical issues arise?

Page 9: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Uniqueness Issue (Continued)

Two points of view:

Traditionalists argue that nothing is new – crime is crime, and murder is murder.

Uniqueness proponents argue that cybertechnology has introduced (at least some) new and unique ethical issues that could not have existed before computers.

Page 10: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Uniqueness Issue (Continued)

Both sides seem correct on some claims, and both seem to be wrong on others.

Traditionalists underestimate the role that issues of scale and scope play because of the impact of computer technology.

Cyberstalkers can stalk multiple victims simultaneously (scale) and globally (because of the scope or reach of the Internet).

They also can operate without ever having to leave the comfort of their homes.

Page 11: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Uniqueness Issue (Continued)

Uniqueness proponents tend to overstate the effect that cybertechnology has on ethics per se.

Maner (1996) argues that computers are uniquely fast, uniquely malleable, etc.

There may indeed be some unique aspects of computer technology.

Page 12: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Uniqueness Issue (Continued)

But uniqueness proponents tend to confuse unique features of technology with unique ethical issues.

They use the following logical fallacy:

Premise 1: Cybertechnology has some unique technological features.
Premise 2: Cybertechnology generates ethical issues.
Conclusion: Therefore, the ethical issues generated by cybertechnology must be unique.
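To see why the conclusion does not follow, it may help to spell out the argument's form abstractly (the notation below is ours, not the text's): the premises attribute uniqueness to cybertechnology's features, while the conclusion attributes uniqueness to the issues it generates, which is a different subject entirely.

\[
\frac{\exists f\,[\mathrm{Feature}(f, c) \wedge \mathrm{Unique}(f)] \qquad \exists i\,[\mathrm{Issue}(i, c)]}{\therefore\; \exists i\,[\mathrm{Issue}(i, c) \wedge \mathrm{Unique}(i)]} \qquad \text{(invalid)}
\]

Here c stands for cybertechnology; nothing in the premises links the uniqueness of a feature f to any issue i, so the inference fails.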

Page 13: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Uniqueness Issue (Continued)

Traditionalists and uniqueness proponents are each partly correct.

Traditionalists correctly point out that no new ethical issues have been introduced by computers.

Uniqueness proponents are correct in that cybertechnology has complicated our analysis of traditional ethical issues.

Page 14: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Uniqueness Issue (Continued)

So we must distinguish between: (a) unique technological features, and (b) any (alleged) unique ethical issues.

Two scenarios from the text: (a) computer professionals designing and coding a controversial computer system, and (b) software piracy.

Page 15: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Alternative Strategy for Analyzing the Uniqueness Issue

James Moor (1985) argues that computer technology generates “new possibilities for human action” because computers are logically malleable.

Logical malleability, in turn, introduces policy vacuums.

Policy vacuums often arise because of conceptual muddles.

Page 16: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Case Illustration of a Policy Vacuum: Duplicating Software

In the early 1980s, there were no clear laws regarding the duplication of software programs, which was made easy because of personal computers.

A policy vacuum arose.

Before the policy vacuum could be filled, we had to clear up a conceptual muddle: what exactly is software?

Page 17: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Laws vs. Software Controlling Technology


Attempting to control technology through law and regulation has often been futile.

Correcting technology with other technology has been more effective.

Ex. Laws suppressing pornography have been difficult to enforce, but software that filters out pornography has been more successful.
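To make the contrast concrete, here is a deliberately minimal, hypothetical Python sketch (with placeholder terms rather than any real blocklist) of the kind of filtering software the slide alludes to; real filtering products are far more sophisticated.

# Minimal, hypothetical sketch of keyword-based content filtering.
# BLOCKED_TERMS is a placeholder set, not a real blocklist.
BLOCKED_TERMS = {"blockedword1", "blockedword2"}

def should_block(page_text: str) -> bool:
    """Return True if the page contains any blocked term (case-insensitive)."""
    words = {w.strip(".,;:!?").lower() for w in page_text.split()}
    return bool(words & BLOCKED_TERMS)

if __name__ == "__main__":
    sample = "A page that mentions blockedword1 in passing."
    print(should_block(sample))  # True -> a filtering proxy would refuse to display it

The point is not the code itself but the mode of regulation: the constraint is enforced automatically at the point of access rather than by after-the-fact legal sanctions.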

Page 18: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Cyberethics as a Branch of Applied Ethics

Applied ethics, unlike theoretical ethics, examines "practical" ethical issues.

It analyzes moral issues from the vantage-point of one or more ethical theories.

Ethicists working in fields of applied ethics are more interested in applying ethical theories to the analysis of specific moral problems than in debating the ethical theories themselves.

Page 19: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Cyberethics as a Branch of Applied Ethics (continued)

Three distinct perspectives of applied ethics (as applied to cyberethics):

Professional Ethics
Philosophical Ethics
Descriptive Ethics

Page 20: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Perspective # 1: Professional Ethics

According to this view, cyberethics is the field that identifies and analyzes issues of ethical responsibility for computer professionals.

Consider a computer professional's role in designing, developing, and maintaining computer hardware and software systems.

Suppose a programmer discovers that a software product she has been working on is about to be released for sale to the public, even though it is unreliable because it contains "buggy" software.

Should she "blow the whistle?"

Page 21: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Professional Ethics

Don Gotterbarn (1991) argued that all genuine computer ethics issues are professional ethics issues.

Computer ethics, for Gotterbarn, is like medical ethics and legal ethics, which are tied to issues involving specific professions.

He notes that computer ethics issues aren’t about technology – e.g., we don’t have automobile ethics, airplane ethics, etc.

Page 22: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Criticism of Professional Ethics Perspective

Gotterbarn’s model for computer ethics seems too narrow for cyberethics.

Cyberethics issues affect not only computer professionals; they affect everyone.

Before the widespread use of the Internet, Gotterbarn’s professional-ethics model may have been adequate.

Page 23: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Perspective # 2: Philosophical Ethics

From this perspective, cyberethics is a field of philosophical analysis and inquiry that goes beyond professional ethics (Gotterbarn).

Moor (1985), defines computer ethics as:

...the analysis of the nature and social impact of computer technology and the corresponding formulation and justification of policies for the ethical use of such technology. [Italics Added.]

Page 24: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Philosophical Ethics Perspective (continued)

Moor argues that automobile and airplane technologies did not affect our social policies and norms in the same kinds of fundamental ways that computer technology has.

Automobile and airplane technologies have revolutionized transportation, resulting in our ability to travel faster and farther than was possible in previous eras.

But they did not have the same impact on our legal and moral systems as cybertechnology.

Page 25: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Philosophical Ethics: Standard Model of Applied Ethics

Philip Brey (2000) describes the “standard methodology” used by philosophers in applied ethics research as having three stages:

1) Identify a particular controversial practice as a moral problem.

2) Describe and analyze the problem by clarifying concepts and examining the factual data associated with that problem.

3) Apply moral theories and principles to reach a position about the particular moral issue.

Page 26: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Perspective #3: Cyberethics as a Field of Descriptive Ethics

The professional and philosophical perspectives both illustrate normative inquiries into applied ethics issues.

Normative inquiries or studies are contrasted with descriptive studies.

Descriptive investigations report "what is the case"; normative inquiries evaluate situations from the vantage-point of the question "what ought to be the case."

Page 27: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Descriptive Ethics Perspective (continued)

Scenario: A community’s workforce and the introduction of a new technology.

Suppose a new technology displaces 8,000 workers in a community.

If we analyze the issues solely in terms of the number of jobs that were gained or lost in that community, our investigation is essentially descriptive in nature.

We are simply describing an impact that technology X has on Community Y.

Page 28: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Descriptive Ethics Perspective (continued)

Descriptive vs. Normative Claims

Consider three assertions:

(1) "Bill Gates served as the Chief Executive Officer of Microsoft Corporation for many years.”

(2) "Bill Gates should expand Microsoft’s product offerings.“

(3) “Bill Gates should not engage in business practices that are unfair to competitors.”

Claims (2) And (3) are normative, (1) is descriptive; (2) is normative but nonmoral, while (3) is both normative and moral.

Page 29: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Figure 1-1: Descriptive vs. Normative Claims

Descriptive claims report or describe what is the case; normative claims prescribe what ought to be the case.

Normative claims are either non-moral or moral:

Non-moral normative claims prescribe or evaluate in matters involving standards such as art and sports (e.g., criteria for a good painting or an outstanding athlete).

Moral normative claims prescribe or evaluate in matters having to do with fairness and obligation (e.g., criteria for just and unjust actions and policies).

Page 30: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Some Benefits of Using the Descriptive Approach

Huff & Finholt (1994) claim that when we understand the descriptive aspect of social effects of technology, the normative ethical issues become clearer.

The descriptive perspective prepares us for our subsequent analysis of ethical issues that affect our system of policies and laws.

Page 31: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Table 1-2: Summary of Applied Cyberethics Perspectives

Each perspective is listed with its associated disciplines and the issues examined:

Professional – Disciplines: Computer Science, Engineering, Library/Information Science. Issues examined: professional responsibility, system reliability/safety, codes of conduct.

Philosophical – Disciplines: Philosophy, Law. Issues examined: privacy and anonymity, intellectual property, free speech.

Descriptive – Disciplines: Sociology, Behavioral Sciences. Issues examined: impact of cybertechnology on governmental, financial, and educational institutions and on socio-demographic groups.

Page 32: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

General Cyberethics Theory and Methodology


Lessig, Moor, Finnis, and Brey

Page 33: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Larry Lessig's Framework

Four constraints regulate our behavior in real space: laws, norms, the market, and code/architecture.

Laws – rules imposed by the government which are enforced by ex post (after-the-fact) sanctions. The complicated IRS tax code is a set of laws that dictates how much we owe; if we break these laws we are subject to fines/penalties.

Page 34: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Larry Lessig's Framework (continued)

• Social norms – expressions of the community. Most communities have a well-defined sense of normalcy in norms, standards, and behavior.
  – Cigar smokers are not welcome at most functions.
• The market – prices set for goods, services, or labor.
  – $3.95 for coffee at the local coffee shop.
• Architecture – physical constraints on our behavior.
  – A room without windows imposes certain constraints because no one can see outside.

Page 35: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Real Life vs. Cyberspace

Cyberspace is subject to the same four constraints:

Laws – provide copyright and patent protection.

Markets – advertisers gravitate towards more popular web sites.

Architecture – software code such as programs and protocols constrains and controls our activities. Ex. web sites demanding usernames/passwords, and software deployed to filter spam and certain email (a sketch follows this list).

Norms – Internet etiquette and social customs. Flaming is a bad norm.
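As a concrete (and entirely hypothetical) illustration of the architectural constraint listed above, the short Python sketch below shows a password gate of the kind web sites impose; the user name, salt, and password are invented for the example.

import hashlib

# Hypothetical stored credential: a salted hash rather than a plain-text password
# (a design choice that itself embeds a value, namely protecting user privacy).
_USERS = {"alice": hashlib.sha256(b"example-salt:correct-horse").hexdigest()}

def can_enter(username: str, password: str) -> bool:
    """Code as architecture: the request either passes this check or goes nowhere."""
    digest = hashlib.sha256(f"example-salt:{password}".encode()).hexdigest()
    return _USERS.get(username) == digest

print(can_enter("alice", "correct-horse"))  # True: access granted
print(can_enter("mallory", "guess"))        # False: no sanction needed, simply no access

Unlike a law, the constraint here operates ex ante: the software does not punish a violation after the fact, it simply makes the unwanted action impossible.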

Page 36: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

James Moor

Moor's list of core human goods (considered "thin") includes:

Life

Happiness – pleasure and absence of pain

Autonomy – the goods that we need to complete our projects (ability, security, knowledge, freedom, opportunity, reason)

Page 37: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

John Finnis

• Finnis' version of human goods (considered "thick") includes:
  – Life
  – Knowledge
  – Play (and skillful work)
  – Aesthetic experience
  – Sociability
  – Religion
  – Practical reasonableness (includes autonomy)

• Participation in these goods allows us to achieve genuine human flourishing.

Page 38: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Both Moor and Finnis Believe

The ultimate good – the human flourishing of ourselves and others – should be our guidepost of value, serving as a basis for crafting laws, developing social institutions, and regulating the Internet.

Golden Rule (Matthew 7:12): "So whatever you wish that others would do to you, do also to them."

Immanuel Kant stated: "Act so that you treat humanity always as an end and never as a means."

Page 39: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Blocking Software

• Those who write programs or create laws should rely on ethics as their guide.

• Code writers need to write in such a way that preserves basic moral values such as autonomy and privacy.

• Many feel technology is just a tool and it is up to us whether this powerful tool is used for good or ill purposes.

Page 40: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Technological Realism

Two extreme views: that it is entirely up to us what happens, and that technology locks us into an inescapable cage.

Technological realism acknowledges that technology has reconfigured our political and social reality and that it does influence human behavior in particular ways.

Page 41: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Two Broad Ethical Frameworks

Teleological – the rightness or wrongness of an action depends on whether the goal or desired end is achieved (look at the consequences – it may be OK to lie). Sometimes called consequentialism.

Deontological – concerned with whether an action is right or wrong in itself; we act out of obligation or duty (e.g., the prohibition against harming the innocent).

Page 42: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Utilitarianism

Teleological; the most popular version of consequentialism.

The right course of action is to promote the most general good.

An action is good if it produces the greatest net benefits or the lowest net cost.
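A toy calculation (the numbers below are invented purely for illustration) shows how the utilitarian rule of "greatest net benefit" operates; it also connects back to the whistle-blowing scenario from the professional-ethics perspective.

# Toy utilitarian calculation with invented numbers: pick the action whose
# benefits minus costs (net benefit) is greatest.
actions = {
    "release the buggy software now": {"benefits": 100, "costs": 80},
    "delay release and fix the bugs": {"benefits": 90, "costs": 40},
}

def net_benefit(consequences: dict) -> int:
    return consequences["benefits"] - consequences["costs"]

best = max(actions, key=lambda name: net_benefit(actions[name]))
print(best)  # "delay release and fix the bugs" (net 50 vs. net 20)

The arithmetic is trivial; the hard ethical work lies in estimating the numbers and deciding whose welfare counts.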

Page 43: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Contractarianism

Deontological and rights-based: looks at moral issues from the viewpoint of the human rights that may be at stake.

Negative right – implies one is free from external interference in one's affairs (e.g., the state can't tap phones).

Positive right – implies a requirement that the holder of this right be provided with whatever one needs to pursue legitimate interests (e.g., rights to medical care and education).

Page 44: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Pluralism

• Deontological and duty-based.
• Actions only have moral worth when they are done for the sake of duty.
  – Ex. If everyone broke promises, there would be no such thing as a promise.
  – Consider this when looking at intellectual property.
  – Ask the question: "What if everybody did what you are doing?"
  – Respect for other human beings.

Page 45: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

7 Moral Duties

1. Keep promises and tell the truth (fidelity)
2. Right the wrongs you have inflicted (reparation)
3. Distribute goods justly (justice)
4. Improve the lot of others with respect to virtue, intelligence, and happiness (beneficence)
5. Improve oneself with respect to virtue, intelligence, and happiness (self-improvement)
6. Exhibit gratitude when appropriate (gratitude)
7. Avoid injury to others (noninjury)

Page 46: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

New Natural Law

Good should be done and evil avoided. But this principle is too general.

Page 47: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Flaws in Moral Theories

None of these moral theories is without flaws or contradictions.

The four frameworks often converge on the same solution, but sometimes they suggest different solutions.

One must decide which framework to follow and let it "trump" the others.

Page 48: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Principlism

Popularized by Beauchamp and Childress.

The four principles are autonomy, nonmaleficence, beneficence, and justice. Each is binding "at first glance" (prima facie); when they conflict, one principle may have to be given more weight than the others.

Page 49: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Autonomy

Is a necessary condition of moral responsibility

Individuals shape their destiny according to their notion of the best sort of life worth living

If deprived of their autonomy, someone is not treated with the respect they deserve.

Page 50: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Nonmaleficence

Above all else – do no harm

Page 51: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Beneficence

This is a positive duty: we should act in such a way that we advance the welfare of other people when we are able to do so.

Page 52: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Justice

Similar cases should be treated in similar ways

Fair treatment

Page 53: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Is Cybertechnology Neutral?

Technology seems neutral, at least initially.

Consider the cliché: "Guns don't kill people, people kill people."

Corlann Gee Bush (1997) argues that gun technology, like all technologies, is biased in certain directions.

She points out that certain features inherent in gun technology itself cause guns to be biased in a direction towards violence.

Page 54: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Is Technology Neutral (continued)?

Bush uses an analogy from physics to illustrate the bias inherent in technology.

An atom that either loses or gains electrons through the ionization process becomes charged or valenced in a certain direction.

Bush notes that all technologies, including guns, are similarly valenced in that they tend to "favor" certain directions rather than others.

Thus technology is biased and is not neutral.

Page 55: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

A "Disclosive" Method for Cyberethics

Brey (2001) believes that because of embedded biases in cybertechnology, the standard applied-ethics methodology is not adequate for identifying cyberethics issues.

We might fail to notice certain features embedded in the design of cybertechnology.

Using the standard model, we might also fail to recognize that certain practices involving cybertechnology can have moral implications.

Page 56: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Disclosive Method (Continued)

Brey notes that one weakness of the "standard method of applied ethics" is that it tends to focus on known moral controversies.

So that model fails to identify those practices involving cybertechnology which have moral implications but are not yet known.

Brey refers to these practices as having morally opaque (or morally non-transparent) features, which he contrasts with "morally transparent" features.

Page 57: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Figure 1-2: Embedded Technological Features Having Moral Implications

Known features (transparent): users are aware of these features but do not realize they have moral implications. Examples can include web forms and search-engine tools.

Unknown features (morally opaque): users are not even aware of the technological features that have moral implications. Examples can include data mining and Internet cookies.
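To illustrate why an Internet cookie counts as a morally opaque feature, here is a small Python sketch (standard library only; the cookie name and lifetime are invented) of the header a server would quietly attach to a response.

from http import cookies
import uuid

# Hypothetical tracking cookie: the visitor never asked for it and normally
# never sees this header unless they open their browser's developer tools.
jar = cookies.SimpleCookie()
jar["visitor_id"] = uuid.uuid4().hex                # persistent identifier
jar["visitor_id"]["max-age"] = 60 * 60 * 24 * 365   # keep it for a year
jar["visitor_id"]["path"] = "/"

# The line below is what travels back in the HTTP response headers.
print(jar.output())   # Set-Cookie: visitor_id=...; Max-Age=31536000; Path=/

Nothing on the visible page changes, which is exactly what makes such features hard for users to evaluate morally.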

Page 58: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

A Multi-Disciplinary & Multi-Level Method for Cyberethics

Brey’s “disclosive method” is multidisciplinary because it requires the collaboration of computer scientists, philosophers, and social scientists.

It also is multi-level because the method for conducting computer ethics research requires three levels of analysis: the disclosure level, the theoretical level, and the application level.

Page 59: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Table 1-3: Three Levels in Brey’s “Disclosive Model”

Each level is listed with the disciplines involved and its task/function:

Disclosure level – Computer Science, Social Science (optional) – disclose embedded features in computer technology that have moral import.

Theoretical level – Philosophy – test newly disclosed features against standard ethical theories.

Application level – Computer Science, Philosophy, Social Science – apply standard or newly revised/formulated ethical theories to the issues.

Page 60: Ethics, Values & Issues in Cybertechnology >>> CS222.01 Concepts, methodologies and Codes of Cyberethics

Three-step Strategy for Approaching Cyberethics Issues

Step 1. Identify a practice involving cyber-technology, or a feature in that technology, that is controversial from a moral perspective.

1a. Disclose any hidden (or opaque) features or issues that have moral implications.

1b. If the issue is descriptive, assess the sociological implications for relevant social institutions and socio-demographic groups and populations.

1c. If there are no ethical/normative issues, then stop.

1d. If the ethical issue is professional in nature, assess it in terms of existing codes of conduct/ethics for relevant professional associations (see Chapter 4).

1e. If one or more ethical issues remain, then go to Step 2.

Step 2. Analyze the ethical issue by clarifying concepts and situating it in a context.

2a. If a policy vacuum exists, go to Step 2b; otherwise go to Step 3.

2b. Clear up any conceptual muddles involving the policy vacuum and go to Step 3.

Step 3. Deliberate on the ethical issue. The deliberation process requires two stages:

3a. Apply one or more ethical theories (see Chapter 2) to the analysis of the moral issue, and then go to step 3b.

3b. Justify the position you reached by evaluating it against the rules for logic/critical thinking (see Chapter 3).