
Ethics in a Computing Culture

Chapter 9: Autonomous and Pervasive Technology


Objectives

• What makes a technology pervasive? What does it mean to be autonomous?

• How are pervasiveness and autonomy related? How do they contribute to change in societies?

• How can the existence of such phenomena cause problems for members in society?


Autonomous and Pervasive Technology

• Pervasive: technology that has spread widely throughout society

• Autonomy: the freedom to make decisions without outside constraints or interference


Case: Pervasiveness of Camera Phones

• What are some of the other effects of the pervasiveness of cameras, and especially camera phones?

• List some of the ways people use cameras differently now than they did 20 years ago.

• Are these changes good, bad, or neutral?


Case: Pervasiveness of Camera Phones (continued)

• Lifelogging: a lifelogger wears a computer that records every moment of his or her life, usually in both audio and video. The camera and audio recording devices are never turned off, no matter what, but the recorded data is not necessarily publicly posted.

– What are the potential dangers, and benefits, of lifelogging?

– Is lifelogging morally permissible?


Case: Injured By GPS

• Consider the three main parties in the case:

1) Ms. Rosenberg

2) Google

3) the driver of the car

– What percentage of the blame does each deserve for what happened?


Case: Injured By GPS (continued)


Case: Injured By GPS (continued)

• Jacobsson’s article about the lawsuit only briefly mentions the driver of the car.

– Why do you think that the article focuses on the Rosenberg vs. Google part of the case, instead of Rosenberg vs. the driver?


Case: More GPS, More Risks

• Overall, do you think that emergency locator beacon technology is a good thing, or a bad thing?

• How should authorities handle false alarms? Should people who send false alarms be fined, charged for the cost of their rescue, or otherwise penalized?

• Is it morally permissible to charge people to be rescued when they really are in danger?


More on the Definition of “Autonomous”


More on the Definition of “Autonomous” (continued)

• Autonomous, intelligent, or robotic?

– Google’s Web search

– Security cameras that can “recognize” wanted criminals and alert authorities if a match is seen

– The control mechanism of a traffic light, which causes it to cycle through its signals appropriately
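The traffic light example can be made concrete with a short sketch (my own illustration, not from the text). The controller simply cycles through a fixed schedule and makes no decisions at all, which is one reason to call it "automatic" rather than "autonomous":

```python
# A minimal sketch of a fixed-cycle traffic light controller.
# Phase names and durations are invented for illustration.
from itertools import cycle

# Each phase is (signal, duration in seconds); the order never varies.
PHASES = [("green", 30), ("yellow", 5), ("red", 35)]

def signal_sequence(n):
    """Return the first n signals the light shows, in fixed rotation."""
    lights = cycle(phase[0] for phase in PHASES)
    return [next(lights) for _ in range(n)]

print(signal_sequence(5))  # ['green', 'yellow', 'red', 'green', 'yellow']
```

Nothing in the loop inspects the world or chooses among alternatives, which sharpens the slide's question: is cycling through signals "appropriately" autonomy, or mere clockwork?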


Automated Security Cameras

• One way to minimize the possibility for abuse of surveillance cameras is to remove the robotic tilt and pan, so that the cameras cannot be redirected to spy on people.

– What are the possible negative consequences of this policy?

– Should the policy be adopted?


Case: Grading Essays by Computer

• Could a computer do as good a job of grading essays as an average teacher?

– Could it do a better job than the worst teacher you have had?

– Than the best teacher you had?

• Should students have the right to challenge their grades and demand that a human grade their papers if they disagree with a computer-generated grade?

• Should students have the right to challenge their grades and demand that a computer grade their papers if they disagree with a human-generated grade?


Case: Grading Essays by Computer (continued)

• Is it morally permissible for a college instructor to use e-rater and assign grades to student essays based on its output without actually looking at the essays?

• Assuming that college instructors continued to grade papers in the way that they always have, would it be beneficial for them also to use e-rater, to get a second opinion?


Case: Grading Essays by Computer (continued)

• Imagine a situation in which a student applies to an MBA program, but is rejected due to a low GMAT score. Suppose also that this low GMAT score was an error: The student’s essays were actually quite good, but e-rater scored them incorrectly because the student had an unusual writing style.

– Who, if anyone, is morally responsible for the student’s unfortunate situation?
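To make the scenario concrete, here is a deliberately naive, invented scorer (not how ETS's actual e-rater works) that grades only on surface features. It shows how an essay with an unconventional style can be penalized regardless of its content:

```python
# A toy essay scorer based purely on surface features: overall length
# and average sentence length. The 300-word target and the 0-6 scale
# are assumptions for illustration, not e-rater's real parameters.

def surface_score(essay, ideal_sentence_len=20):
    """Score an essay 0-6 from surface features only."""
    sentences = [s for s in essay.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]
    words = essay.split()
    length_score = min(len(words) / 300, 1.0)        # rewards longer essays
    avg_len = len(words) / max(len(sentences), 1)
    # rewards sentences close to the "ideal" length the model was tuned on
    style_score = max(0.0, 1 - abs(avg_len - ideal_sentence_len) / ideal_sentence_len)
    return round(3 * length_score + 3 * style_score)

# Two synthetic "essays" of identical length but different sentence rhythm.
conventional = ". ".join(["word " * 19 + "word"] * 15) + "."   # 20-word sentences
terse        = ". ".join(["word word word"] * 100) + "."       # 3-word sentences

# The terse essay scores lower despite having the same word count.
print(surface_score(conventional), surface_score(terse))
```

A student who writes in short, aphoristic sentences is marked down by the style feature alone, mirroring the slide's "unusual writing style" error.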


Case: Remote Parking

• A driver pulls up next to a parking space, checks to make sure the space is clear, presses the button to start the automatic parking, and then walks away. After the driver’s back is turned, a small child runs into the space and is seriously injured.

• Who is primarily morally responsible for the child’s injury?

1. the driver

2. the car company

3. the child

4. the adult in charge of the child

5. no one


Case: Remote Parking (continued)

• Imagine that, instead of using a computerized parking assistant, the driver had used valet parking (that is, a human parking assistant), and a child was injured.

– What if the valet was noticeably dizzy and smelled strongly of alcohol, and the driver still chose to give his keys to the valet?


Software with Emergent Behaviors

• Machine learning: type of artificial intelligence; algorithms that allow computers to take in data and automatically learn to recognize patterns and make predictions

• Unmanned Aerial Vehicles (UAVs or “drones”)

– What are some possible negative effects of an over-sensitive UAV that mistakes something harmless for a threat?

– Consider this statement: “An erroneous decision made by the UAV is a fault in the UAV, and not directly attributable to any person or group of persons.”
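The machine-learning definition on this slide can be illustrated with a minimal sketch (my own example, not from the text): the program is given labeled data and predicts labels for new inputs, rather than following rules a programmer wrote for each case. The features and labels here are invented:

```python
# A tiny nearest-neighbor classifier: "learning" is just storing labeled
# examples; "predicting" is copying the label of the closest one.

def nearest_neighbor(train, point):
    """Predict a label for point from (features, label) training pairs."""
    closest = min(train,
                  key=lambda ex: sum((a - b) ** 2 for a, b in zip(ex[0], point)))
    return closest[1]

# Hypothetical labeled data: features are (size, speed).
examples = [((1.0, 9.0), "bird"), ((9.0, 1.0), "turtle"), ((1.5, 8.0), "bird")]

print(nearest_neighbor(examples, (2.0, 7.5)))  # bird
print(nearest_neighbor(examples, (8.0, 2.0)))  # turtle
```

Even in this toy, no person wrote a rule saying what a "bird" is; the pattern is implicit in the data, which is exactly what makes responsibility for an erroneous decision (as in the UAV statement above) hard to pin on any one person.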


I, Roommate

• Companion robots: not capable of “loving you back”

– How is this problematic?

– Is it ethical to encourage senior citizens to form emotional bonds with robots, simply because it makes them feel happier?


I, Roommate (continued)

• In I, Robot, Isaac Asimov proposed three rules that would provide the basic moral guidelines for a robot:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
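The "except where such orders would conflict" clauses make the Three Laws a strict priority ordering. A short sketch (my own encoding, not Asimov's or the text's; the outcome flags are invented) shows how that ordering resolves a dilemma:

```python
# The Three Laws as a priority check over a predicted outcome.
# An outcome is a dict of hypothetical flags about what would happen.

def first_violated_law(outcome):
    """Return the highest-priority law the outcome violates, else None."""
    if outcome["human_harmed"]:
        return 1          # First Law outranks everything else
    if outcome["order_disobeyed"]:
        return 2          # checked only once no human comes to harm
    if outcome["robot_destroyed"]:
        return 3          # self-preservation yields to Laws 1 and 2
    return None

def choose(a, b):
    """Prefer the outcome whose worst violation is the lower-priority law."""
    rank = lambda o: first_violated_law(o) or 4   # 4 = violates nothing
    return a if rank(a) >= rank(b) else b

# Hypothetical dilemma: obeying an order destroys the robot;
# refusing the order lets a human come to harm.
obey   = {"human_harmed": False, "order_disobeyed": False, "robot_destroyed": True}
refuse = {"human_harmed": True,  "order_disobeyed": True,  "robot_destroyed": False}

print(first_violated_law(choose(obey, refuse)))  # 3: the robot sacrifices itself
```

Much of Asimov's fiction turns on cases where these flags are ambiguous or unknowable in advance, which is the ethical point the slide invites discussion of.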


Touch Screens and Visual Disabilities


Touch Screens and Visual Disabilities (continued)

• Recall the ACM/IEEE Software Engineering Code of Ethics and Professional Practice from Chapter 2, which describes how software engineers ought to behave.

– Should software and hardware developers be required, by the Code of Professional Ethics, to take vision impairments into account when designing new technologies?

– Was it morally praiseworthy for Google to hire T.V. Raman to work on accessibility features for their products?


Case: The Flash Crash

• Most experts believe that high-frequency trading has significantly increased the efficiency of the stock market. This results in everyone, not just the high-frequency traders, being better off.

– Is the risk of periodic flash crashes worth the benefits of high-frequency trading?

– What additional information would you need to answer this question more fully?


Augmented Reality Marketing

• Augmented reality: a computer graphics technology that overlays virtual objects on the user’s view of the real world

• If augmented reality is widely adopted, the government could make regulations about what types of ads can be associated with public billboards (for example, banning ads for cigarettes or ads with obscene sexual content).

– Should the government regulate these ads? If so, how?
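Overlaying a virtual object on the real world comes down to projecting its 3-D position into the camera's 2-D image. A minimal sketch (my own illustration; the focal length and image size are assumed values, not from the text):

```python
# Pinhole-camera projection: where on screen does a virtual ad appear?
# Assumes a 1280x720 image and a focal length of 800 pixels.

def project(point, focal_length=800, cx=640, cy=360):
    """Project a 3-D point (camera coordinates, z > 0) to pixel coordinates."""
    x, y, z = point
    return (cx + focal_length * x / z, cy - focal_length * y / z)

# A virtual ad anchored 2 m right of, 1 m above, and 4 m in front of the camera
print(project((2.0, 1.0, 4.0)))  # (1040.0, 160.0)
```

Because the ad is just pixels computed per viewer, an AR system (or a regulator) could swap or suppress it per user, which is what makes the regulation questions on this slide technically feasible.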


Augmented Reality Marketing (continued)

• Should the public posting of QR codes that lead the viewer to racy ads be banned?

• Shock images: incredibly graphic images posted on the Internet; designed to shock or sicken ordinary people

– Examples include pictures of real deaths and violence, and obscene or illegal pornography

– Should the public posting of QR codes leading to shock images be banned?
