Ethics and Technology
Controversies, Questions, and Strategies for Ethical Computing
Paperback, English, 2023
1 359 kr
Special-order item. Ships within 5-8 business days.
Free shipping for members on orders of at least 249 kr.

Ethics and Technology, 5th Edition, by Herman Tavani introduces students to the issues and controversies that comprise the relatively new field of cyberethics. The text examines a wide range of cyberethics issues, from specific questions of moral responsibility that directly affect computer and information technology (IT) professionals to broader social and ethical concerns that affect each of us in our day-to-day lives. The 5th edition shows how modern-day controversies created by emerging technologies can be analyzed from the perspective of standard ethical concepts and theories.
Product information
- Publication date: 2023-02-03
- Dimensions: 203 x 252 x 23 mm
- Weight: 771 g
- Format: Paperback
- Language: English
- Number of pages: 400
- Edition: 5
- Publisher: John Wiley & Sons Inc
- ISBN: 9781119239758
Table of Contents
PREFACE
New to the Fifth Edition
Audience and Scope
Organization and Structure of the Book
The Web Site for Ethics and Technology
A Note to Students
Note to Instructors: A Roadmap for Using This Book
A Note to Computer Science Instructors
Acknowledgments

FOREWORD

CHAPTER 1 Introduction to Cyberethics: Concepts, Perspectives, and Methodological Frameworks
Scenario 1–1: Hacking into the Mobile Phones of Celebrities
1.1 Defining Key Terms: Cyberethics and Cybertechnology
1.1.1 What Is Cybertechnology?
1.1.2 Why the Term Cyberethics?
1.2 The Cyberethics Evolution: Four Developmental Phases in Cybertechnology
1.3 Are Cyberethics Issues Unique Ethical Issues?
Scenario 1–2: Developing the Code for a Computerized Weapon System
Scenario 1–3: Digital Piracy
1.3.1 Distinguishing between Unique Technological Features and Unique Ethical Issues
1.3.2 An Alternative Strategy for Analyzing the Debate about the Uniqueness of Cyberethics Issues
1.3.3 A Policy Vacuum in Duplicating Computer Software
1.4 Cyberethics as a Branch of Applied Ethics: Three Distinct Perspectives
1.4.1 Perspective #1: Cyberethics as a Field of Professional Ethics
1.4.2 Perspective #2: Cyberethics as a Field of Philosophical Ethics
1.4.3 Perspective #3: Cyberethics as a Field of Sociological/Descriptive Ethics
Scenario 1–4: The Impact of Technology X on the Pleasantville Community
1.5 A Comprehensive Cyberethics Methodology
1.5.1 A "Disclosive" Method for Cyberethics
1.5.2 An Interdisciplinary and Multilevel Method for Analyzing Cyberethics Issues
1.6 A Comprehensive Strategy for Approaching Cyberethics Issues
1.7 Chapter Summary
Review Questions, Discussion Questions, Scenarios for Analysis, Endnotes, References, Further Readings, Online Resources

CHAPTER 2 Ethical Concepts and Ethical Theories: Frameworks for Analyzing Moral Issues
Scenario 2–1: The Case of the "Runaway Trolley": A Classic Moral Dilemma
2.1 Ethics and Morality
2.1.1 What Is Morality?
2.1.2 The Study of Morality: Three Distinct Approaches for Evaluating and Justifying the Rules Comprising a Moral System
2.2 Discussion Stoppers as Roadblocks to Moral Discourse
2.2.1 Discussion Stopper #1: People Disagree on Solutions to Moral Issues
2.2.2 Discussion Stopper #2: Who Am I to Judge Others?
2.2.3 Discussion Stopper #3: Morality Is Simply a Private Matter
2.2.4 Discussion Stopper #4: Morality Is Simply a Matter for Individual Cultures to Decide
Scenario 2–2: The Price of Defending Moral Relativism
2.3 Why Do We Need Ethical Theories?
2.4 Consequence-Based Ethical Theories
2.4.1 Act Utilitarianism
Scenario 2–3: A Controversial Policy in Newmerica
2.4.2 Rule Utilitarianism
2.5 Duty-Based Ethical Theories
2.5.1 Rule Deontology
Scenario 2–4: Making an Exception for Oneself
2.5.2 Act Deontology
Scenario 2–5: A Dilemma Involving Conflicting Duties
2.6 Contract-Based Ethical Theories
2.6.1 Some Criticisms of Contract-Based Theories
2.6.2 Rights-Based Contract Theories
2.7 Character-Based Ethical Theories
2.7.1 Being a Moral Person vs. Following Moral Rules
2.7.2 Acquiring the "Correct" Habits
2.8 Integrating Aspects of Classical Ethical Theories into a Single Comprehensive Theory
2.8.1 Moor's Just-Consequentialist Theory and Its Application to Cybertechnology
2.8.2 Key Elements in Moor's Just-Consequentialist Framework
2.9 Chapter Summary
Review Questions, Discussion Questions, Scenarios for Analysis, Endnotes, References, Further Readings

CHAPTER 3 Critical Reasoning Skills for Evaluating Disputes in Cyberethics
Scenario 3–1: Reasoning About Whether to Download Software from "Sharester"
3.1 What Is Critical Reasoning?
3.1.1 Some Basic Concepts: (Logical) Arguments and Claims
3.1.2 The Role of Arguments
3.1.3 The Basic Structure of an Argument
3.2 Constructing an Argument
3.3 Valid Arguments
3.4 Sound Arguments
3.5 Invalid Arguments
3.6 Inductive Arguments
3.7 Fallacious Arguments
3.8 A Seven-Step Strategy for Evaluating Arguments
3.9 Identifying Some Common Fallacies
3.9.1 Ad Hominem Argument
3.9.2 Slippery Slope Argument
3.9.3 Fallacy of Appeal to Authority
3.9.4 False Cause Fallacy
3.9.5 Fallacy of Composition/Fallacy of Division
3.9.6 Fallacy of Ambiguity/Equivocation
3.9.7 The False Dichotomy/Either-Or Fallacy/All-or-Nothing Fallacy
3.9.8 The Virtuality Fallacy
3.10 Chapter Summary
Review Questions, Discussion Questions, Scenarios for Analysis, Endnotes, References, Further Readings

CHAPTER 4 Professional Ethics, Codes of Conduct, and Moral Responsibility
Scenario 4–1: Fatalities Involving the Oerlikon GDF-005 Robotic Cannon
4.1 What Is Professional Ethics?
4.1.1 What Is a Profession?
4.1.2 Who Is a Professional?
4.1.3 Who Is a Computer/IT Professional?
4.2 Do Computer/IT Professionals Have Any Special Moral Responsibilities?
4.3 Professional Codes of Ethics and Codes of Conduct
4.3.1 The Purpose of Professional Codes
4.3.2 Some Criticisms of Professional Codes
4.3.3 Defending Professional Codes
4.3.4 The IEEE-CS/ACM Software Engineering Code of Ethics and Professional Practice
4.4 Conflicts of Professional Responsibility: Employee Loyalty and Whistle-Blowing
4.4.1 Do Employees Have an Obligation of Loyalty to Employers?
4.4.2 Whistle-Blowing
Scenario 4–2: NSA Surveillance and the Case of Edward Snowden
4.5 Moral Responsibility, Legal Liability, and Accountability
4.5.1 Distinguishing Responsibility from Liability and Accountability
4.5.2 Accountability and the Problem of "Many Hands"
Scenario 4–3: The Case of the Therac-25 Machine
4.5.3 Legal Liability and Moral Accountability
4.6 Do Some Computer Corporations Have Special Moral Obligations?
4.7 Chapter Summary
Review Questions, Discussion Questions, Scenarios for Analysis, Endnotes, References, Further Readings

CHAPTER 5 Privacy and Cyberspace
Scenario 5–1: A New NSA Data Center
5.1 Privacy in the Digital Age: Who Is Affected and Why Should We Worry?
5.1.1 Whose Privacy Is Threatened by Cybertechnology?
5.1.2 Are Any Privacy Concerns Generated by Cybertechnology Unique or Special?
5.2 What Is Personal Privacy?
5.2.1 Accessibility Privacy: Freedom from Unwarranted Intrusion
5.2.2 Decisional Privacy: Freedom from Interference in One's Personal Affairs
5.2.3 Informational Privacy: Control over the Flow of Personal Information
5.2.4 A Comprehensive Account of Privacy
Scenario 5–2: Descriptive Privacy
Scenario 5–3: Normative Privacy
5.2.5 Privacy as "Contextual Integrity"
Scenario 5–4: Preserving Contextual Integrity in a University Seminar
5.3 Why Is Privacy Important?
5.3.1 Is Privacy an Intrinsic Value?
5.3.2 Privacy as a Social Value
5.4 Gathering Personal Data: Surveillance, Recording, and Tracking Techniques
5.4.1 "Dataveillance" Techniques
5.4.2 Internet Cookies
5.4.3 RFID Technology
5.4.4 Cybertechnology and Government Surveillance
5.5 Analyzing Personal Data: Big Data, Data Mining, and Web Mining
5.5.1 Big Data: What, Exactly, Is It, and Why Does It Threaten Privacy?
5.5.2 Data Mining and Personal Privacy
Scenario 5–5: Data Mining at the XYZ Credit Union
5.5.3 Web Mining: Analyzing Personal Data Acquired from Our Interactions Online
5.6 Protecting Personal Privacy in Public Space
5.6.1 PPI vs. NPI
Scenario 5–6: Shopping at SuperMart
Scenario 5–7: Shopping at Nile.com
5.6.2 Search Engines and the Disclosure of Personal Information
5.7 Privacy Legislation and Industry Self-Regulation
5.7.1 Industry Self-Regulation and Privacy-Enhancing Tools
5.7.2 Privacy Laws and Data Protection Principles
5.8 A Right to "Be Forgotten" (or to "Erasure") in the Digital Age
Scenario 5–8: An Arrest for an Underage Drinking Incident 20 Years Ago
5.8.1 Arguments Opposing RTBF
5.8.2 Arguments Defending RTBF
5.8.3 Establishing "Appropriate" Criteria
5.9 Chapter Summary
Review Questions, Discussion Questions, Scenarios for Analysis, Endnotes, References, Further Readings

CHAPTER 6 Security in Cyberspace
Scenario 6–1: The "Olympic Games" Operation and the Stuxnet Worm
6.1 Security in the Context of Cybertechnology
6.1.1 Cybersecurity as Related to Cybercrime
6.1.2 Security and Privacy: Some Similarities and Some Differences
6.2 Three Categories of Cybersecurity
6.2.1 Data Security: Confidentiality, Integrity, and Availability of Information
6.2.2 System Security: Viruses, Worms, and Malware
6.2.3 Network Security: Protecting Our Infrastructure
Scenario 6–2: The "GhostNet" Controversy
6.3 Cloud Computing and Security
6.3.1 Deployment and Service/Delivery Models for the Cloud
6.3.2 Securing User Data Residing in the Cloud
6.3.3 Assessing Risk in the Cloud and in the Context of Cybersecurity
6.4 Hacking and "The Hacker Ethic"
6.4.1 What Is "The Hacker Ethic"?
6.4.2 Are Computer Break-ins Ever Ethically Justifiable?
6.5 Cyberterrorism
6.5.1 Cyberterrorism vs. Hacktivism
Scenario 6–3: Anonymous and the "Operation Payback" Attack
6.5.2 Cybertechnology and Terrorist Organizations
6.6 Information Warfare (IW)
6.6.1 Information Warfare vs. Conventional Warfare
6.6.2 Potential Consequences for Nations that Engage in IW
6.7 Chapter Summary
Review Questions, Discussion Questions, Scenarios for Analysis, Endnotes, References, Further Readings

CHAPTER 7 Cybercrime and Cyber-Related Crimes
Scenario 7–1: Creating a Fake Facebook Account to Catch Criminals
7.1 Cybercrimes and Cybercriminals
7.1.1 Background Events: A Brief Sketch
7.1.2 A Typical Cybercriminal
7.2 Hacking, Cracking, and Counter Hacking
7.2.1 Hacking vs. Cracking
7.2.2 Active Defense Hacking: Can Acts of "Hacking Back" or Counter Hacking Ever Be Morally Justified?
7.3 Defining Cybercrime
7.3.1 Determining the Criteria
7.3.2 A Preliminary Definition of Cybercrime
7.3.3 Framing a Coherent and Comprehensive Definition of Cybercrime
7.4 Three Categories of Cybercrime: Piracy, Trespass, and Vandalism in Cyberspace
7.5 Cyber-Related Crimes
7.5.1 Some Examples of Cyber-Exacerbated vs. Cyber-Assisted Crimes
7.5.2 Identity Theft
7.6 Technologies and Tools for Combating Cybercrime
7.6.1 Biometric Technologies
7.6.2 Keystroke-Monitoring Software and Packet-Sniffing Programs
7.7 Programs and Techniques Designed to Combat Cybercrime in the United States
7.7.1 Entrapment and "Sting" Operations to Catch Internet Pedophiles
Scenario 7–2: Entrapment on the Internet
7.7.2 Enhanced Government Surveillance Techniques and the Patriot Act
7.8 National and International Laws to Combat Cybercrime
7.8.1 The Problem of Jurisdiction in Cyberspace
Scenario 7–3: A Virtual Casino
Scenario 7–4: Prosecuting a Computer Corporation in Multiple Countries
7.8.2 Some International Laws and Conventions Affecting Cybercrime
Scenario 7–5: The Pirate Bay Web Site
7.9 Cybercrime and the Free Press: The WikiLeaks Controversy
7.9.1 Are WikiLeaks' Practices Ethical?
7.9.2 Are WikiLeaks' Practices Criminal?
7.9.3 WikiLeaks and the Free Press
7.10 Chapter Summary
Review Questions, Discussion Questions, Scenarios for Analysis, Endnotes, References, Further Readings

CHAPTER 8 Intellectual Property Disputes in Cyberspace
Scenario 8–1: Streaming Music Online
8.1 What Is Intellectual Property?
8.1.1 Intellectual Objects
8.1.2 Why Protect Intellectual Objects?
8.1.3 Software as Intellectual Property
8.1.4 Evaluating a Popular Argument Used by the Software Industry to Show Why It Is Morally Wrong to Copy Proprietary Software
8.2 Copyright Law and Digital Media
8.2.1 The Evolution of Copyright Law in the United States
8.2.2 The Fair-Use and First-Sale Provisions of Copyright Law
8.2.3 Software Piracy as Copyright Infringement
8.2.4 Napster and the Ongoing Battles over Sharing Digital Music
8.3 Patents, Trademarks, and Trade Secrets
8.3.1 Patent Protections
8.3.2 Trademarks
8.3.3 Trade Secrets
8.4 Jurisdictional Issues Involving Intellectual Property Laws
8.5 Philosophical Foundations for Intellectual Property Rights
8.5.1 The Labor Theory of Property
Scenario 8–2: DEF Corporation vs. XYZ Inc.
8.5.2 The Utilitarian Theory of Property
Scenario 8–3: Sam's e-Book Reader Add-on Device
8.5.3 The Personality Theory of Property
Scenario 8–4: Angela's B++ Programming Tool
8.6 The "Free Software" and "Open Source" Movements
8.6.1 GNU and the Free Software Foundation
8.6.2 The "Open Source Software" Movement: OSS vs. FSF
8.7 The "Common Good" Approach: An Alternative Framework for Analyzing the Intellectual Property Debate
8.7.1 Information Wants to be Shared vs. Information Wants to be Free
8.7.2 Preserving the Information Commons
8.7.3 The Fate of the Information Commons: Could the Public Domain of Ideas Eventually Disappear?
8.7.4 The Creative Commons
8.8 PIPA, SOPA, and RWA Legislation: Current Battlegrounds in the Intellectual Property War
8.8.1 The PIPA and SOPA Battles
8.8.2 RWA and Public Access to Health-Related Information
Scenario 8–5: Elsevier Press and "The Cost of Knowledge" Boycott
8.8.3 Intellectual Property Battles in the Near Future
8.9 Chapter Summary
Review Questions, Discussion Questions, Scenarios for Analysis, Endnotes, References, Further Readings

CHAPTER 9 Regulating Commerce and Speech in Cyberspace
Scenario 9–1: Anonymous and the Ku Klux Klan
9.1 Introduction and Background Issues: Some Key Questions and Critical Distinctions Affecting Internet Regulation
9.1.1 Is Cyberspace a Medium or a Place?
9.1.2 Two Categories of Cyberspace Regulation: Regulating Content and Regulating Process
9.1.3 Four Modes of Regulation: The Lessig Model
9.2 Digital Rights Management (DRM)
9.2.1 Some Implications of DRM for Public Policy Debates Affecting Copyright Law
9.2.2 DRM and the Music Industry
Scenario 9–2: The Sony Rootkit Controversy
9.3 E-Mail Spam
9.3.1 Defining Spam
9.3.2 Why Is Spam Morally Objectionable?
9.4 Free Speech vs. Censorship and Content Control in Cyberspace
9.4.1 Protecting Free Speech
9.4.2 Defining Censorship
9.5 Pornography in Cyberspace
9.5.1 Interpreting "Community Standards" in Cyberspace
9.5.2 Internet Pornography Laws and Protecting Children Online
9.5.3 Virtual Child Pornography
9.5.4 Sexting and Its Implications for Current Child Pornography Laws
Scenario 9–3: A Sexting Incident Involving Greensburg Salem High School
9.6 Hate Speech and Speech that Can Cause Physical Harm to Others
9.6.1 Hate Speech on the Web
9.6.2 Online "Speech" that Can Cause Physical Harm to Others
9.7 "Network Neutrality" and the Future of Internet Regulation
9.7.1 Defining Network Neutrality
9.7.2 Some Arguments Advanced by Net Neutrality's Proponents and Opponents
9.7.3 Future Implications for the Net Neutrality Debate
9.8 Chapter Summary
Review Questions, Discussion Questions, Scenarios for Analysis, Endnotes, References, Further Readings

CHAPTER 10 The Digital Divide, Democracy, and Work
Scenario 10–1: Digital Devices, Social Media, Democracy, and the "Arab Spring"
10.1 The Digital Divide
10.1.1 The Global Digital Divide
10.1.2 The Digital Divide within Nations
Scenario 10–2: Providing In-Home Internet Service for Public School Students
10.1.3 Is the Digital Divide an Ethical Issue?
10.2 Cybertechnology and the Disabled
10.3 Cybertechnology and Race
10.3.1 Internet Usage Patterns
10.3.2 Racism and the Internet
10.4 Cybertechnology and Gender
10.4.1 Access to High-Technology Jobs
10.4.2 Gender Bias in Software Design and Video Games
10.5 Cybertechnology, Democracy, and Democratic Ideals
10.5.1 Has Cybertechnology Enhanced or Threatened Democracy?
10.5.2 How Has Cybertechnology Affected Political Elections in Democratic Nations?
10.6 The Transformation and the Quality of Work
10.6.1 Job Displacement and the Transformed Workplace
10.6.2 The Quality of Work Life in the Digital Era
Scenario 10–3: Employee Monitoring and the Case of Ontario vs. Quon
10.7 Chapter Summary
Review Questions, Discussion Questions, Scenarios for Analysis, Endnotes, References, Further Readings

CHAPTER 11 Online Communities, Virtual Reality, and Artificial Intelligence
Scenario 11–1: Ralph's Online Friends and Artificial Companions
11.1 Online Communities and Social Networking Services
11.1.1 Online Communities vs. Traditional Communities
11.1.2 Blogs and Some Controversial Aspects of the Blogosphere
Scenario 11–2: "The Washingtonienne" Blogger
11.1.3 Some Pros and Cons of SNSs (and Other Online Communities)
Scenario 11–3: A Suicide Resulting from Deception on MySpace
11.2 Virtual Environments and Virtual Reality
11.2.1 What Is Virtual Reality (VR)?
11.2.2 Ethical Aspects of VR Applications
11.3 Artificial Intelligence (AI)
11.3.1 What Is AI? A Brief Overview
11.3.2 The Turing Test and John Searle's "Chinese Room" Argument
11.3.3 Cyborgs and Human-Machine Relationships
11.4 Extending Moral Consideration to AI Entities
Scenario 11–4: Artificial Children
11.4.1 Determining Which Kinds of Beings/Entities Deserve Moral Consideration
11.4.2 Moral Patients vs. Moral Agents
11.5 Chapter Summary
Review Questions, Discussion Questions, Scenarios for Analysis, Endnotes, References, Further Readings

CHAPTER 12 Ethical Aspects of Emerging and Converging Technologies
Scenario 12–1: When "Things" Communicate with One Another
12.1 Converging Technologies and Technological Convergence
12.2 Ambient Intelligence (AmI) and Ubiquitous Computing
12.2.1 Pervasive Computing, Ubiquitous Communication, and Intelligent User Interfaces
12.2.2 Ethical and Social Aspects of AmI
Scenario 12–2: E. M. Forster's "(Pre)Cautionary Tale"
Scenario 12–3: Jeremy Bentham's "Panopticon/Inspection House" (Thought Experiment)
12.3 Nanotechnology and Nanocomputing
12.3.1 Nanotechnology: A Brief Overview
12.3.2 Ethical Issues in Nanotechnology and Nanocomputing
12.4 Autonomous Machines
12.4.1 What Is an AM?
12.4.2 Some Ethical and Philosophical Questions Pertaining to AMs
12.5 Machine Ethics and Moral Machines
12.5.1 What Is Machine Ethics?
12.5.2 Designing Moral Machines
12.6 A "Dynamic" Ethical Framework for Guiding Research in New and Emerging Technologies
12.6.1 Is an ELSI-Like Model Adequate for New/Emerging Technologies?
12.6.2 A "Dynamic Ethics" Model
12.7 Chapter Summary
Review Questions, Discussion Questions, Scenarios for Analysis, Endnotes, References, Further Readings

GLOSSARY
INDEX