
IT Ethics


Presentation Transcript


  1. IT Ethics Lecture 1 (A Short) Introduction to Ethics Kai Kimppa, IT Department, Information Systems

  2. Why ethics? • Technology advances faster than ethical values, morals and laws • Rules of taking pictures in public places • No time needed to develop pictures; they can be sent to anyone with an MMS-capable mobile within seconds, etc. • The laws are lagging behind • Discussion between the relevant parties is needed: ethicists, professionals, ’intelligentsia’, organisation representatives, politicians, media, ’normal’ people, etc. • Law and morals do not always meet

  3. Motivation • Vacuum of rules • The rules of the field are derived from old rules, there are no rules at all, or existing rules are not followed • Conceptual muddles • Is a program a service, a means of production, an idea or a presentation of an idea? • Social use environment • ICT artefacts are seldom private affairs anymore

  4. New questions? • New area, old questions or new area with new questions? • Does the medium bring new ethical questions to bear? • Is there something fundamentally different about IT compared to other things? • E.g. privacy; we have the same information (and possibly more) but the problems come from easier handling of that information. • A quantitative difference becomes a qualitative “leap”

  5. Why ethicists and professionals? • Need to understand both to be able to discuss rationally (and emotionally, at that) • Ethics • Technology • To avoid conceptual muddles • Professionals can provide the necessary facts of the field – descriptive claims • Ethicists can clarify the questions so that discussion is possible – normative claims

  6. Ethics, Applied Ethics and Morals • Ethics is the study of morals • Morals are the (right or good) habits which people have in a society (Lat. mores) • Applied ethics tries to clarify the questions of ethics/morals so that they can be discussed • Ethics has been, and still is (albeit to a lesser degree), used to formulate policies in societies

  7. The aim(s) of Ethics • The good of the people • To understand what it would be – meta-ethics • To build a system (or systems) to work out how to get there • To apply the system(s) to actual questions coherently and consistently • To aid us in our moral problems

  8. Pietarinen and Poutanen, Etiikan Teorioita (1997) Ethical relativism(s) • Cultural relativism • The descriptive fact that morals differ between cultures • This, however, does not mean that we ought to agree that what is right is relative • Ethical relativism • A moral claim is right when compared to standards which cannot be questioned (it is not necessary, or not possible, to question them)

  9. Ethical relativism • Biased and non-biased relativism • “Electronic surveillance of personnel in toilets is wrong” can mean: • It is wrong in my culture • It is wrong in all cultures even though that is just our opinion • Extreme forms are • ethical subjectivism • ethical skepticism • nihilism

  10. Relative problems • The aim of ethics is to help clear up conceptual muddles; ethical relativism avoids this by claiming it is the wrong question • Intersubjective normative claims – do they exist? • What is right here and now is not right tomorrow or elsewhere • What is the group that holds the ethical beliefs?

  11. John Stuart Mill, Utilitarianism (1861) Jeremy Bentham, An Introduction to the Principles of Morals and Legislation (1789) Utilitarianism • The greatest amount of good for the greatest number of people • Act-utilitarianism: do what maximizes good in a given situation, and good will follow • Rule-utilitarianism: follow rules A1–An in all and any situations, and good will follow • Good does not necessarily equal happiness, whatever Johnson claims • Hospitals and health care are a typical area of application for utilitarian thinking

  12. Utility – or problem? • Utility is hard to measure • Utility is hard to define • If 10,000 people gain “utility” (whatever it might then be) from killing one person, is it right? • If 10,000 people gain “utility” from ignoring one person’s IPRs in their digital media, is it right?

  13. Consequentialism • Since utility is hard to measure, not all theories in this camp try to measure it • Consequences are still what matters, but they are not directly measurable as in utilitarianism and not necessarily always comparable • Avoids certain pitfalls of utilitarianism but is open to ones utilitarianism is not

  14. Consequently, problems • How do we measure whether the consequences were good or not? • If “A” aims to shoot “B” but accidentally shoots “C”, who was trying to stab “B”, did “A” act rightly?

  15. Immanuel Kant, Grundlegung (1785) (Kantian) Deontology • Duty or intent is important • One ought to intend good for another for an action to be good • One has duties towards other rational beings • Categorical Imperatives: • CI1: Never treat a person merely as a means to an end, but always as an end in themselves • CI2: If a certain action would be good as a universal law, it is a good action • CI3: An act can be good only if it is a voluntary act

  16. Unintentional problems • How do we know what someone's intentions were? Ask them? Would they tell the truth if their intentions were not good? • Extreme forms – like Kant’s – prohibit certain things in all situations, such as lying even when the consequences are dire (since consequences don’t matter, intentions do)

  17. John Locke, Two Treatises of Government (1689) Rights • Negative rights (liberalism and libertarianism): the right to be left alone • USA, UK (to a lesser degree) • Positive rights (communism, communitarianism, socialism, social democracy): the right to be helped • Finland, Continental Europe, Canada

  18. Right, problems • Negative rights say it is right not to help a car crash victim (since it is my negative right not to meddle in another person’s troubles) – is it right? • Positive rights say that I ought to care about everyone – how far does this reach? Should I send money to the poor in Africa instead of buying a DVD player?

  19. John Rawls, A Theory of Justice (1971) Rawls • A just society from behind a veil of ignorance • Those deciding on the rules of a society do not know who they will end up being in that society • They are rational • They are self-interested • Theoretical • There is no general agreement on the conclusions Rawls himself draws from his theory

  20. Ignorant of problems? • How could we be human if we didn’t know what we are? • If equal intelligence isn’t assumed, how could participants in the negotiation negotiate on even ground, and wouldn’t we know too much about the others and ourselves? • “A fair shot at a decent life”? • “Liberty and opportunity”? • For those less fortunate that would mean...?

  21. Alasdair MacIntyre, After Virtue (1981) Virtue (arete, virtus) ethics • Aristoteles, MacIntyre • Character building: the aim of humans in society, telos • What kinds of qualities the character of a person ought to have so that the person can act morally • The golden mean: no excesses in anything (Aristoteles) • 1) How we are, 2) how we should/could be, 3) how we get from 1) to 2) (ethics)

  22. http://www.iep.utm.edu/v/virtue.htm Characteristic problems • What is this telos, the aim of the human, then? • How do we know it? • How do we find the criteria for it? • If we cannot find a universal truth about it, how do we get there? • There is no individual telos (according to Aristoteles and MacIntyre) – is this necessarily so?

  23. Legalism, just problems? • “It is right because it is legal” • “It is wrong because it is illegal” • Commonly seen in populist phrases • Problematic, because laws ought to reflect what is right, not the other way around • Laws can, however, tell us something about the morals of a society

  24. Other moral (?) theories • The following theories are more or less moral theories in the traditional sense • Many of them have been linked with Nietzschean ethics, especially moral egoism and social Darwinism • Nietzschean ethics is really about overcoming oneself, but the methods for this – as presented by Nietzsche – aren’t necessarily agreed upon, nor is the meaning of the statement itself

  25. Moral egoism • If all people do what they consider good for themselves, good for all will follow • Laissez-faire capitalism? Would good for all somehow, magically, follow from pursuing our most self-indulgent needs? • A moral theory is something that can be taught to others – were I to teach my doctrine of moral egoism, I would do a disservice to my own possibility of following it. Rather, it would be better for me not to teach it and to hope that others follow more altruistic moral teachings.

  26. Social Darwinism • Some individuals or ’races’ are, based on some of their features, ’better’ than others and thus ought to be favoured above others • How to measure these traits? • What to do with those less ’good’? • Can lead to nasty situations; the ’Nazi card’

  27. Evolutionism • From the fact that we are all human, certain intersubjective norms follow • All humans subscribe to certain normative facts, like ’protect your young’, ’try to maximise your own survival’, etc. • From these follows the need to act socially by the mores of the society • Can be seen as a basis for any and all moral theories, so it doesn’t really answer which one(s) we ought to follow.

  28. Some areas IT ethics handles • Access to Information • AI, Cyborgs, Robotics • Bio-Informatics and -Technology • (IT-)Business Ethics (if any…) • Codes of Conduct/Ethics/Practice • Children and Youth • Civil Disobedience • Data Mining • Decision-Making Algorithms • Design • Digital Divide • Digital Signatures • Disabled Persons

  29. …a few more areas… • Distancing • eCommerce • eEducation • eGovernment, eDemocracy, e? • Freedom of Speech (on the Internet), Freedom of Information • Games • Gender Issues • Globalisation and Localisation • Information Ethics • Intellectual Property Rights, Piracy, P2P and Digital Rights Management • IT Use in Warfare • Location-Based Services, Radio Frequency Identification • Misinformation

  30. …and a few more areas • Nanotechnology • Illegal Copying of Software (Piracy) • Privacy and Data Protection • Power • Professional Ethics, Work Ethics • Responsibility and Accountability • Security • Sex, Pornography and Child Abuse • Social Issues • Spam • (Digital) Surveys • Teaching (IT-)Ethics • Telemedicine, Health and Medical Informatics, eHealth

  31. Assignment • Write a summary of approximately 8 pages on selected parts of Feldman • Chapters 1-5, 7-9 and 11 are expected to be handled; more can be done voluntarily (this might raise the grade if it is not otherwise a straight 3... it will not lower the grade in any case – extra pages might be necessary) • For virtue ethics use http://www.iep.utm.edu/v/virtue.htm • Return the summary by or during the next lecture in .rtf format by e-mail to: kai.kimppa@it.utu.fi

  32. References • Feldman, Fred. (1978) Introductory Ethics, Prentice-Hall, Englewood Cliffs, N.J. • Johnson, Deborah. (2001) Computer Ethics (3rd ed.), Prentice-Hall, Upper Saddle River, N.J. • Pietarinen, Juhani & Seppo Poutanen. (1997) Etiikan teorioita, Turun yliopiston offsetpaino, Turku. • Spinello, Richard. (1995) Ethical Aspects of Information Technology, Prentice-Hall.

  33. References • Weckert, John & Douglas Adeney. (1997) Computers and Information Ethics, Greenwood Press. • Nietzsche, Friedrich. (1969) Moraalin alkuperästä, Kustannusosakeyhtiö Otavan laakapaino, Helsinki. German original Zur Genealogie der Moral, 1887. • Other references are listed alongside the theories they concern.
