This course explores the ethical implications of surveillance practices on the environment, privacy, and human rights. Topics covered include mass surveillance, social sorting, and the rise of surveillance culture.
COMS3403A • Winter 2017 • Communication Technology and Culture
Week 11 (Mar. 28) – Ethics and the Environment
Dr. Tracey P. Lauriault, Communication Studies, School of Journalism and Communication, Tracey.Lauriault@carleton.ca
Class Schedule: Tuesdays, 8:30–11:30
Location: Southam Hall 516
Instructor: Dr. Tracey P. Lauriault
E-mail: Tracey.Lauriault@Carleton.ca
Office: 4110b Richcraft Hall
Office Hours: Tuesdays & Wednesdays 13:00–16:00
Term Overview
• Week 1 (Jan. 10) – What is Technology?
• Week 2* (Jan. 17) – Technology, Society & Culture – #1 Due
• Week 3 (Jan. 24) – Philosophy of Technology
• Week 4* (Jan. 31) – Communication Infrastructure – #2 In-class
• Week 5* (Feb. 7) – Digital Labour & the Digital Divide – Essay Proposal
• Week 6* (Feb. 14) – Love, Relationships and Porn – #3 Due
• Study Break
• Week 7 (Feb. 28) – Code, Software and Platforms
• Week 8 (Mar. 7) – The Sharing Economy
• Week 9* (Mar. 14) – Resistance, Hacking, Technological Citizenship – #4 Due
• Week 10 (Mar. 21) – Surveillance and Policing
• Week 11* (Mar. 28) – Ethics and the Environment – Essay Due
• Week 12 (Apr. 4) – Review
Announcements & Agenda • Office Hours Tues 1-4PM • Essay due Week 11, March 28 • Data Day 4.0 • Exam: Wednesday April 19th, 9AM, AT302 • Exam Review Week 12, April 4th • Bitcoin • Lyons Paper on Surveillance • Lesson: Ethics and the Environment
Week 10 Readings
• Chapter 3, Predictive Policing: From Neighborhoods to Individuals
• Chapter 11, Surveillance
• Brevini & Murdock, Following the Money: WikiLeaks and the Political Economy of Disclosure
3 Challenges (David Lyon, 2015)
Surveillance studies:
• Research Disregard
• Research Deficit
• Research Direction
High stakes: privacy, human rights, civil liberties, freedom, justice
1. Research Disregard (David Lyon, 2015)
3 surveillance practices:
• Mass surveillance of their own citizens by governments
• Corporations sharing their data supplies with government
• Ordinary citizens participating via their online social media & cell phone behavior
All feeding the NSA and its cognate agencies, under the radar of most citizens
Individual privacy? (David Lyon, 2015)
• Surveillance: social sorting by targeting particular populations before individuals
• Privacy is also about human rights and social justice
• Trust
• Issues: false positives
• Surveillance and privacy: a spectrum from the monad to the multitude
Genealogy of surveillance (David Lyon, 2015)
1980s:
• Concerned with state surveillance & workplace surveillance
• Surveillance in the service of social control: policing and the management of offenders
• Early notions of national security
1990s:
• Consumer surveillance, enabled by computerization
• The term "surveillance society": systemic surveillance of many kinds could be expected simply as a result of conducting daily affairs – credit cards, loyalty cards, online interactions, CCTV
• Commercialization of the internet
2000s:
• 9/11, the 7/7 London bombings, and the Madrid train attack boosted security surveillance in the global north
• Customer relationship management systems; Total Information Awareness (TIA)
• Facebook (2004); rise in consumer surveillance; rise in social surveillance
• Social Networking Monitoring Centre
• Data analytics, cookies, medical files
• Snowden was a wake-up call
The everyday (David Lyon, 2015)
• Question the assumptions of surveillance studies: surveillance is more than forms of power, policies, and democratic institutions and processes
• Surveillance is not part of everyday life merely as bureaucratic power imposed upon hapless citizens
• Surveillance culture: one in which "an increasing proportion of the world's population live and to which, for a number of reasons, many have become enured" (p. 143)
Security (David Lyon, 2015)
• Security is becoming the driver: nationally, in policing, urban security, workplaces, transit systems, schools
• Justified by evidence-producing technologies, efficiency, cost-cutting technological means, and convenience
• This form of security, however, has little to do with security from things like famine and fear, or the protection of freedom
• Some surveillance procurement seems at odds even with this understanding of security, cutting against civil liberties and human rights
Change (David Lyon, 2015)
• Trumping democracy; trumping politics
• Integration of the private and public sectors
• Other trends:
• LBS (location-based services), mobile devices, time & space coordinates
• Smart buildings, homes, cities, metering
• The human body: biometrics, DNA, fingerprinting, facial recognition
2. Research Deficit (David Lyon, 2015)
• Research needs to catch up with surveillance developments
• Digital infrastructure
• Professional networks
• Social media practices
• Analytic fog
Catching up (David Lyon, 2015)
• Boundaries between law enforcement and intelligence
• Affiliation of global professionals and organizations
• Development of protocols, rationales & practices
• Targets of surveillance: networks + practices + targets; social media; cell phones
• Research: Oxford Internet Institute; Pew Internet and American Life program
• Surveillance regimes and a surveillance culture
• Growth of NGOs and exposure
• Cloud computing: changes in storage & electronic transfer; the assumption of the immateriality of data
• NSA interception programs: Upstream, QUANTUMINSERT, Tempora, XKeyscore (caches), PRISM, MUSCULAR
• Who is conducting surveillance: the private sector, security professionals, security consultancies & subcontractors, advisors, police
3. Research Direction (David Lyon, 2015)
• Must be informed by changes in surveillance as a result of the digital & big data
• Increased arena of political struggle based on surveillance
• What of the utopian dreams of internet freedom? Its democratic and emancipatory possibilities?
• Political economy of the internet: knowledge is not independent of technology
• PRISM: Verizon, Yahoo, Microsoft, Google, Facebook – beyond FISA-authorized access
• Five Eyes partners: Australia, Canada, New Zealand, the United Kingdom, and the United States, bound by the UKUSA Agreement, a treaty for joint cooperation in signals intelligence
• Transparency of the private sector vs public sector obfuscation
• Government + private sector + government contracts
• Communications scholarship: Turow, Andrejevic, Marwick
• Need for public policy analysts to get up to speed
Directions (David Lyon, 2015)
• Information is not a fictitious commodity
• Does big data lose the person?
• Commercial exchange vs security goals
• Politics of algorithms; politics of the Internet
• State power and the private sector: public distance but real proximity
• Privacy advocates (Bennett 2008): EPIC, EFF, ACLU, Open Media
• Black Code (Deibert 2013)
• Zittrain (2009)
Conclusion (David Lyon, 2015)
• Why were the Snowden files considered to be such a surprise?
• Is the person or the profile being surveilled? Anticipatory profiling?
• Surveillance is information intensive and national security oriented
• Security needs a new definition
• A more nuanced understanding is required to ensure the well-being of citizens
• A multi-disciplinary enterprise
Week 11 Readings Chapter 12, Ethical Dimensions of Technology Chapter 10, Digital Ethics
3 Central Themes (Quan-Haase 2016)
• The socio-technical approach: the study of technology needs to be approached from a socio-technical viewpoint
• Technological inequality: technology and innovation are closely interwoven with economics and hence have consequences for our understanding of inequality
• Social change: change results inevitably from technological developments
1. The Socio-technical Approach (Quan-Haase 2016) • Early conceptualizations tended to focus on technology as material substance, disregarding the social nature of technological invention, implementation, and use. • The socio-technical approach argues that the social and technological are closely interwoven. • Criticisms of this perspective: • It does not explicitly delineate a select set of variables that need to be examined. • The approach does not state what mechanisms underlie the relationship between technology and society. • Little detail is provided as to how this approach should uncover mechanisms underlying the mutual shaping process.
2. Technological Inequality (Quan-Haase 2016)
• Technological inequality generally occurs on three levels (Schumpeter):
• The gap between those involved in innovation versus those in the workforce continues to grow.
• The difference in society between the haves and have-nots often plays out in terms of technological savviness.
• A global digital divide exists between those nations that invent, produce, and distribute new technology and those that continue to fall behind.
3. Social Change (Quan-Haase 2016)
• Social change is described as a significant change of structured social action or social structure, taking place in a society, community, or social group.
• In the context of technologically induced social change, analysts often presume that these changes are negative, but they can also be positive.
• In some instances, technology is used as an additional resource that facilitates the economic, cultural, and social development of social groups.
• Social change does not, however, always occur in predictable ways, and it does not have the same implications for all members of society.
Ethical and Moral Dimensions of Our Technological Society (Quan-Haase 2016) • Neutrality of technology • Technology as human destiny • Technology as progress
1. Neutrality of Technology (Quan-Haase 2016)
• Of particular importance in the debate about how technology intersects with society is the neutrality of technology argument.
• Supporters of this perspective argue that technology is impartial because, unlike humans, it lacks a set of moral values and direction.
• Swedish philosopher Sundström (1998) described 3 instances in which technology could be deemed value-neutral:
• Multiple uses of tools
• Uncontextualized tool
• Tool as science
AK-47
• Historical development of the AK-47
• Even taken out of its social and historical context, the AK-47 is a source of sorrow, death, and never-ending wars.
• The ramifications of Kalashnikov's invention can be related to the characteristics that made it so popular.
2. Technology as Human Destiny (Quan-Haase 2016)
• The metaphor of destiny in relation to technology is a powerful mode of approaching humanity's relationship to the world.
• Jonas (2003) divides technology into two distinct and separate spheres: traditional technologies and modern technologies.
• Destiny is a central part of Heidegger's inquiry:
• Human destiny is not fully determined but is closely linked to human agency and choice, and modern technology endangers this freedom by concealing the full reality of its true nature.
• Heidegger feared that humans would become an object of technology.
• His solution: not to outright reject technology, but to detach ourselves and extensively question technology's purpose and role in society.
3. Technology as Progress (Quan-Haase 2016)
Technology = progress
• This notion is still deeply rooted in Western culture and continues to have a profound impact on how we perceive, use, and evaluate technology.
• Advances in the natural sciences allowed new technologies to be developed, and these created new possibilities for how time was spent and divided.
• Technologies are objects whose value is developed through our perception of their functional or symbolic worth (Baudrillard 2005).
• Moral backwardness: individuals stand vis-à-vis technology as inferior entities who do not question the nature of their social system.
Technological advancement ≠ progress
Model of Regressiveness (Quan-Haase 2016)
• Technology is regressive because, instead of aiming toward moral progress by questioning the present production system with its inequalities, power relations, and injustices, society puts technological progress and failure at the forefront (Baudrillard 2005).
Electronic Waste (Quan-Haase 2016) • It is important to consider the ethical implications of what happens to our gadgets when they are broken, obsolete, or simply no longer fashionable. • Many materials that compose electronics are either toxic or non-degradable. • This waste is often exported to other countries, such as China, that are poorly equipped to deal with toxic materials.
Electronic Waste (Quan-Haase 2016) • Electronic waste, also called e-waste or waste of electric and electronic equipment (WEEE), refers to scrapped electric or electronic devices. • It includes a wide range of discarded household and commercial technologies, such as computers, cellphones, televisions, and batteries.
Electronic Waste • The amount of electronic waste produced annually has steadily risen because of: • Globalization • Development • Population growth • Declining retail prices
A Society of Overload (Quan-Haase 2016) • What to some may seem excessive reliance on technology may simply seem to others like normal, everyday use. • Technology alone does not lead to social change; rather, change involves a coming together of multiple factors. • When we look at how and why our technologies lead to feelings of being overwhelmed, a complex picture emerges.
A Society of Overload, cont'd (Quan-Haase 2016)
• Information overload is the inability to make decisions effectively because of too much information.
• Deterritorialization describes how, in a networked society, we observe collisions of social spheres and social roles.
• Time-space compression results from heavy reliance on technology that allows interactions and the flow of information to occur at a faster pace and without the constraints of distance.
Conclusions (Quan-Haase 2016) • Technologies and technological systems are embedded into the functions of our daily lives. • Our interactions with these devices have become almost second nature to the point that we think nothing of the interplay between ourselves and our instruments. • The question concerning technology is not a call for rejection or abandonment, but one of measured evaluation in order to maintain a healthy social, economic, and political relationship between ourselves and our technologies.
Technology vs Humanity (Gerd Leonhard 2016)
"Technology has no ethics – but humanity depends on them"
A society that:
• will eventually be automated, hyperconnected, virtualized, and uber-efficient,
• sleepwalks and does not pause to consider the consequences for human values, beliefs, and ethics,
• and is steered by technologists, technocrats, venture capitalists, stock markets, and the military
is likely to enter the true machine age.
Ethics (Gerd Leonhard 2016)
• Bio-ethicist Larry Churchill: ethics, "understood as the capacity to think critically about moral values and direct our actions in terms of such values, is a generic human capacity"
2 questions:
• Should we never expect machines or computers to really understand ethics, and therefore be very cautious about their increasing self-learning capacities?
• Or should we try to encode some kind of basic ethics into software and teach our machines to at least understand and respect them – machine ethics?
e.g. care robots & autonomous cars
Asimov's 3 laws of robotics
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
IoT? Smart Cities?
AI, geo-engineering, cognitive computing & human genome editing
"one thing that is apparent…is that intelligent machines will embody values, assumptions, and purposes, whether their programmers consciously intend them to or not. Thus, as computers and robots become more and more intelligent, it becomes imperative that we think carefully and explicitly about what those built-in values are"
Will digital ethicists become as important as data scientists?
Global Digital Ethics Council – Manifesto (Gerd Leonhard 2016)
1. The right to remain natural, i.e. 'merely' biological and organic.
• We must continue to have the choice to exist in an unaugmented state.
• We need to retain the right to work or be employed, use public services, buy things, and function in society without the need to deploy technology with, on, or – most importantly – inside our bodies.
• #WiredOrFired – we may increasingly be forced to wear augmented reality (AR) or virtual reality (VR) glasses, visors, or helmets to qualify for employment, or even worse, be required to use or implant specific 'wetware apps' or BCIs (brain-computer interfaces) as a non-negotiable condition of employment.
Global Digital Ethics Council - Manifesto (Gerd Leonhard 2016) 2. The right to be inefficient if, when and where it defines our basic humanness • We must have the choice to be slower and less capable than technology, and we should never make efficiency more important than humanity. • While it may very soon be vastly more efficient and much cheaper to use digital health diagnostics via platforms like Scanadu rather than to see a doctor every time I have a medical issue or need a checkup, it should not become the only ‘approved’ way to do so. • Do we penalize people who choose to do otherwise, or force compliance upon those that don’t want their health data in the cloud?
Global Digital Ethics Council - Manifesto (Gerd Leonhard 2016) 3. The right to disconnect • We must retain the right to switch off connectivity, to “go dark” on the network, or to pause communications, tracking, and monitoring. • To be independent, self-contained and disconnected at the times of our own choosing is a fundamentally important right because disconnecting allows us to refocus on our unmediated environment and to ‘be in the moment’ which is essential to human well-being. It also reduces the risk of digital obesity and lessens the reach of surveillance. Offline may indeed be the new luxury, but it should also be a basic digital right. • Consider the likelihood that many employers and companies are certain to make hyper-connectivity (AR, VR, and even BCIs) a default requirement in the near future, or that drivers may become liable for ‘unauthorized disconnection’ if they can no longer be tracked on the network.