
Computer-Mediated Communication


Presentation Transcript


  1. Computer-Mediated Communication
  Hate Crimes, Illegal Activity, and Harassment: Trolling & Harassment / Dark Net

  2. Trolling and the trollers who troll

  3. Phillips – Why We Can’t Have Nice Things
  • “Trolls amplify the ugly side of mainstream behavior”
  • Emotional disengagement: their online self is not their offline self (online role playing)
  • Based on insider/outsider status
  • Real harm is not necessarily the intent (though sometimes it is)
  • Lulz are predicated on asymmetry: there has to be a dupe and an in-group

  4. Is anonymity the problem?
  • Does verified identity solve trolling?
  • Phillips (quoting Jonathan Zittrain): “If the internet is too open, if there are no behavioral checks, and consequently if the average user feels threatened on a regular basis, users will be much more willing to accept harsh preemptive and/or punitive measures.”

  5. Phillips’ conclusions on trolling
  • The problem isn’t anonymity: it’s social norms
  • “Ubiquity of trollish discourse”: points a finger specifically at the media for celebrating trollish activities
  • Suggests that our concern should be weighed against persistence (ephemeral vs. permanent abuse)
  • For less severe cases, look to the community!

  6. Citron: Hate Crimes & Cyberspace
  • In addition to several specific legal interventions (covered in other chapters):
    • Digital gatekeepers are not state actors: what kinds of speech should corporations privilege?
    • Suggests clearer community guidelines for users, and more awareness/training about harassment and stalking issues for employees
    • Transparent, public “due process” procedures
    • Community self-policing

  7. Interpersonal interventions – Twitter (March 2016) vs. Facebook

  8. Twitter flow – Nov. 2016

  9. Interpersonal interventions – Twitter vs. Facebook

  10. Facebook’s Conflict Resolution System

  11. Facebook’s Conflict Resolution System

  12. What is the dark net?
  • Large-scale organizing by criminals or criminal organizations, both in the ‘uncharted internet’ and on the backs of commercial platforms
  • Communities where criminals or others gather to engage in illegal or questionable behavior
  • Webs of online fake news/propaganda or spam created by professional trolls to influence public opinion or game search engines
  • Hidden services accessible only via Tor (see the sketch below)
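To make that last bullet concrete, here is a minimal sketch (in Python) of what “accessible only via Tor” means in practice: the client never resolves the .onion name through the public DNS, but hands the whole request to a local Tor SOCKS proxy. It assumes a Tor client is already listening on its default SOCKS port (9050) and that the requests library is installed with SOCKS support (pip install requests[socks]); the .onion address is a made-up placeholder, not a real service.

```python
# Sketch: reaching a Tor hidden service through the local Tor SOCKS proxy.
# Assumes Tor is running on its default SOCKS port (9050) and requests has
# SOCKS support installed (pip install requests[socks]).
import requests

# 'socks5h' (note the 'h') tells the proxy to resolve hostnames itself,
# which is required: .onion names do not exist in the public DNS at all.
TOR_PROXY = "socks5h://127.0.0.1:9050"
proxies = {"http": TOR_PROXY, "https": TOR_PROXY}

# Placeholder address for illustration only, not a real hidden service.
url = "http://examplehiddenservicexxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxa.onion/"

resp = requests.get(url, proxies=proxies, timeout=60)
print(resp.status_code)
```

Without the proxy, the same request fails immediately, since no public resolver knows the name. That is the sense in which these services are “hidden.”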

  13. Social value of anonymity
  • Key conflict: the core architecture of the internet, and the tools that allow/protect anonymity and privacy, can also enable:
    • Privacy
    • Protest & collective action
    • Reporting on repressive regimes
    • Documenting abuses of power
    • Freedom to speak about controversial topics
    • And much more . . .

  14. The joy of user-generated content
  • ANY TIME you allow users to post unmoderated or anonymous content . . . something bad will happen! See David Byttow on Medium, “The Inherent Problem with Anonymous Apps”: https://medium.com/@davidbyttow/the-inherent-problem-with-anonymous-apps-2795ef0c1855#.xct17t2kh

  15. What kinds of bad things, you ask?
  • Content defined as illegal in the US (primarily child pornography)
  • Every form of sexualized content imaginable
  • Violence (both simulated and real), including terrorist activity
  • Interpersonal attacks (e.g., trolling, revenge porn, harassment)
  • Spam and social botnets

  16. The Internet’s toxic waste
  • Someone has to clean it up (content moderators), often at large cost and personal pain
  • In the US, private firms are put in the position of policing the limits of free speech; in other countries, they are often ordered to remove content that might pass First Amendment muster here
  • Adjudicating the grey areas (breastfeeding?): the sketch below shows why this is hard to automate
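To make the grey-area problem concrete, here is a deliberately naive keyword filter of the sort a platform might reach for first. The blocked-term list and sample posts are invented for illustration, not any platform’s actual policy; the point is that simple matching flags benign health and safety posts, which is exactly why human moderators end up adjudicating case by case.

```python
# Deliberately naive moderation filter, for illustration only.
# The blocked-term list and sample posts are invented examples.
BLOCKED_TERMS = {"nude", "breast", "attack"}

def naive_flag(post: str) -> bool:
    """Flag a post if any word in it matches a blocked term."""
    words = post.lower().split()
    return any(term in words for term in BLOCKED_TERMS)

posts = [
    "Tips for new mothers on breast feeding in public",
    "Our clinic offers free breast cancer screenings",
    "Heart attack warning signs everyone should know",
]

for post in posts:
    print(naive_flag(post), "-", post)

# All three benign posts print True: keyword matching over-blocks
# exactly the grey areas that need human judgment.
```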

  17. No bad tools, only bad users?
  • Commercial providers/platforms are forced to deal with abusive & illegal behavior
  • Two major factors enable this:
    • Free services (low barriers to sign-up, little or no authentication, no cost)
    • Anonymity (whether by design, or through the user’s exploitation of loopholes in the system, e.g. IP masking, throwaway email accounts, etc.)

  18. Botnets, fake news, and trust, oh my!
  • Social botnets: impersonating humans vs. blanket spamming (a toy detection heuristic is sketched below)
  • Deceiving individuals: search results, legitimate-appearing web/news sites
  • Social credibility: your friends as the disseminators of disinformation
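As a rough illustration of the “blanket spamming” end of that spectrum, here is a toy heuristic for flagging accounts that post identical text at machine-like rates. The thresholds, function name, and data are invented for this sketch; real detection is far more sophisticated, and real social bots that impersonate humans defeat exactly this kind of check, which is what makes them a trust problem.

```python
# Toy heuristic: flag accounts that post too fast and too repetitively.
# Thresholds and data are invented for illustration.
from collections import Counter

def looks_like_spam_bot(timestamps: list[float], messages: list[str],
                        max_rate_per_min: float = 10.0,
                        max_duplicate_share: float = 0.8) -> bool:
    """Flag an account whose posting rate and repetition both exceed limits."""
    if len(timestamps) < 2:
        return False
    minutes = (max(timestamps) - min(timestamps)) / 60 or 1e-9
    rate = len(timestamps) / minutes                      # posts per minute
    top_count = Counter(messages).most_common(1)[0][1]    # most repeated post
    duplicate_share = top_count / len(messages)
    return rate > max_rate_per_min and duplicate_share > max_duplicate_share

# Fifty identical posts in under a minute: flagged.
print(looks_like_spam_bot(list(range(50)), ["BUY NOW!!!"] * 50))  # True
```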
