
Privacy, Cybersecurity and Data Analytics


Presentation Transcript


  1. Privacy, Cybersecurity and Data Analytics Peter Swire Holder Chair of Law and Ethics Senior Counsel, Alston & Bird LLC Bowles Symposium 2017: “Predictive Analytics and Risk Analytics” Georgia State University November 9, 2017

  2. Overview • Much of the work on analytics examines the benefits of Big Data and Analytics • This session – some major risks that come from Big Data and Analytics • Data security risks • Data privacy risks • Data discrimination risks

  3. Peter Swire Background • As law professor, taught Banking Regulation, Corporations, etc. • President Clinton’s Chief Counselor for Privacy • White House coordinator for the HIPAA medical privacy rule • White House representative for the GLBA privacy rules • One of the first law professors to teach the law of cybersecurity (2003) • Special Assistant to President Obama, National Economic Council (2009–10) • Member of President Obama’s Review Group on Intelligence and Communications Technology (the “NSA Review Group”) • At Georgia Tech • Scheller College of Business • College of Computing • School of Public Policy • Associate Director of Policy, GT Institute for Information Security & Privacy

  4. December 2013: The Situation Room

  5. Case Study: “All Customer Funds” Wealth Management Tool • Hypothetical: “All Customer Funds” (ACF) software tool • Major financial institution • Customers and brokers use the tool • Real-time balance sheet and profit/loss statements • Checking accounts • Savings accounts • Securities • Insurance • Other assets • A wonderful Big Data tool: • Great analytics to help the customers plan their finances • Great analytics to help the company see what is profitable • More data is better! • ACF is the focus for discussion today

  6. Case Study: “All Customer Funds” Wealth Management Tool • Hypothetical: “All Customer Funds” software tool • Major financial institution • Customers and brokers use the tool • Real-time balance sheet and profit/loss statements • Checking accounts • Savings accounts • Securities • Real estate • Other assets • Q: What other data goes into ACF? • A: Information about customers – demographics and habits; set up a profile or survey; spending habits; benchmarking by zip code, profiling based on categories and ratios; data brokers; public records

  7. Information Security • “Big Data” – good • “Big Data breach” – not good • When is more data better for your company? What types of data?

  8. Information Security • Need for information security for your systems • Firewalls – keep intruders away • Intrusion detection – find the intruders if they get in • Comprehensive information security plan and implementation

  9. Information Security & “All Customer Funds” Software • Suppose that hackers have entered the ACF data and copied the data there. Then, they post all of the data on the Internet. • What are the consequences to your customers and your company?

  10. Data Security & “All Customer Funds” • Consequences to the bank or insurance company if the ACF data is hacked and put on the Internet? • Loss of reputation – loss of trust, customers flee; loss of revenue • Legal and financial liability – data breach laws, bank supervisors • Remediation – forensics, pay for an upgrade to the security system • Individual account numbers exposed – hackers drain the accounts, and the company may be liable • Security manager – fired

  11. Data Security & “All Customer Funds” • What consequences to the bank’s customers if their ACF data is hacked and put on the Internet? • Fraudulent account activity – hacker pretends to be Mr. Woodcock and withdraws $$ from the account • Identity theft – impersonate Mr. Woodcock using answers to “secret” questions • What if Mr. Woodcock is a senior government official, of interest to nation-state attackers? Are the defenses ready for that? • What if the CEO of the company has her information revealed to everyone? • These are some of the harms from Big Data Breach

  12. What Are the Information Security Risks? • Study the threat models: • Who has an incentive to attack you? • What do they want? • Criminals and other hackers: • PII (personal data), identity theft • Intellectual property

  13. What Are the Information Security Risks? • Insider threats are a large portion of the risk • Snowden was an “insider” • Role-based access controls and audits • Competitors or nation-state threats: • Is there any organization with advanced cyber skills that has a reason to attack your organization? • Sony, Saudi Aramco – major damage to servers, etc.

  14. Some Information Security Tips for Senior Managers • The friends and family test • Treat the data as you would want it treated for you, your family, and your friends • Or, what if the CEO found out you were doing these things with his or her data? Too risky for that?

  15. Big Data “Lake” or Not? • Limits of the “M&M” defense (hard, crunchy exterior; soft, chewy interior) • Data segregation • One attack doesn’t get everything • Data masking • Most analysts don’t need name & SSN • Data minimization • Costs and benefits from more data
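The masking idea on this slide can be sketched in a few lines of Python. This is an illustrative sketch, not the institution's actual tooling: the record fields and the `mask` helper are hypothetical, and keyed hashing (HMAC) is one common way to produce stable tokens that analysts can join on without seeing the raw identifiers.

```python
import hashlib
import hmac
import secrets

# Secret key ("pepper") held by the security team, never shared with analysts.
PEPPER = secrets.token_bytes(32)

def mask(value: str) -> str:
    """Replace a direct identifier (name, SSN) with a stable token.

    Keyed hashing resists the dictionary attacks that defeat plain
    hashes of low-entropy values like SSNs."""
    return hmac.new(PEPPER, value.encode(), hashlib.sha256).hexdigest()[:16]

# Hypothetical customer record from the ACF database.
record = {"name": "Jane Doe", "ssn": "123-45-6789", "balance": 10_500}
masked = {**record, "name": mask(record["name"]), "ssn": mask(record["ssn"])}
# Analysts still see the balance and can join records on the tokens,
# but never see the underlying name or SSN.
```

Because the same input always yields the same token, analysts can still link an individual's records across tables; only the security team, holding the pepper, can connect a token back to a real person.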

  16. Big Data and Privacy • Privacy perhaps the biggest public policy constraint on Big Data/Analytics • If the data is about people, then likely have privacy issues • Fair Information Privacy Practices • Re-identification

  17. 1998 Privacy Laws • [Slide graphic: countries with comprehensive privacy laws vs. sectoral privacy laws, as of 1998]

  18. 2012 Privacy Laws • [Slide graphic: countries with comprehensive privacy laws vs. sectoral privacy laws, as of 2012]

  19. Fair Information Privacy Principles • Basic FIPPs, for handling PII, first announced in the 1970s: • Notice – tell the customer how data is used, via a privacy policy • Choice – give the customer choice about uses beyond those originally intended • Access – the customer can see financial and other records • Data Security – we have already discussed • Accountability – internal accountability for the organization, and enforcement by law

  20. What Changes with Big Data? • Let’s look at how fair information practices change with Big Data • EU Data Protection Directive & GDPR modeled on the 1980 OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data • Ban on personal data from EU to US and other countries unless provide “adequate” protection

  21. What Changes with Big Data? • Collection Limitation Principle – obtain data by lawful and fair means; where possible, with the knowledge or consent of the data subject • Purpose Specification Principle – use the data as initially intended or for other purposes “as are not incompatible with those purposes” (limit on “secondary uses” of data) • Use Limitation Principle – don’t disclose or use beyond those purposes except with consent or by the authority of law • The big questions: Are Big Data uses a permissible purpose? Do they require consent or legal authority before the Big Data is used?

  22. Privacy and “All Customer Funds” Software • What to do with the database of ACF software? Every asset, liability, and transaction for the financial institution’s customers. • Consider an aggressive strategy: sell all the information from the database to another company, for a large fee • The other company can re-sell the information • The other company can do advanced analytics on the data, and can merge the data with other sources (such as other banks’ data) • Any objections to this strategy? What?

  23. Objections/support - Aggressive Strategy? • What risks accompany the benefits of monetizing the data? • Breach at the recipient/purchasing company? • Before you sell, perform due diligence on security • Contract terms to hold harmless (full transfer of liability) • Security in transit (encryption) • Do we have consumer consent? Need it? Go back and ask customers under new Terms of Service • GLBA opt out • PR problems – ArsTechnica, Wired

  24. Objections to Aggressive Strategy • The ACF database is a “crown jewel” of your company • Competitors may buy it, and try to steal your customers • Outsiders can study your company in detail, and develop business strategies to defeat you (trade secrets) • Customers may object: • Lost business – they prefer another bank that keeps their information private • Losses due to identity fraud • Regulators may object, with possible law suits • Could happen from privacy regulators in US, EU, elsewhere

  25. Other Strategies for Big Data? • Do all analysis in-house • Build capacity in your company for Big Data and advanced analytics • Even inside the company, consider masking names and other identifiers when transfer ACF data to the analytics teams

  26. Other Strategies? • Hire a Contractor to do Big Data and analytics • This company may have expertise you do not have in-house • Write a detailed contract saying what they can and what they can’t do with the data • OK to use for analytics purposes • Not OK to re-sell the data • Maybe OK to merge with data from other companies to create a richer data set, held by the contractor. If so, get better algorithms, but need strict controls against mis-use of the customer data.

  27. Masking Personal Data: De-Identification • Some data in ACF is highly identified – name, identity number, address, phone number • Can “de-identify” the data so that it is no longer linked to a specific, known individual • That is safer, reducing the risk of identity theft • Once aggregated or de-identified enough, the data is no longer regulated by the EU or others
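A minimal sketch of de-identification by generalization, with hypothetical fields (the `generalize` helper and the customer record are invented for illustration): direct identifiers are dropped entirely, and quasi-identifiers like ZIP code and age are coarsened so they describe a group rather than a person.

```python
def generalize(record: dict) -> dict:
    """De-identify a record by generalizing quasi-identifiers:
    drop direct identifiers, coarsen ZIP to a 3-digit prefix,
    and coarsen exact age to a 10-year band."""
    return {
        "zip3": record["zip"][:3],               # "30332" -> "303"
        "age_band": (record["age"] // 10) * 10,  # 47 -> 40
        "balance": record["balance"],            # analytic field kept
    }

# Hypothetical ACF customer record.
customer = {"name": "Jane Doe", "zip": "30332", "age": 47, "balance": 10_500}
deidentified = generalize(customer)
```

The design trade-off is exactly the one the next slide raises: the coarser the generalization, the safer the data, but the less useful it is for analytics.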

  28. Re-Identification • Big Data makes it harder to keep data de-identified • Research studies show how to re-identify • Early study: a medical records database where each person shows only gender, date of birth, and zip code (roughly 10,000 people per zip code). Seems anonymous. • Finding: could re-identify many people • Get a list of all adults in the zip code from the voting records, which contain date of birth and address • Date of birth alone spans 366 days x 80 years = over 29,000 possible values – more values than people in the zip code • Most individuals were uniquely identified!
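The slide's arithmetic can be checked with a small simulation. This is a sketch using the slide's rough numbers (10,000 people per zip code, 366 x 80 possible birth dates, two genders), not the original study's data:

```python
import random
from collections import Counter

random.seed(0)
N_PEOPLE = 10_000       # rough population of one zip code
DAYS = 366 * 80         # possible birth dates over an 80-year span
GENDERS = 2             # so 58,560 possible (birth date, gender) profiles

# Assign each simulated resident a random (birth date, gender) profile.
profiles = [(random.randrange(DAYS), random.randrange(GENDERS))
            for _ in range(N_PEOPLE)]

# Count how many residents share each profile; a count of 1 means
# that profile pins down exactly one person.
counts = Counter(profiles)
unique = sum(1 for p in profiles if counts[p] == 1)
print(f"{unique / N_PEOPLE:.0%} of residents have a unique profile")
```

With roughly 58,000 possible profiles and only about 10,000 residents, most profiles occur exactly once, which is why joining the "anonymous" medical records against a voter list re-identifies most people.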

  29. How Big Data Helps Re-Identification • Usually have many more data points on each individual • Social media, location, public records, more • With good search (Google), a few of those data points might be linkable to a person • If so, can re-identify • Therefore, your business cannot assume that “anonymized” information is really anonymous • May need more privacy controls than you expected around the ACF data

  30. Summary on Information Privacy • Growing number of privacy laws • Big Data has many secondary uses of data – beyond the original purpose and thus possibly a problem under privacy laws • Big Data creates more detailed information linkable to the individual • Re-identification techniques mean you can’t assume the data is anonymous, especially if the data gets posted publicly • Consider whether you need stricter controls around the ACF and other customer and employee data

  31. Another Issue: Discrimination • Big Data enables many new analytic tools: • Which customer will pay a higher price? • Which customer will respond to this advertisement? • How can we lure a customer away from our competitor? • Big Data is generally “neutral” – the algorithms are created to find the profitable action for the company • The algorithms don’t care about the name of the customer

  32. Discrimination • In United States, attention to possibility that the algorithms will “discriminate” on the basis of race or national origin • Example, a lender says the mortgage rate will be 5% on average for one racial group and 6% on average for second group • Can members of the second group complain? Should this be illegal discrimination under the U.S. Equal Credit Opportunity Act?

  33. Discrimination • “Disparate treatment” – the company intentionally treats a customer differently because the customer is African-American, Hispanic, or Asian • Clearly illegal • “Disparate impact” – the statistics show a significantly different outcome, on average • May be illegal • The mathematical algorithms were not designed to discriminate, but the outcomes are different

  34. Protecting Against Discrimination Complaints • Your analytics people can test the outcomes to see if there are statistically large variations based on sensitive categories – race, national origin, gender, other things • U.S. companies are considering having “ethics review boards” on analytics projects, to detect problems and decide when not to use an algorithm with sensitive results.
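One screen such a review could run, sketched here with hypothetical approval counts. The four-fifths ratio used below is a common screening heuristic drawn from U.S. employment guidance, not the legal test under ECOA:

```python
# Hypothetical loan-approval outcomes grouped by a sensitive attribute.
outcomes = {
    "group_a": {"approved": 720, "total": 1000},
    "group_b": {"approved": 510, "total": 1000},
}

# Approval rate per group, then the ratio of the lowest to the highest rate.
rates = {g: d["approved"] / d["total"] for g, d in outcomes.items()}
impact_ratio = min(rates.values()) / max(rates.values())

# "Four-fifths rule" screen: flag for human review if the lowest group's
# rate falls below 80% of the highest group's rate.
flagged = impact_ratio < 0.8
print(f"impact ratio = {impact_ratio:.2f}, flagged for review = {flagged}")
```

A flag is a trigger for review, not proof of illegality: the review board would then ask whether the variation is explained by legitimate, risk-based factors.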

  35. Which kinds of discrimination will be permitted for insurance? • As discussed in the prior session, there is a history of debates in insurance about credit score, race, gender, and other types of information: • Can learn from these prior debates • Sometimes the data produces “efficient” scoring, tied to risk • But, concern about feedback loops: send more police to the precincts that have had previous arrests; that leads to more arrests from the previously-targeted population • The resulting arrests seem risk-based, but the loop is problematic • Distributional effects – consumers can lose consumer surplus while suppliers gain supplier surplus, with only a modest efficiency gain

  36. Summary • Most of the training and research in analytics is on the benefits of Big Data • This session – some major risks that come from Big Data • Information Security risks • Big Data leads to the risk of a big data breach – hackers and others • Information Privacy risks • Growing range of privacy laws globally • Big Data does not fit well under traditional privacy laws • Big Data makes it harder to keep data anonymous • New issues of discrimination are getting discussed • Financial institutions will need subject matter experts to address these risks
