Business Ethics CASE STUDY (20 Marks)

Shareholder activism has its roots in the hostile takeover bids of the 1980s, when corporate raiders were viewed negatively by the media and shareholders. By the 1990s, thinking about the best structure for companies was shifting to reflect the idea that shareholders should have more of a say over companies' destinies.

Today, the activists are massive corporations in their own right, and activist funds are often worth more than the companies they target. The amount of capital available to shareholder activists has grown dramatically, including from asset classes with deep reserves, such as pension funds. Institutional investors are working with activists at an increasing rate, and the campaigns are comprehensive: activist shareholders now hire investment banks and the best law firms, outpacing the resources individual companies apply to the same analysis.

The current wave of change is focused on boards. Shareholders want an investor on the board, and it is becoming best practice to welcome someone with investing experience onto the board, in part to benefit from their perspective but also to preempt activist demands for an investor seat. Observers voice a range of concerns: "A lot of people defer to ISS" (ISS is the largest of the activist proxy advisory firms). "A big concern is that companies and activists are fundamentally operating from the perspective of a different timeframe." "We like to build value over a long period of time – to invest in a technology that may not have a payoff for 10 years, or to invest in research and development." "Typically the entry point for activists is sloppy corporate governance." "I see a lot of really good people that don't want to be on boards anymore."

Now that so-called big data is providing access to information that would not previously have been discoverable, what are the ethical boundaries around companies' use of this data? A panel of experts discussed "Ethical Use of Collected Data" as part of the Business and Organizational Ethics Partnership at Santa Clara University's Markkula Center for Applied Ethics. The panelists included Shannon Vallor, associate professor of philosophy at Santa Clara University; Scott Shipman, general counsel and chief privacy officer at Sensity Systems; and MeMe Jacobs Rasmussen, chief privacy officer at Adobe Systems. The panel was moderated by Irina Raicu, director of Internet ethics at the Ethics Center.

Vallor opened the discussion with an explanation of how big data is used and some of the ethical issues it raises. "The applications are virtually boundless, given that consumers are generating and we are collecting and storing unprecedented volumes of data in all sectors: private, public, health care, education, commerce and entertainment," Vallor said. "We are being overwhelmed by the promise of big data to solve persistent challenges in public health, criminal justice" and other areas.

But there are risks as well as benefits to using big data. Privacy is one of the biggest issues, given the volume of data being collected. Consumers are justifiably concerned not only about what the company that collects the data will do with it, but also about how it will be protected from third parties. The ethical issues go well beyond privacy, Vallor said.
"It's easy to lose sight of that because privacy is so significant and challenging." For example, does big data offer a fair distribution of both risks and benefits? Will it exacerbate the digital divide, with consumer needs and desires being determined by what those who own digital devices want? Accuracy and reliability are also issues, especially when institutions make important decisions based on the data. "The principle of garbage in, garbage out applies to big databases as well as small databases," Vallor said. Finally, Vallor said, big data could be used to discriminate, with "analytics as a quick and dirty form of redlining." Certain types of people could be classified as poor risks for employment, health care interventions, or educational services, for example.

Rasmussen, who works closely with Adobe's digital marketing business, pointed to both risks and benefits of big data when describing her job. "We do collect a lot of data, but we retain very limited rights to use that data," she said. "My team works with the business units to understand what they are doing with the data and to advise them on how to develop the products in a way that is privacy-focused." However, Adobe requests permission to aggregate data from its customers, providing information that would not otherwise be available, such as how many PCs, tablets and mobile devices are accessing websites.

Adobe tries to follow the dictum, "Say what you do, do what you say, and don't surprise the user," Rasmussen said. "'Don't surprise the user' sounded really good to me when I first started in this job. But I've learned over the years that transparency is hard when you've got complicated products. Figuring out whether what you're doing will surprise the user is often difficult." "We might draft a privacy policy that says we collect XX and do YY," Rasmussen said. "Right after you publish it, the product team may come to you and say, 'Can we do ZZ as well?' You don't want to stop innovation, but you can't just keep revising your privacy policy."

Shipman, too, has a job that illustrates both the promise and the risks of using big data. Sensity is an Internet of Things company, capitalizing on the transition to LED lighting to create Light Sensory Networks that will allow cities and other entities to deliver both energy-efficient lighting and a real-time, global database of information that lets customers manage and understand their physical environment for greater productivity, efficiency and security. Shipman was hired as chief privacy officer to build and govern a global privacy program, in which he will establish data protection standards and lead industry-wide privacy initiatives.

One of the issues, Shipman said, is defining the audience. For companies like his, the customers who purchase the product are not the only ones affected by it. "To improve operations and efficiency of a city, for example, it's important to understand how city assets are used – roads, parking spots, utilities, etc. This means collecting data on behalf of our customer – the city – from the end user who is not our customer. Often it's the non-customer that is misinformed," he said. "You have to keep those perceptions in mind as well." One challenge is that it's difficult for everyone involved to envision either the benefits or the harms inherent in a use of big data, Shipman said. "Consumers expect companies to innovate.
Seeking permission for every benefit would restrict innovation and limit the benefits."

One question from the audience centered on whether consumers are actually as concerned about privacy as they say they are, given that they often don't bother to use encryption. Raicu said studies have shown people give up less data once they know what is being collected, which suggests they do value privacy. And Vallor said her students are increasingly using apps like Snapchat, which they view as protecting their privacy. "It's not true that people either protect themselves from all harm or don't care about harm," Vallor said. "The fact that there are people out there who are knowingly allowing their data to be collected doesn't mean that those people expect to be harmed and aren't going to be outraged when they are."

On the other hand, Rasmussen said, some people are less concerned the more they learn about how data is collected and used. They may prefer to see ads that are relevant to them, for example, even though it means their online behavior is being tracked. Vallor noted that predicting people's responses will never be fully accurate. As a strategy, not surprising the user "has got its limitations, and then there has to be another strategy: How do we manage the situation when the user is surprised – or when we're surprised?" And Raicu noted that there could be a "chilling effect if people are discouraged from using technology because of privacy concerns."

Answer the following question.

Q1. Debate the dictum, "Say what you do, do what you say, and don't surprise the user," in relation to the ethical use of the big data being collected by companies.