
Agenda for 28th Class

Review of the legal landscape surrounding autonomous vehicles and liability issues. Discussion of case studies involving accidents, negligence, and manufacturer liability. Exam format and details provided.


Presentation Transcript


  1. Agenda for 28th Class • Handouts • Slides • Autonomous Vehicles • Liability • Trolley & Transplant Problems • Exam • Monday, May 6, 2-4PM, this room (Law 223) • Format similar to midterm • Open book • Multiple choice, short answer question(s), essay(s) • Will cover whole semester, with emphasis on second half

  2. Review of Last Class I • Autonomous vehicles • Light regulation so far • Federal and state governments want to encourage new technology • No federal legislation yet • But probably some soon • Preemption of state regulation of design • Research, reporting, education • State regulation of testing

  3. Liability for Auto Accidents • Suits against driver • Negligence: Driver is liable only if negligent (i.e., at fault) • Effect of victim negligence (fault) varies by state • Contributory negligence: If victim was negligent, driver is NOT liable • Pure Comparative Negligence • Driver’s liability is reduced depending on “relative fault” • Modified Comparative Negligence • If victim was less than 50% negligent: comparative negligence • If victim was more than 50% negligent: contributory negligence • Suits against manufacturer • Strict liability for manufacturing defects (except negligence in 4 states) • Risk/utility (~ negligence) for design defects • Liability if there was a safer “reasonable alternative design” • Except, in some states, liability depends on “consumer expectations” • Effect of victim negligence varies by state • No effect OR comparative negligence OR contributory negligence
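The interaction between these victim-fault regimes is easiest to see with concrete numbers. Below is a minimal, illustrative sketch (in Python) of the three apportionment rules described on this slide. The $100,000 damages figure, the 25% victim fault, and the 50% bar are hypothetical assumptions for illustration, not facts from the slides, and some states draw the modified-comparative bar at 51% rather than 50%.

    # Illustrative sketch only: how victim fault affects recovery from the driver
    # under the three regimes above. All figures are hypothetical assumptions.
    def recovery(damages, victim_fault, regime):
        """Return what the victim can recover, given total damages (dollars),
        the victim's share of fault (0.0 to 1.0), and the state's regime."""
        if regime == "contributory":
            # Any victim negligence bars recovery entirely.
            return 0 if victim_fault > 0 else damages
        if regime == "pure_comparative":
            # Recovery is reduced in proportion to the victim's relative fault.
            return damages * (1 - victim_fault)
        if regime == "modified_comparative":
            # Below the bar (50% assumed here), reduce proportionally;
            # at or above the bar, no recovery.
            return damages * (1 - victim_fault) if victim_fault < 0.5 else 0
        raise ValueError(f"unknown regime: {regime}")

    # Victim with $100,000 in damages who was 25% at fault:
    for regime in ("contributory", "pure_comparative", "modified_comparative"):
        print(regime, recovery(100_000, 0.25, regime))
    # contributory -> 0, pure_comparative -> 75000.0, modified_comparative -> 75000.0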

  4. Liability for Autonomous Vehicles • 4) A fully autonomous vehicle owned by Prof. Erman collided with a tree. Prof. Erman was a passenger in his vehicle. He was severely injured, and his vehicle suffered significant damage. The accident could have been avoided if the vehicle had a higher-quality sensor that worked better in cloudy conditions. That sensor was available as a $2000 option, which Prof. Erman did not buy. Is the manufacturer liable under current law? Should it be liable?

  5. Liability for Autonomous Vehicles • 5) A fully autonomous vehicle owned by Prof. Erman collided with a conventional vehicle driven by Prof. Gross. Prof. Erman was a passenger in his vehicle. He was not injured, and his vehicle suffered only minor damage, but Prof. Gross suffered severe injuries, and her car was totaled. Prof. Gross was driving in a responsible, non-negligent manner. Prof. Erman’s vehicle collided with Prof. Gross’s vehicle because its sensors had not been properly recalibrated during a regular service recommended in the Owner’s Manual. The manufacturer viewed service as a profit center and charged $1000 to recalibrate the sensors. Wilton’s, a local service station unaffiliated with the car’s manufacturer, charged only $500 to recalibrate the sensors, but did so improperly. Wilton’s does not have sufficient assets or insurance coverage to fully compensate Prof. Gross. Who is liable for Prof. Gross’s injuries under current law? Who should be liable?

  6. Liability for Autonomous Vehicles • 6) A fully autonomous vehicle owned by Prof. Erman collided with a conventional vehicle driven by Prof. Gross. Prof. Erman was a passenger in his vehicle. He was not injured, and his vehicle suffered only minor damage, but Prof. Gross suffered severe injuries, and her car was totaled. Prof. Gross was driving in a responsible, non-negligent manner. Prof. Erman’s vehicle collided with Prof. Gross’s vehicle because it lacked a sensor that could have been included in the car for an additional $10. Even without that additional sensor, Prof. Erman’s car was safer than a conventional car. Is Prof. Erman and/or the manufacturer of his vehicle liable under current law? Should Prof. Erman and/or the manufacturer be liable?

  7. Liability for Autonomous Vehicles • 7) A fully autonomous vehicle owned by Prof. Erman collided with a conventional vehicle driven by Prof. Gross. Prof. Erman was a passenger in his vehicle. He was not injured, and his vehicle suffered only minor damage, but Prof. Gross suffered severe injuries, and her car was totaled. Prof. Gross was driving in a responsible, non-negligent manner, but experts testified that, if Prof. Gross had been driving an autonomous vehicle, the accident would not have happened, because an autonomous vehicle would have sensed the problem earlier than Prof. Gross and would have applied the brakes more quickly. Prof. Erman’s vehicle had the safety features and sensors typical of autonomous vehicles and was well maintained, but additional sensors, costing $10,000, would have enabled his car to prevent the accident. Is Prof. Erman and/or the manufacturer of his vehicle liable under current law? Should Prof. Erman and/or the manufacturer be liable?

  8. Liability for Autonomous Vehicles • 8) Prof. Erman owns a vehicle with SAE Level 3 automation. That is, “Driver is a necessity, but is not required to monitor the environment. The driver must be ready to take control of the vehicle at all times with notice.” A sign on the dashboard reminds all drivers that they must be ready to take control at all times when the car’s driver alert alarm sounds. Prof. Erman has owned the car for over a year and has driven it more than 15,000 miles. During those 15,000 miles, the car worked flawlessly and the driver alarm never sounded. Prof. Erman grew quite confident in his car’s abilities. He decided to make a nighttime trip from Los Angeles to San Francisco. As planned, he slept while the car was on the 5 freeway. Midway through the trip, the driver alert alarm sounded, because the car’s sensors indicated a situation that the automated system didn’t know how to handle – construction that required all drivers to divert into a lane ordinarily reserved for vehicles going the opposite direction. Prof. Erman woke up and tried to assess the situation, but the car crashed into construction workers who were repairing the road before Prof. Erman could react. Is Prof. Erman and/or the manufacturer of his vehicle liable under current law? Should Prof. Erman and/or the manufacturer be liable?

  9. Trolley Problem • Suppose you are the driver of a trolley. The trolley rounds a bend, and there come into view ahead five track workmen, who have been repairing the track. … You can turn the trolley onto [a spur] and thus save the five men on the straight track ahead. Unfortunately, … there is one track workman on that spur of track. He can no more get off the track in time than the five can, so you will kill him if you turn the trolley onto him. Is it morally permissible for you to turn the trolley? …. • 1. In the first problem described above (“The Trolley Problem”), do you think it is morally permissible to turn the trolley? Is it morally obligatory to do so? Explain your reasons. Also set out at least one counter-argument.

  10. Transplant Problem • Imagine yourself to be a surgeon.... At the moment you have five patients who need organs. Two need one lung each, two need a kidney each, and the fifth needs a heart. If they do not get those organs today, they will all die; if you find organs for them today, you can transplant the organs and they will all live. But where to find the lungs, the kidneys, and the heart? The time is almost up when a report is brought to you that a young man who has just come into your clinic for his yearly check-up has exactly the right blood-type, and is in excellent health. Lo, you have a possible donor. All you need do is cut him up and distribute his parts among the five who need them. You ask, but he says, "Sorry. I deeply sympathize, but no." Would it be morally permissible for you to operate anyway? • 2. In the second problem described above (“The Transplant Problem”), do you think it is morally permissible for the surgeon to “cut up [the young man] and distribute his parts among the five who need them”? Is it morally obligatory to do so? Explain your reasons. Also set out at least one counter-argument.

  11. Questions • 3. Suppose an automobile manufacturer is writing the software for autonomous vehicles in various situations. Consider the following situation. The vehicle is driving the speed limit, which is 75 miles per hour. All of a sudden, the vehicle perceives that someone is in the road in front of the car. The only way to avoid hitting the pedestrian would be to swerve into the lane of oncoming traffic. In that lane, there are five cyclists, and they would be killed if the autonomous vehicle swerved. Is this situation more like the Trolley Problem or the Transplant Problem? Or is it significantly different from both? What should the automobile be programmed to do in this situation? • 4. Does your answer to question #3 depend on why the pedestrian was walking on the street? For example, would it matter if the pedestrian was jaywalking or walking in the street only because construction on the sidewalk forced her to?
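Because question 3 frames this as a software-design choice, it may help to see how starkly that choice shows up in code. The sketch below is purely illustrative: the function, its inputs, and the tie-breaking rule are hypothetical assumptions, not an actual or recommended manufacturer policy. It simply shows that whatever moral view one takes (minimize casualties, never actively redirect harm, discount jaywalkers, and so on) has to be written down as an explicit rule before the accident happens.

    # Purely illustrative, hypothetical policy sketch; not any real vehicle's logic.
    def choose_maneuver(stay_course_casualties, swerve_casualties,
                        prefer_inaction_on_tie=True):
        """Pick 'stay' or 'swerve' given estimated casualties for each option."""
        if stay_course_casualties < swerve_casualties:
            return "stay"
        if swerve_casualties < stay_course_casualties:
            return "swerve"
        # Equal expected harm: a rule against actively redirecting harm
        # (one intuition behind the Trolley/Transplant contrast) favors staying.
        return "stay" if prefer_inaction_on_tie else "swerve"

    # Question 3: one pedestrian ahead, five cyclists in the oncoming lane.
    print(choose_maneuver(stay_course_casualties=1, swerve_casualties=5))  # -> stay

Whether inputs such as the pedestrian's jaywalking (question 4) should change the result is exactly the kind of judgment the programmer, not the car, has to make in advance.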

  12. Questions • 5. Suppose an automobile manufacturer is writing the software for autonomous vehicles in various situations. Consider the following situation. The vehicle is driving the speed limit, which is 75 miles per hour. All of a sudden, the vehicle perceives that there are 5 cyclists in the road in front of the car. The only way to avoid hitting the cyclists would be to swerve into the lane of oncoming traffic. In that lane, there is a pedestrian who would be killed if the autonomous vehicle swerved. The pedestrian is in the road because construction forced her off the sidewalk. Is this situation more like the Trolley Problem or the Transplant Problem? Or is it significantly different from both? What should the automobile be programmed to do in this situation?

  13. Questions • 6. Suppose an automobile manufacturer is writing the software for autonomous vehicles in various situations. Consider the following situation. The vehicle is driving the speed limit, which is 75 miles per hour, and there is only one person in the car. All of a sudden, the vehicle perceives that there are 5 cyclists in the road in front of the car. There are a number of cars in the lane of oncoming traffic, so swerving into that lane would likely kill many people as well. The car could, however, swerve off the road. Unfortunately, there’s a cliff, so if the car swerves off the road, the occupant will be killed. What should the automobile be programmed to do in this situation? • 7. How do you think consumers would want the car to be programmed in the situation described in Question 6? If you were advising an auto manufacturer, would you advise it to program the vehicle in the way you predict consumers would want? If not, why not? Should the law require automobiles to be programmed in a particular way in the situation described in Question 6?

  14. Questions • 8. Suppose an automobile manufacturer is writing the software for autonomous vehicles in various situations. Consider the following situation. The vehicle is driving the speed limit, which is 75 miles per hour. All of a sudden, it perceives that there is an elderly lady in front of the car. The only way to avoid hitting the elderly lady would be to swerve and hit a 20-year-old college student. Both the elderly lady and the college student are walking in the road because construction has forced them off the sidewalk. What should the automobile be programmed to do in this situation?
