Explore the ethical dilemmas surrounding self-driving cars and artificial intelligence with Dr. Huma Chaudhry. Learn about the moral implications, safety concerns, and societal impacts of autonomous vehicles. Join the conversation on aligning machine ethics with human values.
Dr Huma Chaudhry is the founder of FeelAI, an A.I.-based consultancy, and lectures at the College of Engineering & Science, Victoria University, and at the Melbourne Institute of Technology. Her work focuses on Computer Vision and Artificial Intelligence. Huma will explore the following issues:
• Do automated vehicles, such as cars, have a mind of their own?
• How do we ensure that machine ethics are aligned with human values?
• Discussion scenario: an automated car on the road.
• Ethics vary from culture to culture. Whose ethics define good ethics?
Ethical Machines: Intelligent cars Dr Huma Chaudhry
Self-Driving Cars (aka driverless cars, autonomous cars, robocars)
• Utopian view
• Save lives (1.3 million die every year in manual driving)
• 4 D's of human folly: drunk, drugged, distracted, drowsy driving
• Eliminate car ownership
• Increase mobility and access
• Save money
• Make transportation personalized, efficient, and reliable
• Dystopian view
• Eliminate jobs in the transportation sector
• Failure (even if much rarer) may not depend on factors that are human-interpretable or under human control
• Artificial intelligence systems may be biased in ways that do not coincide with social norms or be ethically grounded
• Security
Lex Fridman (2018), MIT 6.S094: Deep Learning for Self-Driving Cars, Lecture 2 [Presentation Slides]. https://selfdrivingcars.mit.edu
Moral Machines: Self-Driving Cars "We are entering an age in which machines are tasked not only to promote well-being and minimize harm, but also to distribute the well-being they create, and the harm they cannot eliminate. Distribution of well-being and harm inevitably creates tradeoffs, whose resolution falls in the moral domain." Awad, E., Dsouza, S., Kim, R., Schulz, J., Henrich, J., Shariff, A., Bonnefon, J.F. and Rahwan, I., 2018. The moral machine experiment. Nature, 563(7729), p.59.
A massive experiment asked users who a self-driving car should save -- or not -- in various ethical dilemmas. (Moral Machine/MIT) By Carolyn Y. Johnson, October 24, 2018. Imagine this scenario: An autonomous vehicle experiences a sudden brake failure. Staying on course would result in the death of two elderly men and an elderly woman who are crossing on a 'do not cross' signal (left). Swerving would result in the death of three passengers: an adult man, an adult woman, and a boy (right). Should the car stay on course and kill the three pedestrians, or swerve and kill the three passengers?
Questions
• What is an ethical decision in this scenario?
• How do we ensure the car makes an ethical decision?
• In case of an accident, who is held responsible for it?
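The questions above can be made concrete with a toy sketch. The code below is purely illustrative and hypothetical: it encodes the brake-failure scenario as two candidate actions and applies the simplest possible rule, minimising the casualty count. The function name and outcome structure are invented for this sketch; the Moral Machine experiment collected human judgments about such dilemmas rather than prescribing any such utility function.

```python
# Illustrative sketch only. Names and structures are hypothetical, not an
# actual autonomous-vehicle decision system.

def casualty_count(outcome):
    """Return the number of fatalities for a candidate action."""
    return len(outcome["fatalities"])

# The two outcomes from the scenario above.
stay_on_course = {
    "action": "stay",
    "fatalities": ["elderly man", "elderly man", "elderly woman"],
}
swerve = {
    "action": "swerve",
    "fatalities": ["adult man", "adult woman", "boy"],
}

# A naive rule: pick the action with the fewest deaths. Here both actions
# kill three people, so the count alone gives no answer -- exactly the kind
# of tie that pushes the decision into the moral domain discussed above.
options = [stay_on_course, swerve]
best = min(options, key=casualty_count)
tie = casualty_count(stay_on_course) == casualty_count(swerve)
print("tie between options:", tie)
```

The point of the sketch is that a purely numerical rule underdetermines the choice: any tie-breaker (age, rule-following of the pedestrians, passenger vs pedestrian status) is a value judgment, and those judgments vary across cultures, which is precisely what the Moral Machine experiment measured.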