How should the law deal with the choices car manufacturers program into autonomous vehicles?

Learning Goal: I’m working on a criminal justice writing question and need an explanation and answer to help me learn.

Self-driving cars will ultimately have to make decisions, sometimes life-and-death decisions. Before responding to this post, visit MIT’s “Moral Machine” website: https://www.moralmachine.net/. The Moral Machine is a platform for gathering a human perspective on the moral dilemmas that self-driving cars may face when forced to choose between the lesser of two evils, such as hitting a child dashing into the roadway or swerving into another lane and killing three adults in an oncoming car. Spend 15 minutes deciding the moral dilemmas posed on the site.

How should the law deal with the choices car manufacturers program into autonomous vehicles? For instance, do you think a computer program that compels a self-driving car to kill a jaywalking child instead of three adults in a lawfully operated vehicle should be considered a design defect? From a negligence standpoint, the law judges conduct based upon a reasonable person standard. But reasonable people make different moral judgments. Can we ever impose tort liability based upon moral judgments, particularly those programmed into a machine, that reasonable people can reasonably disagree over?
