- cross-posted to:
- technology@lemmy.world
cross-posted from: https://derp.foo/post/81940
There is a discussion on Hacker News, but feel free to comment here as well.
Current laws in most places still require a licensed human behind the wheel of a self-driving car (and I don’t see that changing any time soon, given how bad self-driving cars still are). That makes the human driver, who is supposed to intervene in these scenarios, responsible.
Once we remove the human override, I would consider a self-driving car that breaks the law to be a faulty product, possibly requiring a recall if it happens often enough. If any other part of the car were prone to breaking, you’d demand a recall too.
As for the fines, you’d probably see something like “the driver receives the fine, but they can hold the company that sold them the car liable for a faulty product”.
Fining the manufacturer directly is a nice idea, but if Tesla does go bankrupt, where do we send the fines then?
I’m pretty sure there are fully autonomous cars driving around San Francisco, and there have been for some time.
EDIT: Here’s an uplifting story about San Francisco-ians(?) interacting with the self-driving cars.