Even if the driver is fully responsible, the assistance software must work properly in all situations, and it must be thoroughly tested.
If the software makes a severe mistake without warning, ordinary drivers may have no chance to regain control. Ordinary drivers are not trained test drivers.
The article keeps calling it “Autopilot”, which is different from “Full Self Driving”.
If they are correct, then it’s all on the driver. Autopilot is just a nicer adaptive cruise control and should be treated as such. Many cars have it, even non-smart vehicles. Even my seven-year-old Subaru had something similar (much dumber, but similar).
That being said, people seem to confuse the names of these different features all the time, including throughout this thread. However, even if they were confused and meant FSD, my car gives feedback requiring your hands on the wheel, so I don’t understand how you can claim ignorance.
No. That distinction is meaningless, since both systems provide SAE Level 2 autonomy. The driver’s responsibilities are exactly the same.
My morality says both are accountable: the driver and Tesla. Tesla for damage caused by its system, and the driver if he does not retake control of the vehicle when given the chance.
Imagine you are driving along a straight road with little traffic, the speed limit is high, and you are enjoying it. Suddenly your assistance software decides to turn your steering wheel hard to the left.
You will have no chance.
What have you done wrong? What are you accountable for?
For mine
So did the car think there was an impending collision? That should be obvious in the logs, and it’s the only legitimate reason for a sudden maneuver.
Cars do not think LOL
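To make the “obvious in the logs” point concrete, here is a minimal sketch of what checking for that would look like: scan telemetry for abrupt steering inputs and see whether a collision warning was active at the same moment. The CSV layout and field names (timestamp, steering_angle_deg, collision_warning) are assumptions for illustration only; real vehicle logs are proprietary and not publicly documented.

    # Hedged sketch: flag abrupt steering changes in a hypothetical telemetry
    # CSV and report whether a collision warning was logged at that instant.
    import csv

    ANGLE_SPIKE_DEG = 30.0  # assumed threshold for a "hard" steering input

    def find_sudden_maneuvers(path):
        """Yield (timestamp, delta, warning_active) for abrupt steering changes."""
        with open(path, newline="") as f:
            rows = list(csv.DictReader(f))
        for prev, cur in zip(rows, rows[1:]):
            delta = abs(float(cur["steering_angle_deg"])
                        - float(prev["steering_angle_deg"]))
            if delta >= ANGLE_SPIKE_DEG:
                yield cur["timestamp"], delta, cur["collision_warning"] == "1"

    if __name__ == "__main__":
        for ts, delta, warned in find_sudden_maneuvers("telemetry.csv"):
            cause = "collision warning active" if warned else "no warning logged"
            print(f"{ts}: steering jumped {delta:.1f} deg ({cause})")

Under these assumptions, a hard-left event with no collision warning logged would support the “software made a severe mistake” reading, while one with a warning active would at least show the system believed it was avoiding something.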