I find myself asking "how is this legal" several times a day lately. Why does "tech" get to stress-test long-standing rules and norms? It's an attitude of "I dare you to regulate us". I'm remembering a comment on HN from a week or so ago. To paraphrase: "thing, but from the internet".
The article says it “speeds” but provides no sources or evidence, so you might want to take that with a grain of salt.
Here’s a video of the new mode. It seems pretty normal; you wouldn’t bat an eye at a friend driving in the exact same way: https://www.youtube.com/watch?v=C8uIPsaF-yY
Not that I would trust it with my life at this point in time, but the claims do seem exaggerated.
There is effectively 0 traffic to contend with in that video. I do not see how you could reasonably claim that video should alleviate fears about unsafe behavior.
It does not seem to go over the speed limit, even on an empty road?
> Mad Max
Fellas, the "Torment Nexus" tweet was supposed to be a joke.
Either it's overhyped marketing, or Tesla has automated aggressive A-hole driving. I don't like either option.
It's both. It's not the first time. Tesla recalled 50,000 cars because they programmed them to illegally roll through stop signs at up to 5.6 mph (9 km/h). [0]
[0] https://www.tesla.com/support/recall-rolling-stop-functional...
In Sweden, Tesla is the new BMW. It attracts aggressive A-hole drivers, so no need to automate it.
Can you ship some over here? Seems like if I want to get through a light before it turns red, I find myself betting on the lane with the work van or heavy truck in the stack rather than the one with a Tesla somewhere in the mix. In any sane world it'd be the opposite.
Soon enough Tesla will be omitting turn signals to save on manufacturing costs
I think torque is what attracts (or provokes) aggressive drivers.
Good grief. Can't it just be whimsy? Must everything be a conspiracy these days? They have a tunable threshold for when the car will attempt a lane change. Set it too high and it makes too many and annoys the occupants. Set it too low and it fails to choose the right lane and spends more time in traffic. That's all it is.
They just named the low threshold case after a fun driving movie because joke. Is that really so hard to believe?
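A minimal sketch of the kind of tunable threshold being described. Every name and number here is invented for illustration; this is not Tesla's code, just the shape of the tradeoff:

```python
# Hypothetical lane-change threshold: attempt a change only if the predicted
# benefit exceeds some bar. A high bar = few changes ("Chill"); a low bar =
# many changes (the aggressive mode). All values are made up.

def should_change_lane(expected_time_saved_s: float, threshold_s: float) -> bool:
    """Attempt a lane change only if the predicted time saved exceeds the threshold."""
    return expected_time_saved_s > threshold_s

# Hypothetical presets for the same underlying knob.
THRESHOLDS = {"chill": 30.0, "standard": 15.0, "mad_max": 5.0}

# Saving an estimated 10 s isn't worth it to "chill", but is to "mad_max".
print(should_change_lane(10.0, THRESHOLDS["chill"]))    # False
print(should_change_lane(10.0, THRESHOLDS["mad_max"]))  # True
```

The point of the comment above is that both failure modes (too many changes, too few) come from tuning this one knob, not from any grand design.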
“Move fast and break things” is taking on a very literal meaning.
Does this mode comply with state traffic regulations like follow distance and lane change protocol?
Why would you expect it to? Tesla has had (still has?) settings that allow the user to define how much it is allowed to break the speed limit and whether it can run red lights.
How could they think that this is a good idea? Baffling.
Influencers will post about it and that will resuscitate their brand. Somehow.
Isn't it kind of dumb that only after something like this hits world wide distributed news does anyone investigate anything?
As far as I can see they rolled it out without any warning, and, well, presumably without asking the regulators.
The feature only became available this month.
This is one of those things where if we had a normal functioning regulatory environment, Elon Musk would be in jail long before he got this obvious. Having an "Autopilot" mode people get killed by because it doesn't actually drive the car should've been plenty.
Agreed but we are through the looking glass at this point.
There has to be some acceptable maximum rate at which your products can kill people, and Tesla's autopilot is probably below that rate.
You would need more than one rate:
- percentage of product owner operators who are killed by the product they own/operate per year
- number of other people killed by the median product owner per year
- inflation adjusted property damage (belonging to other people, or to the public/govt) caused by the median product owner per year
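A back-of-envelope sketch of the kind of rate arithmetic involved. All figures below are invented for illustration; a real comparison would need the separate owner/other-party/property rates listed above, computed from actual crash data:

```python
# Deaths per 100 million vehicle miles traveled is the usual road-safety unit.
# The inputs here are hypothetical round numbers, not real statistics.

def fatalities_per_100m_miles(deaths: float, miles_driven: float) -> float:
    """Deaths per 100 million vehicle miles traveled."""
    return deaths / miles_driven * 100_000_000

# Hypothetical: ~40,000 deaths over ~3.2 trillion human-driven miles.
human_rate = fatalities_per_100m_miles(40_000, 3.2e12)
print(round(human_rate, 2))  # 1.25
```

A single per-mile rate like this hides who bears the risk (occupants vs. bystanders vs. property), which is exactly why the comment above proposes splitting it into several rates.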
Regulating products based on the potential to kill, maim, or injure, is not a terrible idea.
It’s why we require more training of people who fly 747s than of people who operate cars.
But if it was going to work, we’d have to do it without carve outs - if it only applies to some products, then it’s really just politics.
If Tesla’s auto pilot is really so safe that it needs little or no regulation, then by definition, regular cars are so dangerous that they should be banned or require much more regulation. But I only ever hear the first half of this argument, which makes me worry this is not really an argument about safety.
Things aren’t magically legal just as long as they don’t kill “too many” people.
Things aren't magically illegal just because they sometimes kill people
I keep using this as an example - the Therac-25 machines for radiotherapy undoubtedly saved lives. They also undoubtedly administered radiation treatment better, faster and more accurately than any manual operator could have done.
And yet, they all got recalled when we realized they "sometimes" administer a lethal dose of radiation by mistake. Or do you think they should have continued operating? What was the "acceptable maximum rate at which your products can kill people" for them? Because I'd argue it's zero. And it should be zero for Teslas or any cars that have something called "autopilot".
A surgeon who performed a voluntary operation which caused unpredictable complications leading to death shouldn't necessarily stop operating on other patients. There's a line that has to be drawn somewhere, I'm not going to draw it.
A self-driving car that kills less people per mile than a reasonably selected cohort of human drivers is probably a good thing.
Replace the surgeon with a robotic surgeon operating under some kind of autonomous mode and yes, I think every robot of its kind should be immediately pulled out of use.
>>A self-driving car that kills less people per mile than a reasonably selected cohort of human drivers is probably a good thing.
Hard disagree, and I honestly hate it when people make that argument. The number should be zero.
Autopilot in planes also doesn’t actually pilot the plane gate to gate. That name actually seems consistent with the plane use of autopilot, as Autopilot in a Tesla merely follows the lane markings and keeps distance from the car in front.
Care to list the major functions of autopilot in airplanes and the Tesla equivalent?
I have no expertise in piloting or the details of autopilot in a plane, except that it does not fly the plane gate to gate. It does, however, assist the pilot with mundane tasks, such as cruising at altitude and following a path. Maybe modern ones land/take off, I don’t know, but a pilot is still required for many crucial tasks as far as I know.
Which is what the Autopilot function does in a Tesla, so I find it to not be a misleading name.
> I only have a vague idea of what autopilot does but I can confidently say that Tesla autopilot is the same.
That is not what I wrote, but feel free to interpret it that way if it makes you feel better.
> Having an "Autopilot" mode people get killed by because it doesn't actually drive the car should've been plenty.
Just like you need a pilot paying attention even when a plane is using autopilot, you need a driver paying attention when a Tesla is on Autopilot.
Where is the incongruency?
Except that obviously the difference is that plane pilots are rigorously tested and certified, they have their every action, including their voice, recorded while they operate the plane, and they are held accountable with legally required self-reporting for any mistakes, no matter how innocent. There is no such scrutiny with car drivers; in fact, in some places in the world you can operate a car with zero formal training and a short form test that tests bare fundamentals and nothing else - to expect such a driver to take the same level of care and attention as a commercial pilot when operating a Tesla in autopilot mode is....wishful thinking at best.
But sure, Tesla's autopilot is the same as a plane autopilot in all the other respects.
>to expect such a driver to take the same level of care and attention as a commercial pilot when operating a Tesla in autopilot mode is....wishful thinking at best.
This is irrelevant to the branding. Just as autopilot in a plane assists pilots in ideal conditions, Autopilot mode in a Tesla assists drivers in ideal conditions.
As an aside, Autopilot mode in Tesla monitors the driver’s eyes to ensure they are looking at the road, and quite a few steps are taken to ensure drivers know that it is not a self driving feature, but merely assisted driving. Again, the broader point being that autopilot is not known to fly planes end to end, so there should be no confusion due to the name that Autopilot in a Tesla will drive end to end.
> Again, the broader point being that autopilot is not known to fly planes end to end
Is the public broadly aware of that?
There's a colloquial phrase in American English, "to be on autopilot", meaning when a person acts without awareness of what they're doing, often used when somebody makes a stupid mistake during a lapse of attention.
>Is the public broadly aware of that?
I don’t see why not. I didn’t go to pilot school or have any plane related interests, but from movies and tv shows and the fact that there are 2 or more pilots on every plane, it would be prudent to assume there are limitations.
The colloquialism of a person being on autopilot and making mistakes seems apt here, too. If you use the Autopilot function in the car and don’t pay attention, you will get in trouble.
>>As an aside, Autopilot mode in Tesla monitors the driver’s eyes to ensure they are looking at the road, and quite a few steps are taken to ensure drivers know that it is not a self driving feature
And as many, many, many videos of pornhub attest, you can do plenty of other activities for a long time without autopilot giving a crap. Maybe that's the drivers messing with the sensors somehow, but it's obviously possible.
>>Autopilot mode in a Tesla assists drivers in ideal conditions.
That sounds like an absolute cop out if you don't mind me saying so. It's not how the feature is perceived, and again it goes back to what I said earlier - drivers should have to receive actual, real sit-down-with-a-book training to use this feature.
>And as many, many, many videos of pornhub attest, you can do plenty of other activities for a long time without autopilot giving a crap. Maybe that's the drivers messing with the sensors somehow, but it's obviously possible.
Drivers messing with sensors is irrelevant to Tesla informing drivers of the limitations, which the car clearly does.
>That sounds like an absolute cop out if you don't mind me saying so. It's not how the feature is perceived, and again it goes back to what I said earlier - drivers should have to receive actual, real sit-down-with-a-book training to use this feature.
Doesn’t seem like you have used a Tesla. There is no way a reasonable person can perceive Autopilot as a feature where the car drives itself point to point. Tesla locks you out of Autopilot if you look away too much, and they make it clear how it is gimped in case you want to spend $200 per month for their “Full” Self Driving feature.
Also, plenty of other companies offer the same feature under a different name like lane assist and enhanced cruise control, and they don’t even monitor the driver’s eyes.
>>no way a reasonable person can perceive Autopilot as a feature where the car drives itself point to point
And I hope no one does. But I'm sure we both agree that any reasonable person should be able to expect a Tesla to drive itself on a straight road without driving into a truck stopped sideways on said road. Or not be confused in really weird and unusual situations like driving against the sun on a bright summer day.
>>Drivers messing with sensors is irrelevant
Which again, I have no proof was done in those cases, but it's certainly a trend on social media and on other kinds of websites to show all the activities that you can do while the car is "clearly" driving itself. And even outside of things clearly done for attention, there isn't a lack of reports of people being arrested for reading, watching films, playing games and yes, being fully asleep in Teslas behind the wheel. We're not talking about influencers farming likes, we're talking normal people.
>>Also, plenty of other companies offer the same feature under a different name like lane assist and enhanced cruise control, and they don’t even monitor the driver’s eyes.
Uhm....good? That's great in fact?
> But I'm sure we both agree that any reasonable person should be able to expect a Tesla to drive itself on a straight road without driving into a truck stopped sideways on said road. Or not be confused in really weird and unusual situations like driving against the sun on a bright summer day.
No, which is why it tells you to keep your eyes on the road and pay attention. It’s literally a bunch of cheap cameras and some software trying to draw some lines and keep the car between them and a certain distance behind whatever is in front of it.
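The "draw some lines and keep the car between them" description above can be sketched as two toy proportional controllers. Function names, gains, and geometry are all invented for illustration; real lane-keeping stacks are vastly more complex:

```python
# Toy sketch of the two behaviors described: steering toward the center of the
# detected lane lines, and slowing when the gap to the lead car shrinks.
# All parameters are hypothetical.

def steering_correction(left_line_m: float, right_line_m: float,
                        gain: float = 0.5) -> float:
    """Steer toward the midpoint of the lane lines (car is at lateral 0;
    positive output = steer right)."""
    lane_center = (left_line_m + right_line_m) / 2.0
    return gain * lane_center

def speed_command(gap_m: float, desired_gap_m: float, cruise_mps: float,
                  gain: float = 0.8) -> float:
    """Hold cruise speed, but slow proportionally when the gap to the lead
    car drops below the desired following distance."""
    error = gap_m - desired_gap_m
    return max(0.0, cruise_mps + gain * min(0.0, error))

# Drifted left (lines at -1.2 m and +2.8 m): midpoint is +0.8 m, so steer right.
print(steering_correction(-1.2, 2.8) > 0)        # True
# Lead car too close (20 m gap vs 40 m desired): commanded speed drops below cruise.
print(speed_command(20.0, 40.0, 30.0) < 30.0)    # True
```

The fragility the commenter points to follows directly from this structure: if the cameras fail to detect the lines or the obstacle (a sideways truck, sun glare), these controllers have no error signal to react to.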
>Which again, I have no proof was done in those cases, but it's certainly a trend on social media and on other kinds of websites to show all the activities that you can do while the car is "clearly" driving itself. And even outside of things clearly done for attention, there isn't a lack of reports of people being arrested for reading, watching films, playing games and yes, being fully asleep in Teslas behind the wheel. We're not talking about influencers farming likes, we're talking normal people.
And you can do the same with any other car that has lane assist or whatever feature name that keeps the car in a lane and automatically brakes and accelerates.
> Uhm....good? That's great in fact?
What is the logic here? You are complaining about Tesla Autopilot being unsafe, but also complaining about the thing that makes Tesla Autopilot safer than other automakers’ lane assist/braking feature?