Automated Cars - the future?
Posted by: Don Atkinson on 02 July 2016
The driver of a Tesla car died in Florida in May after colliding with a lorry.
Under scrutiny is Tesla's Autopilot feature, which automatically changes lanes and reacts to traffic.
In a statement, Tesla said it appeared the Model S car was unable to recognise "the white side of the tractor trailer against a brightly lit sky" that had driven across the car's path.
The company said the crash was a "tragic loss".
There was a moral question posed that I was reading about earlier this week...
Should your car risk killing you, to save the life of others?
Eloise posted:There was a moral question posed that I was reading about earlier this week...
Should your car risk killing you, to save the life of others?
If the car could be smart enough to make split-second decisions on the action that would minimise loss of life, I'd say yes, they should do that, even if it means risking its occupants to save others. Of course, human drivers make no such decisions reliably, responding imperfectly to stressful and suddenly changing circumstances.
Don Atkinson posted:The driver of a Tesla car died in Florida in May after colliding with a lorry.
Under scrutiny is Tesla's Autopilot feature, which automatically changes lanes and reacts to traffic.
In a statement, Tesla said it appeared the Model S car was unable to recognise "the white side of the tractor trailer against a brightly lit sky" that had driven across the car's path.
The company said the crash was a "tragic loss".
The graphic I saw showed the tractor-trailer (semi-trailer to non 'mericans) approaching from the opposite direction then turning across the path of the Tesla that was travelling straight (at highway speeds) and which had right-of-way. This is a pretty common collision type that human drivers both frequently cause and fail to avoid. I'm not condemning the car because it wasn't perfect and was unable to cope with an illegal and dangerous turn by another road user. Other people may blame, sorry, "call into question" the Tesla and perhaps not hold the idiot in the truck accountable.
People will make shrill proclamations about the safety of computerised driving aids, failing to recognise that humans are generally pretty terrible at driving, and that things like anti-lock brakes, traction control and airbags (as well as better road engineering) save many, many lives each year in spite of drivers' best efforts to kill each other at a rate higher than they currently do. Maybe the truck needs autopilot.
There has been some suggestion the Tesla driver may have been watching a DVD at the time of the accident. If that was the case, it would likely have impeded his reaction to a potential accident.
GregW posted:There has been some suggestion the Tesla driver may have been watching a DVD at the time of the accident. If that was the case, it would likely have impeded his reaction to a potential accident.
The truck driver made that claim, but admits he didn't actually see it. It isn't possible to watch a DVD/movie on the screen of the Tesla while it is moving.
The article I read in the Grun suggested that rather than the Tesla's screen, it was a portable DVD player. Apparently the Florida Highway Patrol confirmed to Reuters that one was found in the vehicle. I hope it proves not to be the case, that he was watching a movie at the time of the accident.
GregW posted:The article I read in the Grun suggested that rather than the Tesla's screen, it was a portable DVD player. Apparently the Florida Highway Patrol confirmed to Reuters that one was found in the vehicle. I hope it proves not to be the case, that he was watching a movie at the time of the accident.
If the guy was doing that he's an idiot.
winkyincanada posted:GregW posted:The article I read in the Grun suggested that rather than the Tesla's screen, it was a portable DVD player. Apparently the Florida Highway Patrol confirmed to Reuters that one was found in the vehicle. I hope it proves not to be the case, that he was watching a movie at the time of the accident.
If the guy was doing that he's an idiot.
That depends. We should wait to hear what movie he was watching.
joerand posted:winkyincanada posted:GregW posted:The article I read in the Grun suggested that rather than the Tesla's screen, it was a portable DVD player. Apparently the Florida Highway Patrol confirmed to Reuters that one was found in the vehicle. I hope it proves not to be the case, that he was watching a movie at the time of the accident.
If the guy was doing that he's an idiot.
That depends. We should wait to hear what movie he was watching.
Harry Potter, allegedly.
Harry Potter? Then I'd say justice was served. Though I'd tend to view things more leniently if he'd been reading the book rather than watching the movie.
All the promotions I've noticed, about automated cars, imply that the "driver" and other occupants are relieved from all responsibility for highway safety and its consequences. The occupants merely enjoy their free time doing whatever they wish.
ie the car does the thinking and takes whatever action is required to ensure a safe journey for all, including others outside the vehicle.
That includes idiot drivers of other vehicles (which according to winky is everybody), plus cyclists and little children who unexpectedly run out into the road, unaware of the dangers of life.
Perhaps I have been misled by these promotions?
Mercedes have an autopilot system using similar technology to Tesla's. A key difference is that Mercedes require the driver's hands to be on the wheel for the system to function; Tesla does not.
At this point most of these systems are not autonomous; rather, they are a sophisticated cruise control. It could be argued that until vehicles are fully autonomous, the approach of Mercedes is better.
Sophisticated cruise control. Auto-pilot. Do we train drivers of these vehicles how to use these systems and explain their limitations? Should we?
I'm not sure which movie, but I think it was Star Wars, the young Jedi trying to master the Force. Or perhaps Tom Cruise in Top Gun. But the point being, it's difficult for a machine/computer to replace the human element.
A tragic loss, the Tesla incident!
Allante93!
The report I saw suggested the Tesla's auto lane-switching capability had been at fault and hadn't been able to distinguish the white side of the lorry from the bright sky.
I suspect that the radar detection system was aiming too low and so looked under the trailer. Also, if the truck was crossing at right angles to the car's direction of travel, the detection system would only have seen the wheels of the truck for a very brief period, and any braking would have been for a very short period.
Derek Wright posted:I suspect that the radar detection system was aiming too low and so looked under the trailer. Also, if the truck was crossing at right angles to the car's direction of travel, the detection system would only have seen the wheels of the truck for a very brief period, and any braking would have been for a very short period.
Do US trucks have to have side protectors/guards (no idea if there is a proper term) to stop smaller vehicles going underneath like they have to in Europe?
Eloise posted:Derek Wright posted:I suspect that the radar detection system was aiming too low and so looked under the trailer. Also, if the truck was crossing at right angles to the car's direction of travel, the detection system would only have seen the wheels of the truck for a very brief period, and any braking would have been for a very short period.
Do US trucks have to have side protectors/guards (no idea if there is a proper term) to stop smaller vehicles going underneath like they have to in Europe?
In most US films I have seen, trucks don't even have mudguards.
Eloise posted:Derek Wright posted:I suspect that the radar detection system was aiming too low and so looked under the trailer. Also, if the truck was crossing at right angles to the car's direction of travel, the detection system would only have seen the wheels of the truck for a very brief period, and any braking would have been for a very short period.
Do US trucks have to have side protectors/guards (no idea if there is a proper term) to stop smaller vehicles going underneath like they have to in Europe?
If the Fast and Furious movies can be considered the definitive authority on such matters, then no.
Don Atkinson posted:All the promotions I've noticed, about automated cars, imply that the "driver" and other occupants are relieved from all responsibility for highway safety and its consequences. The occupants merely enjoy their free time doing whatever they wish.
ie the car does the thinking and takes whatever action is required to ensure a safe journey for all, including others outside the vehicle.
That includes idiot drivers of other vehicles (which according to winky is everybody), plus cyclists and little children who unexpectedly run out into the road, unaware of the dangers of life.
Perhaps I have been misled by these promotions?
No, not everybody is an idiot behind the wheel, but enough are, such that on average driving is by far the most dangerous thing we do (except for over-eating and not getting enough exercise).
As an aside, there is an infuriating advertisement running here during the Tour de France. A stereotypical young mum is driving her SUV through the 'burbs. She is seen admiring (at length) her well-behaved and perfect children in the back seat via the rear-view mirror, and a smug, self-satisfied, fulfilled smile washes over her face. Her attention is diverted back to the task of driving her two-and-a-half tonne machine when its automatic emergency braking rudely interrupts her day, stopping the car suddenly because someone else's kid has run into the street immediately in front of it.
By the way, Don, what other "promotions" have you noticed regarding automated cars? I guess the example I just gave is one. The implication is that you can safely gaze wistfully rearwards instead of paying attention.
Eloise posted:There was a moral question posed that I was reading about earlier this week...
Should your car risk killing you, to save the life of others?
Absolutely not. The sole purpose of safety technology on a car is to protect its occupants, not to attempt to make subjective altruistic decisions. Anyhow, it's a philosophical discussion and I can't see any practical real world application. Would anyone really buy a car knowing it had safety features that might decide to sacrifice you and your occupants? I certainly wouldn't.
Derek Wright posted:I suspect that the radar detection system was aiming too low and so looked under the trailer. Also, if the truck was crossing at right angles to the car's direction of travel, the detection system would only have seen the wheels of the truck for a very brief period, and any braking would have been for a very short period.
It's also possible that the truck turned in front of the vehicle so suddenly that no evasive action would have been possible, by human or computer. At highway speeds you're covering ground pretty quickly. The computer's reaction time should be quicker than a person's, though, provided it has anticipated or seen the hazard, as you say. Prediction and anticipation are always going to be harder than simple reaction for computers. There are lots of clues for a human driver if they're paying attention: oncoming vehicle slowing (but not enough), in the turning lane, indicator on, smoke from exhaust stacks on downshifts, etc. We pick up on these things all the time, sometimes even subconsciously. The computer will be better at reacting, but perhaps not at anticipating (for a while).
The OP asked, The Future?
I can’t see a future. Once the death toll rises and compensation payouts become astronomical, manufacturers will realise automated cars are going nowhere.
When a human driver makes a mistake and somebody dies, the exact details of the incident are locked in the participants’ brains. The exact sequence of events is not known 100% and culpability is not known 100%.
With an automated car, the exact sequence of events is known 100% and culpability is known 100%. The computer will record sensor inputs and the subsequent actions taken in response to them. You probably won’t need a Philadelphia lawyer to prove the system/software isn’t fit for purpose, with resulting huge compensation payouts.
fatcat posted:
When a human driver makes a mistake and somebody dies, the exact details of the incident are locked in the participants’ brains. The exact sequence of events is not known 100% and culpability is not known 100%.
With an automated car, the exact sequence of events is known 100% and culpability is known 100%. The computer will record sensor inputs and the subsequent actions taken in response to them. You probably won’t need a Philadelphia lawyer to prove the system/software isn’t fit for purpose, with resulting huge compensation payouts.
Black boxes have been mandated on new cars sold in the US since 2014, and many had them prior. The boxes store information on a required 15 variables for the 20 seconds before a crash: things like speed, braking, throttle position, and steering angles. More sophisticated vehicles store more information about the performance of advanced safety features. Still, these variables alone don't necessarily reveal culpability. The driver's permission or a court order is required for authorities to download the data, and then crash experts must attempt to reconstruct the events, assuming the data are adequate to do so. Culpability for an individual accident then becomes a question for the courts. The process is expensive and typically only used when a death occurs. The adequacy of a manufacturer's safety feature system/software seems to me to be an entirely different question that would arise on a different level.
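The pre-crash buffer described above can be pictured as a fixed-size ring buffer that always holds only the most recent window of samples. This is a minimal illustrative sketch, not the actual regulation's variable list: the field names, the 10 Hz sample rate, and the class names are all assumptions chosen for the example.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Sample:
    # Illustrative subset of pre-crash variables, not the full mandated list
    speed_kmh: float
    brake_pedal: bool
    throttle_pct: float
    steering_deg: float

class EventDataRecorder:
    """Keeps only the most recent window of samples, like an EDR's
    pre-crash buffer (e.g. 20 s at an assumed 10 Hz -> 200 samples)."""

    def __init__(self, seconds: int = 20, hz: int = 10):
        # deque with maxlen silently discards the oldest entry on overflow
        self.buffer: deque[Sample] = deque(maxlen=seconds * hz)

    def record(self, sample: Sample) -> None:
        self.buffer.append(sample)  # oldest sample drops off automatically

    def snapshot(self) -> list[Sample]:
        """Freeze the buffer contents at the moment of a crash trigger."""
        return list(self.buffer)
```

A continuously driven car would call `record` on every sensor tick; only when a crash trigger fires does `snapshot` preserve the final 20 seconds for later reconstruction, which is why everything earlier is simply gone.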
fatcat posted:The OP asked, The Future?
I can’t see a future. Once the death toll rises and compensation payouts become astronomical, manufacturers will realise automated cars are going nowhere.
When a human driver makes a mistake and somebody dies, the exact details of the incident are locked in the participants’ brains. The exact sequence of events is not known 100% and culpability is not known 100%.
With an automated car, the exact sequence of events is known 100% and culpability is known 100%. The computer will record sensor inputs and the subsequent actions taken in response to them. You probably won’t need a Philadelphia lawyer to prove the system/software isn’t fit for purpose, with resulting huge compensation payouts.
Why will the death toll rise? Automated cars will do a far better job of not crashing than we do.