
You don't have to look at other industries. Other auto-manufacturers do their testing on test tracks.



And their self-driving is miles behind Tesla's.


I remember, literally over five years ago, hearing someone at a big auto-manufacturer explain that they can't afford to have their cars known for killing people. They sell a shit tonne of cars, and if those cars start running people over, they're done. It'd be an extinction-level event for their brand, and probably a serious knock to the entire industry. Apparently Tesla is happy to take that risk. It's not that Tesla is more advanced; it's that they're happy making claims that no other company in an industry obsessed with safety would make.

Imagine Volvo, but instead of Volvo you have a company that distinguished itself by its lack of interest in safety.


Having only recently started looking at cars, what I have found interesting is collision detection and automatic braking. Some manufacturers have a reputation for getting it right, and others a reputation for a terrifying feature that drivers disable because it goes off at exactly the wrong time.


I find my father-in-law’s Volkswagen T-Cross terrifying to drive. If it’s not distracting you with shrill warning beeps and bongs, it’s getting confused and slamming on the brakes at every slick or shiny surface. It is unquestionably more dangerous than if it just left the driving to me.

Hard to understand how people have affection for this brand.


...Because they weren't daft enough to commit to deploying black boxes, with no means of formal proof, to a safety-critical operation. Musk's approach is a massive public-safety no-no. Specifying and proving through trials the capabilities of what Musk is aiming for is the work of several lifetimes. Musk and Tesla just fucking YOLO it, yeeting out OTAs that substantially change the behavior of an ill-tuned system whose behavior can't even be reliably enumerated, and sinking the operational risk into drivers on the road.

Sometimes, conspicuous lack of progress is a good thing. It isn't something you necessarily appreciate until you suddenly start having to confront the law of large numbers in a very real and tangible way. Some incremental changes simply are not feasible to take until they are complete. Level 3 automation is one of those...


There is no solution to self driving that doesn’t involve a black box. The safety of the system is easy to measure. When there are fewer interventions than accidents for a solid chunk of time, FSD will be safer. It could eventually reach 1 intervention per hundred thousand accidents, if you would just let them continue.
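The comparison the parent proposes (intervention rate versus accident rate) can be sketched as a toy calculation. The function name and every number below are hypothetical, purely for illustration; this is not how any real safety case is made:

```python
def safer_than_baseline(interventions, fsd_miles, accidents, human_miles):
    """Toy comparison: critical interventions per FSD-driven mile
    versus accidents per human-driven mile."""
    return interventions / fsd_miles < accidents / human_miles

# Hypothetical figures: 3 interventions in 1M FSD miles
# vs. 5 accidents in 1M human-driven miles.
print(safer_than_baseline(3, 1_000_000, 5, 1_000_000))  # True under these made-up numbers
```

Of course, as the replies below argue, a raw rate comparison like this says nothing about who bears the risk while the data is being collected.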


> When there are fewer interventions than accidents for a solid chunk of time, FSD will be safer. It could eventually reach 1 intervention per hundred thousand accidents, if you would just let them continue.

And in the meantime, I and other drivers, cyclists, pedestrians are subject to increased danger for what? Oh, Tesla's profits? Forgive us if we don't all see this as an acceptable tradeoff.


They aren’t in any danger. The guy driving in the video is crazy for not disengaging when the car misbehaves. With a hand brushing the wheel, a person can regain full control of the vehicle well before there is any danger. And yes, I would like to see not only Tesla’s profits go up, because they are the only company doing self-driving; I would also just like to see this project move forward. It’s the coolest project in the world, and if it succeeds it will save millions upon millions of innocent lives.

Furthermore, if you really were so edge-of-your-seat scared of traffic fatalities, then Tesla would be at the bottom of your list. Why don’t you go do something about the droves of people who stream out of the back of bars and into their cars every night? They kill thousands every year, while Tesla has killed roughly zero people.


It really doesn't matter whether the driver should or should not be disengaging: many, many studies categorically show that "allowing the driver to be mostly relaxed, only requiring immediate intervention in dangerous situations" is empirically less safe. You can't just whitewash that away with "oh well, it will get better." When? And don't mention a word about Elon's opinion on when. The guy has been promising "this year" every single year for nine years now. More realistic estimates put this a decade, or two, away at the very earliest, and I have huge doubts that when it does arrive, Tesla will be anywhere near the front.

Their phantom-braking fiasco shows just how horrific Tesla's approach to testing is: throwing multiple releases out into the wild with less than 72 hours between them, for critical safety features. Anyone who claims those releases were subject to any form of rigor in testing whatsoever is deluded, and anyone claiming that testing on public roads is somehow acceptable is equally deluded.

I am very, very well aware of exactly what causes traffic fatalities. According to the software at my work, I have personally responded to 378 fatality MVAs as a paramedic. Please don't assume everyone is ignorant of the realities; we are not blinkered, physically capable of recognizing and responding to only one danger at a time.


You can't ask people to use a driving aid and expect them not to end up less focused. With advertising, "infotainment" (which is really disguised entertainment), music, the outside environment, and passengers, it is already hard for a driver to focus on driving. You can't expect any human being, short of people paid to do so, to keep their hands brushing the wheel and feet ready to slam the brakes.

Having said that, I am not sure most human drivers are any safer for cyclists and pedestrians. FSD is in such a bad state right now that the Tesla drives through the streets at the speed an 80-year-old would. What I saw in a video was a car driving at a similar pace to a cyclist; it was even much slower at intersections.


And however much Tesla likes to say "Oh, yes, yes, the driver should be paying full attention", everything else they say and do says the opposite. Latest example is the update that rearranged some of the climate control and added/updated some larger hot buttons at the bottom of the screen. Not all functions are available to be pinned at the bottom. You get a limited choice, which includes Netflix.

So, to be clear: you can have an always-available hot button for Netflix, but not for climate control. All of Tesla's handwaving is entirely bullshit. "The driver is in the seat for legal purposes only. The car is driving itself."

This is horseshit of the highest order.


> There is no solution to self driving that doesn’t involve a black box.

LIDAR greatly reduces the "black box" necessity. It basically allows rules like "if an object is in the way, then brake or steer away", and the sensor doesn't really fail in good weather.
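The rule-based check described above ("if object is in the way, then brake") can be sketched roughly like this. The point format, coordinate frame, and thresholds are all assumptions for illustration, not any real driving stack's API:

```python
def should_brake(lidar_points, stop_distance=20.0, lane_half_width=1.0):
    """Return True if any LIDAR return lies inside the braking corridor ahead.

    lidar_points: iterable of (x, y) positions in meters,
    x forward, y lateral (an assumed vehicle-centric frame).
    """
    return any(0.0 < x < stop_distance and abs(y) < lane_half_width
               for x, y in lidar_points)

# An obstacle 10 m ahead and near the lane center triggers braking;
# a return 50 m ahead and 5 m to the side does not.
print(should_brake([(10.0, 0.3), (50.0, 5.0)]))  # True
```

The appeal of a rule like this is exactly that it is auditable: each decision can be traced to a geometric condition, unlike the output of an end-to-end neural network.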

Given its safety advantage over DL-only solutions, this should be step 1 toward FSD, not reckless beta-testing with black-box techniques.

Tesla has chosen the cheap way, which is also the irresponsible way.


I'd rather my car's safety systems be later to market but proven safe, than early to market and have me and the others around me as unpaid beta testers.



