1024core 2 hours ago

I got a 2026 Model Y recently and tried out FSD. It made enough errors in the first few trips that I am surprised it's being touted as a "robotaxi".

For example: travelling West on 15th street in SF, at Guerrero the leftmost lane turns into left turn only and the Tesla happily continued straight through.

That jolted me out of complacence and the next time it was in the wrong lane, I quickly took over and corrected it. It's happened a few times and I don't use FSD that much.

  • tonfreed a minute ago

    I'm unsurprised by that. I'm really hoping it quickly improves now that more people are using it

  • imoverclocked 2 hours ago

    You have to let it crash a few times so it can trigger an internal review of the route. /s

    Having zero control of the software update process will stop me from ever owning a Tesla.

    • johnasmith an hour ago

      I'm curious, what does having control over the update process give you? Isn't it replacing one unauditable black box system for another? Are you concerned about a regression and don't want to be in the vanguard cohort?

      • imoverclocked an hour ago

        Well, there is the "my car has been disabled at an inconvenient time/location" problem, for one. It would be nice to have more auditability, but I use iOS/macOS/etc so it would be disingenuous to claim that as a show-stopper.

        If by "vanguard cohort" you mean "in the first wave to test the new software," then yes; I don't want to be in that group.

  • nrds 2 hours ago

    As you observed, lane selection is basically the one thing that FSD is completely incapable of. But other things it does well. It's important to note this is completely incompatible with the narrative spun by Tesla haters, that it all comes down to LiDAR. LiDAR cannot help with lane selection.

    • barbazoo 2 hours ago

      How does Waymo compare in these situations?

    • esseph 2 hours ago

      Why does Waymo not have a problem with it? It did really well on dense streets with people barely pulling over to stop and run into a storefront, or picking people up from a restaurant. It would pause for a second, put on its turn signal, and then pull around the stopped car. It did this several times, in fact in spots where I would have waited, because its estimation of distance and obstacles 360 degrees around the vehicle is flat-out better than mine as a human. I was really impressed.

guywithahat 2 hours ago

How were the accidents hidden? It sounds like they were reported to the NHTSA properly, which is how the article knows about them. I wouldn't expect them to email a journalist every time there's an accident

  • ModernMech 2 hours ago

    According to the article, at first the accidents were "hidden" from the reporting system just because Tesla systems were not autonomous enough to qualify under the law.

    But now that Tesla is trying to operate a more autonomous robotaxi service, they're required to report more details about their accidents.

    According to the article, Tesla's competitors (like Waymo) are very forthcoming about the incidents. They are probably following the long tradition in engineering of learning from your mistakes by investigating them thoroughly and doing root-cause analysis.

    Tesla cannot do this, because if they do a thorough root-cause analysis of why their system fails more than others, they will inevitably arrive at the conclusion it's due to the sensor stack being camera-only. And Tesla cannot admit that because Musk can't admit he was wrong.

    So instead they're going down the path of being cagey about the details of their accidents. I don't know how long these reports take to generate but there are 2.5 months worth of reports that have not yet been released.

    Meanwhile, Musk has committed to ditching the safety monitors by the end of the year, and he's not going to be able to do that if Tesla's robotaxi service is unreliable. But he's also not willing to do what it takes to make the service more reliable, which is add LiDAR to the system. So... it will be interesting to see what happens at the end of the year.

    • pitpatagain 22 minutes ago

      It's already clear that there is no possible timeline in which they actually remove safety drivers by the end of the year, it's such a joke.

      The weird thing is that between the extremely underwhelming tiny supervised test they run in Austin and the nonsensical permitting games they want to play in California, they don't really seem like a company that actually wants to launch a robotaxi.

  • pitpatagain 2 hours ago

    "As it does with its ADAS crash reporting, Tesla is hiding most details about the crashes. Unlike its competitors, which openly release narrative information about the incidents, Tesla is redacting all the narrative for all its crash reporting to NHTSA"

  • josefritzishere 2 hours ago

    TLDR the article clearly states that Tesla misclassifies the severity of the accidents and redacts the narratives and disengagement data. Key sentence "Tesla has never released any significant data to prove that its system is reliable."

    • guywithahat 2 hours ago

      I read the article, it's just that the data was never, and is not, hidden. Data for vehicles that aren't fully autonomous isn't released, and they are releasing their fully autonomous data through the proper channels.

      At no point was Tesla ever trying to "hide 3 robotaxi accidents", as the title claims (unless I'm missing something, but I don't think I am)

      • pitpatagain 7 minutes ago

        Read the table of examples in the article. Other companies report crashes with significant detail visible to the public that Tesla is redacting.

        Compare Waymo report:

        "On [XXX] at 10:31 PM PT a Waymo Autonomous Vehicle ("Waymo AV") operating in San Francisco, California was in a collision involving a scooterist on [XXX] at [XXX].

        The Waymo AV was stopped at the curb facing north on [XXX] for a passenger drop-off when the passenger in the Waymo AV opened the rear right door. As the rear right door was being opened by the passenger, a scooter ....

        Waymo is reporting this crash under Request No. 2 of Standing General Order 2021-01. Waymo may supplement or correct its reporting with additional information as it may become available."

        Tesla's report is:

        "[REDACTED, MAY CONTAIN CONFIDENTIAL BUSINESS INFORMATION]"

        Tesla has consistently tried to have it both ways saying they are "not autonomous" and therefore don't have to report, but also then claiming in other contexts that they are driving huge numbers of "autonomous" miles.

        So now they finally do a handful of reports and it's all REDACTED? They are finally doing barely what's required but also not being forthcoming at all.

barbazoo 2 hours ago

> Unlike competitors, such as Waymo, Tesla’s Robotaxi still uses a “safety monitor” who sits in the front seat with a finger on a kill switch ready to stop the vehicle. Despite this added level of safety, Tesla is evidently still experiencing crashes.

> CEO Elon Musk has claimed that Tesla would remove the safety monitor by the end of the year and deliver on its “full self-driving” promises to customers, but he has never shared any data proving that Tesla’s automated driving system is reliable enough to achieve that.

Fricken 3 minutes ago

Magic didn't work, maybe having a Nazi as CEO will help get Tesla to SAE level 4.

its-kostya 2 hours ago

Any "self driving" from Tesla carries a large amount of risk because it uses _only_ cameras. Visual anomalies happen, and without radar/lidar as a second source of truth, the vehicles will always sketch me out. Some say "separate the art from the artist," but at the end of the day, Elon's stubbornness about using only cameras is the reason many people are apprehensive to shell out money for the vehicle, and especially for any autonomous driving capabilities.

Even if future vehicles DID have lidar, every vehicle up to now does not, and therefore will never be truly self-driving. Customers already paid for it with the promise that the vehicle hardware is capable. So either they will have to be refunded, or the cars retrofitted with new sensors - at Tesla's expense, I assume. I still have no idea how they are valued so highly.

  • taylodl 2 hours ago

    My benchmark for full self-driving is simple: if the manufacturer assumes full legal responsibility and liability for the vehicle’s actions, then it qualifies as autonomous. Otherwise, it’s just driver assistance and isn't capable of being an autonomous taxi.

    • jerlam 2 hours ago

      This raises the question of whether the safety monitors in the Robotaxis are also there to take the blame for any of the vehicles' accidents.

      • taylodl 2 hours ago

        I had never even thought of that! I had just assumed that Tesla would be responsible, but you know what they say about "assume!" Holy Cow! Now you've got me thinking there's another reason to not utilize a Robotaxi, buried in the terms and conditions that nobody reads could be a statement where you take full responsibility for everything the Robotaxi does. Yikes!

Simulacra 2 hours ago

Well they're not hidden now?

TheAlchemist 2 hours ago

I don't know when Tesla valuation will crash and Musk will go bankrupt, but once it does it will be one for the ages !

The company is still valued at over $1 trillion, supposedly because they will soon roll out Robotaxis everywhere - to 50% of the US population before the end of the year, according to Musk!

Meanwhile, 3 months after the start of the operation, it's still open only to influencers, running with ~10 vehicles, and operating with a driver in the front seat...

This is so absurd that it could make us forget the 2 million Cybertruck orders, or the fact that all Teslas were supposed to become Robotaxis with an OTA update in 2020.

  • xnx 2 hours ago

    It is ridiculous, but it is even odds that he'll just start promising something else to string people along: robots!, AGI!, Eloncoin!

frendiversity 2 hours ago

I was neutral about Mr. Musk up until now but the more hit pieces come out the more I find myself pricing out a new Tesla with FSD...