If you use Tesla's Autopilot on this road you will end up off the road

On Reddit, specifically in a community dedicated to owners and enthusiasts of self-driving cars, several posts have appeared about a mysterious road in Yosemite Valley that is reportedly "not compatible" with Tesla's Autopilot driver-assistance system. According to the early reports, at least five Teslas have crashed for no apparent reason in exactly the same spot. Owners say that, at a Y-shaped fork, the car cannot work out which path to take, resulting in an accident.

This is what happened to Reddit user BBFLG, who owns a 2020 Model X Long Range. According to his account, even though he intervened, he was unable to hold the correct line and ended up inexorably in the scree on the outside of the curve. Unfortunately, despite braking hard, the car also struck a large boulder, which put an abrupt end to the trip.

Local park rangers also noted that the problem has existed for some time, long before it was discussed on Reddit. In short, it appears to be a genuine bug in Tesla's software which, despite the reports and complaints, has reportedly not yet been fixed. Given the problem, it would be sensible for the manufacturer to disable Autopilot on the offending stretch of road while waiting for a proper software patch.

Judging from the Street View imagery, the left branch of the fork appears much clearer and more pronounced than the right one, which may be what misleads the vehicles. This is not the first time similar incidents have been recorded, and delivering 100% reliable software is certainly no easy task for developers, especially considering how much road markings differ from country to country. The obvious question now is: who will pay for the damage? Tesla, the owner, or an insurer?