Whole Mars Catalog @WholeMarsBlog on X — 656.9K followers
Created: 2025-06-04 15:46:36 UTC
Bloomberg is out with a new FUD story today, trying to smear Tesla ahead of their Robotaxi launch this month.
To show why FSD will never work, they are highlighting a tragic crash... from November 2023, running FSD 11.4.4.
Really? Forget the next-gen unsupervised model we haven't seen yet; they couldn't even find an example from FSD XX, let alone XX or XX. Anyone using FSD knows the system has gone through a complete rewrite since FSD XX.
The tragic crash Bloomberg describes started in Arizona, where two human drivers crashed into each other on the highway. As a result of that first crash, other cars on the highway began stopping, including one carrying XX-year-old grandmother Johna Story. Story got out of the passenger seat and walked onto the highway, into direct sun glare. Tragically, she was hit by Karl Stock, who was driving a Tesla Model Y with FSD on at XX mph. At that speed, the crash was fatal.
Although the media will rush to point the finger at Tesla, this was a complex situation with many contributing factors. First, if the human-driven crash hadn't occurred, none of the cars would have been stopped on the highway in the first place; if those first cars had been on FSD, this incident may never have happened at all. Second, walking out onto a highway in direct sunlight was a risk that, in hindsight, should have been weighed more carefully: even a human driver might have their vision obstructed by direct sunlight and end up in the same situation. Third, the human driver should have intervened and stopped the car when he saw other cars stopping and people waving at him to stop, which suggests he was not paying attention or was blinded by the sunlight himself. Finally, if FSD Beta XX had been able to recognize the situation itself, this could have been prevented.
In a statement to the police, Stock, the driver who hit Story, said: "Sorry, everything happened so fast. There were cars stopped in front of me and by the time I saw them I had no place to go to avoid them."
Story's family is suing Stock, as well as Tesla.
The fundamental premise of the Bloomberg story is that Tesla couldn't see what happened or respond correctly because the cameras were blinded by sunlight. But this is incorrect. All of the information that the car needed to respond was in the footage, which Bloomberg published. The problem was that the AI, FSD Beta 11, wasn't advanced enough to respond correctly — not that the cameras couldn't see.
First, you can see other cars begin to brake and slow down. Although sun glare is washing out the current lane, seeing other cars slow down is a strong signal that you may need to slow down too. Recent versions of FSD, like 13, would take this as a cue to start applying the brakes.
Next, you see more vehicles on the shoulder with hazard lights on. If that first car wasn't a sign to slow down, by now an advanced AI or a human should be sure that something is wrong. The human driver ignored these signs, and FSD Beta XX did not respond either.
Finally, you can see cars stopped and a person waving on the side of the road. I don't see how you can argue that there wasn't any information coming into the cameras that could have tipped the system off that it needed to slow down. There was plenty.
The Tesla tried to move to the left to avoid the crash, but at that point there were two cars with a person standing in between them and it was too late.
This was, at its core, a distracted-driving crash that resulted from an earlier crash on the highway. It's clear to me that a more advanced vision system like FSD XX, or maybe even FSD XX, could have prevented this crash.
With over X million Teslas on the road, there will be crashes, just as with any other car. But the numbers don't lie — FSD dramatically reduces the number of crashes and fatalities compared to when humans are driving the car themselves.
FSD has now driven around X billion miles. Given that the US sees a traffic fatality roughly every XXX million miles, you would expect roughly XX traffic fatalities while FSD was on, if FSD were no safer than a human driver. Instead, there were just two, meaning XX% of the fatalities you would expect over that driving distance simply didn't happen. Bloomberg will write a FUD story about this crash, but what you won't see anyone write about is the XX fatal crashes that didn't happen because FSD was on.
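The expected-vs-actual math above can be sketched in a few lines of Python. The real mileage and rate figures are scrambled in this capture, so the inputs below are illustrative placeholders (3 billion miles, one fatality per 100 million miles), not Tesla's actual numbers.

```python
# Back-of-the-envelope check of the expected-vs-actual fatality comparison.
# Inputs are hypothetical placeholders, not the post's real (scrambled) figures.

def expected_fatalities(miles_driven: float, miles_per_fatality: float) -> float:
    """Fatalities you'd expect at the baseline human driving rate."""
    return miles_driven / miles_per_fatality

def reduction_pct(expected: float, actual: float) -> float:
    """Share of expected fatalities that did not occur, as a percentage."""
    return 100.0 * (1.0 - actual / expected)

# Hypothetical: 3 billion FSD miles at one US fatality per ~100M miles.
exp = expected_fatalities(3e9, 100e6)   # -> 30.0 expected fatalities
pct = reduction_pct(exp, 2)             # 2 actual fatalities, per the post
print(f"expected: {exp:.0f}, reduction: {pct:.1f}%")  # expected: 30, reduction: 93.3%
```

The structure of the argument is just a rate comparison: any change to the placeholder inputs scales the expected count linearly, while the reduction percentage depends only on the ratio of actual to expected.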
Why did this tragic crash happen? For the same reason as XXXXXX other fatal crashes in the US and XXX million around the world: humans. FSD can't stop every crash, and this was definitely one of the more challenging ones — but it can prevent the vast majority of crashes.
Bloomberg's implication that a vision-based system can't detect this type of crash is easily disproven by their own dashcam footage. They say that cars like Waymo, which carry LIDAR and radar, would be able to see it more easily, but this is pure speculation, as Waymo doesn't run on the highway yet — at least not with real customers (they are testing and giving some rides to employees). The fact is that vision systems are preventing crashes on the highway today, while the other systems are not.
The clowns in the media will be pulling out everything they have to try to smear Tesla and FSD ahead of the Robotaxi launch. But FSD Beta XX is not representative of what FSD (Unsupervised) XX in Austin is going to be able to do.
The data is clear: Tesla FSD is already dramatically reducing expected traffic fatalities. If you think a vision-only system can't handle some sunlight, think again. If this were really a common issue, the media would be able to find an example more recent than FSD Beta XX.
Post Link: https://x.com/WholeMarsBlog/status/1930290096680644733