dmix 1 day ago

A Chinese TV channel spent a bunch of money doing ADAS tests, and Tesla came out on top of all the Chinese brands, including all the LIDAR systems. Although the tests were all done in the daytime.

https://www.youtube.com/watch?v=0xumyEf-WRI&t=1203s

https://electrek.co/2025/07/29/another-huge-chinese-self-dri...

XPENG (a major Chinese ADAS brand) recently decided to copy Tesla's vision-only + AI world-generation data approach, after originally focusing only on LIDAR https://electrek.co/2026/04/29/xpeng-vla-2-test-drive-tesla-...

There's also been talk of companies pushing a hybrid LIDAR+vision approach using custom hardware, since merging the two data streams is complex. So the answer might eventually land somewhere in between, rather than companies choosing one or the other based on cost.

  • kvuj 22 hours ago

    My god, from this video I learned two things:

    - Tesla's vision only approach seems a lot more competent than the Lidar suites from smaller Chinese makers. Perhaps I misjudged how necessary Lidar was to achieve safe driving.

    - Virtually all of the Chinese cars' infotainment systems were basically a 1:1 copy of Tesla's. I couldn't find any that genuinely tried something unique lol

    • fooker 22 hours ago

      It'll be difficult for any single company to compete with Tesla on scale, and the AI we have so far rewards scale like no other technology before it.

      Yes Waymo exists, but the amount of training data they have is a few orders of magnitude lower.

      • jdlshore 21 hours ago

        And yet Waymo is operating a real commercial service in multiple cities, and Tesla is in just one.

    • guywithahat 22 hours ago

      Yeah, it's interesting hearing their engineering logic: fewer sensor types means less sensor collision and faster iteration, and iteration speed is really what matters. I also think people overhyped lidar because they don't understand it, and human behavior is to associate things we don't understand with magic. It's not magic, it performs poorly in inclination weather and can have issues with resolution over range and data processing (although lidar does do a lot of things well).

      All of this said, once Karpathy left they slowly started looking at adding new sensors (most recently radar), so who knows what the future of Tesla's sensor suite holds.

      • BugsJustFindMe 8 hours ago

        > I also think people overhyped lidar because they don't understand it

        Speaking as a person who understands it extremely well and who has an advanced degree in computer vision, I'm sure that internet randos did, but I promise the people who actually know about the failure modes of the different modalities did not. I don't really expect you to take my word for it, but maybe this will spark an interest in investigating the failure scenarios of 3D reconstruction using cameras in computer vision. Just know that Google is an absolute top tier juggernaut in the CV/ML/AI research world, and they don't use lidar out of ignorance.

        > less sensor collision

        This isn't a real thing for anyone doing a good job. A sensor can be good for a scenario or it can be bad for a scenario. More sensors feeding input only gives you gradations of accuracy instead of binary accuracy. Having gradations of accuracy is an unambiguously good thing. When you only have one sensor, you have no way to know whether in the moment it is feeding you an optical illusion. That's what it means for something to be an optical illusion. But when you have multiple sensors of different modalities, then you have meaningful information about whether local disagreement between the different modalities means that one is better or worse than the other, because you can contextually characterize the failure scenarios of each.
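
        The fusion idea above can be sketched in a few lines. This is a toy illustration, not any shipping stack: `fuse` and `disagreement` are hypothetical names, and inverse-variance weighting is simply the textbook way to turn two noisy readings into a graded estimate plus a cross-modality sanity check.

```python
# Toy sketch of multi-modality fusion: two independent range estimates
# are combined by inverse-variance weighting, giving "gradations of
# accuracy" instead of a single take-it-or-leave-it reading.

def fuse(z_cam, var_cam, z_lidar, var_lidar):
    """Inverse-variance weighted fusion of two independent estimates."""
    w_cam = 1.0 / var_cam
    w_lidar = 1.0 / var_lidar
    z = (w_cam * z_cam + w_lidar * z_lidar) / (w_cam + w_lidar)
    var = 1.0 / (w_cam + w_lidar)  # fused variance is always smaller
    return z, var

def disagreement(z_cam, var_cam, z_lidar, var_lidar):
    """Normalized disagreement between modalities; large values flag a
    likely failure mode in one of them (e.g. a camera optical illusion)."""
    return abs(z_cam - z_lidar) / (var_cam + var_lidar) ** 0.5

# Camera says 40 m (noisy at range), lidar says 38 m (tight):
z, var = fuse(40.0, 4.0, 38.0, 0.25)
```

        The fused estimate lands near the lower-variance sensor, and the disagreement score is the hook for "contextually characterizing the failure scenarios of each."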

        > It's not magic, it performs poorly in inclination weather and can have issues with resolution over range and data processing (although lidar does do a lot of things well).

        Inclement, not inclination. And I hate to be the bearer of bad news, but cameras also do poorly in inclement weather and have issues with resolution over range, and the solutions are identical for both (superresolution, temporal blending, alternate wavelengths, stereo correspondence, etc).
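
        Of those shared mitigations, temporal blending is the easiest one to sketch. A toy version (illustrative names and numbers, not any vendor's actual filter) is just an exponential moving average over per-frame readings, and it applies identically whether the frames come from a camera depth net or lidar returns:

```python
# Toy "temporal blending": an exponential moving average over noisy
# per-frame range readings, smoothing out weather-induced jitter.

def temporal_blend(frames, alpha=0.3):
    """EMA over a sequence of noisy scalar readings; lower alpha
    trusts history more, higher alpha trusts the latest frame more."""
    smoothed = frames[0]
    out = [smoothed]
    for z in frames[1:]:
        smoothed = alpha * z + (1 - alpha) * smoothed
        out.append(smoothed)
    return out

# Readings bouncing around in heavy rain (illustrative numbers):
noisy = [40.0, 35.0, 42.0, 38.0, 39.0]
blended = temporal_blend(noisy)
```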

        Tesla people always say (said?) things like "Well, humans only drive with their eyes, so cars should be able to as well," but that's not an accurate comparison of what humans have versus what Teslas have. Humans have many more sensor modalities than Tesla's cameras provide. Teslas have single-view fixed-focus cameras that, for much of the FOV, can only reconstruct structure from shape assumptions (object detection and classification) and inter-frame changes (optical flow) coupled with sensation of the vehicle's motion. That's all they get. It's not bad at all, especially coupled with advanced machine learning, but you do have more than that, coupled with even more advanced machine learning.

        When you as a human drive, in addition to what Teslas have (you do also have those), you also have binocular stereopsis cues, autofocus lens convergence cues, vehicle-independent motion parallax cues, and the ability to manipulate shade cover so you don't get blinded.

        Are all those extra cues necessary for every scenario? No, obviously not. Do they help though? Yes. Try driving with only one eye open and without moving your body or head at all. You can absolutely do it, but you won't be as good as you would with both eyes open and free movement.
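
        For a sense of what the binocular cue buys, the standard stereo relation Z = f * B / d gives depth from disparity, and first-order error propagation shows why that cue fades with range. The numbers below are purely illustrative (a roughly eye-like baseline, a hypothetical focal length):

```python
# Stereo depth from disparity, and how depth uncertainty grows with
# range. Illustrative parameters only, not any real camera rig.

def stereo_depth(focal_px, baseline_m, disparity_px):
    """Classic pinhole stereo relation: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def depth_error(focal_px, baseline_m, depth_m, disparity_err_px=0.5):
    """First-order propagation of disparity error:
    dZ ~= Z**2 / (f * B) * dd, so error grows quadratically with depth."""
    return depth_m ** 2 / (focal_px * baseline_m) * disparity_err_px

f, B = 1000.0, 0.065  # hypothetical focal length, ~human eye spacing
near = depth_error(f, B, 2.0)   # depth uncertainty at 2 m
far = depth_error(f, B, 40.0)   # depth uncertainty at 40 m
```

        The quadratic falloff (20x the distance, 400x the uncertainty) is why stereopsis matters most at close range, exactly where a missed obstacle is most urgent.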

    • DiogenesKynikos 22 hours ago

      The article notes that these tests were all done in daylight, where Lidar provides less of an advantage.

    • BugsJustFindMe 22 hours ago

      > - Tesla's vision only approach seems a lot more competent than the Lidar suites from smaller Chinese makers. Perhaps I misjudged how necessary Lidar was to achieve safe driving.

      Three things can be simultaneously true:

      * Tesla's cameras are sufficient for some scenarios.

      * Tesla's cameras are insufficient for other scenarios.

      * A system with good data and bad algorithmic processing is still going to be bad. The Chinese vehicles almost always fail the tests because they see the obstacle but drive into it anyway.

  • lern_too_spel 18 hours ago

    The Tesla vehicles failed 4 of the 9 SAE Level 2 tests, during ideal weather conditions. How is it reasonable to operate an autonomous taxi service with these results?

amazingamazing 1 day ago

Neat. I wonder which others will pass. I wonder if Toyota Safety Sense 3 cars will pass too. Speaking of which, it's insane that a Sienna doesn't have that. I wish Tesla made a van instead of the Cybertruck. Americans and their truck obsession…

  • MrBuddyCasino 23 hours ago

    A Tesla van would be amazing. Unfortunately not going to happen.

    • paradox460 23 hours ago

      I keep hoping Edison Motors up in Canada comes out with a van conversion kit like their pickup kit

      • to11mtm 22 hours ago

        ... Do the Edison kits still use the original transmission?

        Cause if not, it would be hilarious to do that to a clapped out van...

        • paradox460 22 hours ago

          Looks like they replace the entire drivetrain. Some older Ford vans, like the '90s Windstars, were built on small truck platforms, so you could probably do one. But I'm not sure they'd sell one to ya

          Still would love to see it. The idea reminds me of farm truck https://okcfarmtruck.com/pages/about

    • dzhiurgis 17 hours ago

      Why would they make something that's like 5% of market share? Tiny car + huge SUV (YL doesn't count) makes far more sense.

laweijfmvo 1 day ago

yet i still can’t use basic autopilot on the highway because it phantom brakes every 2 hours

  • radial_symmetry 1 day ago

    Are you on an extremely old version or something? I have had my Model Y for 5 years and it has only phantom braked once ever.

    • aetherspawn 1 day ago

      Many countries in the world are on a 6+ year old version of Autopilot, yeah..

  • ajross 1 day ago

    So, that's not my experience with current FSD versions. But whatever, sure. Let's accept your data point as measured:

    Every... TWO HOURS?! I mean, come on. Put a camera on yourself or another human driver. There's an unexpected braking event at least that often, almost always in a more dangerous situation. The human failure tends to be failing to detect a real obstacle, vs. slowing for a phantom one.

    This is just too much. If you don't like it don't use it. But to pretend that stomps-the-brakes-every-few-hours is a stop ship kind of safety bug is quite frankly ridiculous.

    • tzs 1 day ago

      > Every... TWO HOURS?! I mean, come on. Put a camera on yourself or another human driver. There's an unexpected braking event at least that often, almost always in a more dangerous situation

      Wait... what are you counting as an "unexpected braking event"? I can't think of anything I do with brakes that would not be counted as ordinary braking that happens anywhere near as often as every two hours.

      • ajross 21 hours ago

        I mean that you stomp on your brakes to avoid rear ending a car you didn't notice stopping. That is objectively more dangerous a situation than "phantom braking" and far, far, FAR more common.

        I know, I know, you're a perfect driver and would never fail to notice an obstacle. But the rest of us aren't.

        • sumeno 21 hours ago

          You should not be driving if you are having to slam on your brakes to avoid rear ending someone every two hours. You are a danger to yourself and everyone around you

        • Dylan16807 21 hours ago

          I, uh, don't think the median driver goes through that. Maybe you tailgate really aggressively?

          • ajross 21 hours ago

            It's like clockwork. Everyone who hates Tesla just so happens to be a preternaturally fantastic driver. Yet everyone I drive with, and me, and everyone around me isn't. I just can't understand it.

            To wit, I, uh, don't believe a word you're saying. No one drives in traffic without an occasional oops. And the people who claim not to are, uh, almost certainly the worst of the bunch.

            • Dylan16807 21 hours ago

              > It's like clockwork. Everyone who hates Tesla just so happens to be a preternaturally fantastic driver.

              I said nothing about Tesla and I said nothing about my driving.

              If you really want to know, I would not claim I'm a better than average driver. I don't think "very rarely has to unexpectedly brake hard" is something you need to be a particularly good driver to accomplish.

        • tzs 19 hours ago

          I'm pretty sure I am in fact only very rarely stomping my brakes: since I drive with one-pedal driving mode on, I use my brakes rarely enough that I notice when I'm using them.

          Maybe I simply don't drive often in circumstances where there are cars suddenly stopping ahead of me? I keep a pretty decent gap between me and the car ahead, unless I'm in stop and go traffic in which case I usually am using adaptive cruise control.

          I've been planning to get an OBD2 scanner and an app that can report details on EV battery health, because a battery health report is the only thing I'd get from the dealer 15 miles away (an hour bus ride away if I don't want to wait there) for scheduled maintenance that I wouldn't get from the independent service center a 10-minute walk away.

          Perhaps I'll try to find one that can also log my brake activity to see if I'm somehow just not noticing a lot of brake stomping.

    • sumeno 23 hours ago

      If you are having unexpected braking events every 2 hours you should be paying better attention to your driving. I go months without them

  • aetherspawn 1 day ago

    Autopilot (no longer for sale) is so unsafe I’m surprised there’s no class action for owners to force Tesla to upgrade it to FSD for free.

    Especially in right hand drive markets (non US) it’s even worse than Toyota’s radar cruise.

    I’ve nearly been killed by it about 5 times because it randomly steers into fences and things. It also randomly fails to change lanes (1 in 100), and then just randomly steers full lock and goes out of control.

    Model 3 - Highland

    • lotsofpulp 23 hours ago

      Autopilot in my 2024 Model Y never changes lanes. That has always been a feature restricted to “Full” Self Driving. Autopilot is just lane assist and cruise control.

      I can’t recall anytime either Autopilot or FSD put me in danger though.

      • aetherspawn 23 hours ago

        The branding is confusing. I’m talking about the paid version of Autopilot, which was for sale for $5000, sometimes called “Autopilot Plus”.

        For right hand drive markets, it seems to be a stripped down version of FSD 10 or 11. It automatically changes lanes, takes corners and highway exits, but does not stop at traffic lights. It drives exactly in the middle of the lane, doesn’t shuffle over for trucks, and is easily confused.

  • amarant 22 hours ago

    Really? That's weird. I owned a Tesla in Sweden for 2 years and had perhaps 3 phantom braking events total. I used Autopilot (not FSD) a lot.

readthenotes1 1 day ago

"four newly integrated advanced safety tests:

    Pedestrian automatic emergency braking
    Lane keeping assistance
    Blind spot warning, and
    Blind spot intervention"

Don't most cars do something like that now? I'm curious what's different between Tesla and, say, a Honda Accord?

  • sumeno 1 day ago

    How much did Honda's CEO give the president in the last election?

    • kyleee 1 day ago

      10% for the big guy, iirc

  • calchris42 1 day ago

    The article is vague, but I suspect this is referring to FMVSS 127, which makes certain active safety features mandatory in 2029 and also increases the difficulty of some of the required test scenarios. The new scenarios require responding from higher initial speeds, which effectively requires longer sensor ranges and/or lower latency.
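
    A back-of-the-envelope calculation shows why higher initial speeds push sensor requirements: the system must detect the target at least reaction distance plus braking distance away. The latency and deceleration figures below are illustrative assumptions, not FMVSS 127's actual parameters:

```python
# Required detection range = reaction distance + braking distance:
# d = v * t_latency + v**2 / (2 * a). Illustrative numbers only.

def required_range_m(speed_kmh, latency_s, decel_ms2=7.0):
    """Minimum obstacle detection range for a full stop."""
    v = speed_kmh / 3.6                       # km/h -> m/s
    return v * latency_s + v ** 2 / (2 * decel_ms2)

low = required_range_m(40, 0.5)    # hypothetical lower-speed scenario
high = required_range_m(100, 0.5)  # hypothetical higher-speed scenario
```

    Because braking distance grows with the square of speed, a 2.5x higher test speed demands well over 4x the detection range at the same latency.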

  • brandonagr2 1 day ago

    There is a big difference between "something like" and actually passing the tests. I would be surprised if any non-vision-based system has the reaction time needed to pass the new pedestrian tests.

epoxia 23 hours ago

I wonder if we're going to see a different spin on Dieselgate in the future, where a car company collects all the data from the NHTSA's test environment through the cars' cameras/sensors and then includes that data in the training datasets for other cars / software updates. (I'm not implying that this happened, but I imagine it will at some point.)

rootusrootus 20 hours ago

Is this only when used with an FSD subscription, since the Model Y no longer has autopilot?

flippyhead 1 day ago

wth man, I was told my Model Y that I bought around 2021 was going to do all this, but it's now too old or something?

  • ricardonunez 1 day ago

    It seems you got musked: overpromised and underdelivered.

    • kyleee 1 day ago

      He at least bought me a horse

  • cevn 1 day ago

    I'm in the same boat, this is a whole thing right now. There is some kinda class action in Europe which will hopefully make them pay up or deliver something useful. I think a Refund plus interest plus a hefty fine for lying would be a good start.

  • emmelaich 23 hours ago

    Apparently some HW3 cars can get it. It's listed as available for my 2022 Model 3 (Australia/Sydney). However, the cost is twice what they charge for HW4, I believe.

    It seems other HW3 cars might get an FSD-lite version. There's no official way to upgrade from HW3 to HW4.

gamblor956 1 day ago

A Tesla still can't detect a motorcycle next to it, so I can't see how it would ace the blind spot warning test.

Any other administration and I would be willing to grant the benefit of the doubt, but Musk's spent a lot of money to corrupt government agencies over the past year and a half so that he could get silly pronouncements that the most dangerous "advanced" driving system in the world is somehow also the safest. (More people have been killed by Tesla's ADAS systems than every other automaker's ADAS systems, in the world, combined.)

  • brandonagr2 1 day ago

    Obviously your priors are wrong: it can ace a blind spot warning test because it can detect a motorcycle next to it.

    • gamblor956 19 hours ago

      As of today it can't. The Uber driver was as surprised as the guy he almost hit.

  • delabay 22 hours ago

    This is obviously a factually incorrect post, as anybody who uses the product can attest.