Tesla Vision fails as owners complain of Model 3 cameras fogging up in cold weather::A number of Tesla owners have taken to Reddit after their front cameras fogged up and stopped working in cold weather, leaving several features, including the US$10,000 FSD Beta, inoperable. Tesla has declined to assist these customers, despite many of their vehicles being covered under warranty.
Every other car uses LIDAR and Elon thinks he’s such a forward thinker for shunning it. So dumb
Radar. Only a small handful of cars have LIDAR. But your point still stands. Outside of Elon being a humongous douche and completely unpredictable, the lack of sensors is the major reason for not wanting a Tesla.
Very few production vehicles have lidar. Like almost zero
The driving assist features of my Honda CR-V also stop working whenever there’s snow or ice on the front of the car. Bad design for cold climates is not just a Tesla issue.
His argument makes sense. Human vision is not too different from just a camera. I see the argument for lidar but it can also be a bit more expensive to accomplish the same task. I’m open to listening to your argument as to why lidar technology would be a better path for self driving cars.
Oh yeah, human vision also causes people to mistake a blue truck for the sky and drive right into it. /s
It was a white/gold truck, not a blue/black truck…
Hah even worse
That’s literally what happened.
Literally yes? Humans hit way dumber shit every single day.
Sure but usually because they weren’t looking or couldn’t see it…not because they mistook a truck for the sky or some of the other dumb shit computer vision algorithms do.
Not seeing something and mistaking something for another thing are pretty different problems. One can be corrected with glasses while correcting the other requires a brain transplant (or a brain in the first place).
Edit: or, ya know, adding another sensor would work and make it so the vision system wouldn’t have to be so good at object recognition and could just not hit things…but we can’t add the couple hundred dollars worth of parts for that.
The obvious argument is that eyes are far from perfect and fail us all the time, especially when going fast. We are quite good at making up for it, but saying “We have eyes so my self driving cars will have eyes too” is pretty fucking dumb.
We also recognized that we need to keep our windshields clear of fog in order for our eyes to work properly.
That argument doesn’t make sense because human vision isn’t that great either. When it’s dark or raining or snowing or foggy our vision is pretty shit.
I’m not saying LIDAR is better but rather pointing out that you actually want different types of sensors to accurately assess the traffic, because just one type of sensor isn’t likely to cut it. If you look at other manufacturers they’re not using only LIDAR or only camera. Some use LIDAR + camera, some use RADAR + camera, some use LIDAR, RADAR and camera. And I’m pretty sure that as manufacturers aim for higher SAE levels they will add even more sensors to their cars. It’s only Tesla who thinks they can somehow do more with less.
I think it’s undeniable the combination of camera and lidar will be the best solution. I just hope this can be cost effective. Maybe over time we’ll be able to adapt and improve the technology and make it more economical so that it is safer for our roads.
People here have no idea what they are talking about, or how absurdly difficult it is to actually deploy lidar to a consumer vehicle. There’s a reason why Tesla is shipping more advanced driver assist tech than anyone else, and it’s because they went against the venture capitalist Lidar obsession which is holding everyone back. There’s a reason why there are basically zero cars shipping with lidar today.
You don’t need mm depth maps to do self driving. Not that you get that from lidar on rough roads anyway.
There are some test cars with lidar. They have the spinny thing on top and look pretty interesting. I believe those cars are pretty successful. I don’t think they’re being mass produced though, because the costs might be a little prohibitive.
The most advanced, and it’s not even at autonomous level 3. It’s funny that Mercedes was the first to get level 3 approval in California and they don’t even boast about it that much.
That aside, a secondary sensor that helps verify whether the vision system got it right would be nice. It could be just a radar or whatever. Imagine if the vision system fails to recognize a boy in a Halloween costume as a person; at least the secondary sensor will tell the car to stop due to the contradicting perception.
I might be misremembering but I think Teslas are actually more capable, they’re just deliberately stating they’re SAE level 2 so they could skirt the law and play loose and dangerous with their public beta test.
I haven’t researched this enough, but Tesla saying they’re level 3 while never bothering to get the actual approval is like how I kept saying I was smart but too lazy back in my school years.
Put your money where your mouth is. Lives are at stake here.
Think of that Coyote and Roadrunner cartoon. If there’s graffiti that looks like a tunnel, the coyote may run into it based on vision alone, but a secondary sensor would help tell that there’s a wall.
IRL, if the vision system fails to recognize that there’s something on the road, at least a secondary sensor will protest that there’s something there.
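The “second opinion” idea above is basically conservative sensor fusion: if either the camera or the secondary sensor reports an obstacle, you stop. Here’s a minimal sketch of that rule; the function names, interfaces, and the 30 m threshold are all made up for illustration, not anything Tesla or any real stack actually uses:

```python
# Hypothetical sketch of a cross-check between a camera-based vision system
# and a secondary ranging sensor (radar/lidar). All names and thresholds
# here are invented for illustration.

def should_emergency_brake(vision_sees_obstacle: bool,
                           radar_range_m,
                           braking_threshold_m: float = 30.0) -> bool:
    """Brake if EITHER sensor reports a close obstacle.

    radar_range_m: distance to the nearest radar return in the lane,
    or None if the radar sees nothing.
    """
    radar_sees_obstacle = (radar_range_m is not None
                           and radar_range_m < braking_threshold_m)
    # Conservative fusion: a disagreement between the sensors is resolved
    # in favor of stopping, not in favor of trusting the camera.
    return vision_sees_obstacle or radar_sees_obstacle


# The coyote scenario: vision thinks the painted tunnel is open road,
# but the radar still returns the wall 20 m ahead -> brake.
print(should_emergency_brake(vision_sees_obstacle=False, radar_range_m=20.0))
```

The trade-off, of course, is that OR-style fusion inherits the false positives of both sensors, which is roughly the “phantom braking” complaint; the point of the thread is just that it also catches the cases where vision alone is fooled.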
You can also try driving in direct sunlight without sunglasses or the sun visor. You get notifications and beeping noises whenever the sun hits the cameras directly, making the lane assist (I refuse to call it autopilot) quite erratic in a lot of weather… It’s actually worse for me than driving in cold weather.