Tesla Self-driving LIDARs don't detect motorcycles! | GTAMotorcycle.com

Tesla Self-driving LIDARs don't detect motorcycles!

Nauticat

New member
I was going with traffic in the left lane of the 404 near Hwy 7, following the car ahead in the right tire track. A white Tesla Model 3 behind me passed me in my own lane. I barely had time to move to the right; the driver(?) hardly swerved. I get really nervous, then upset, with a car next to me in MY lane.
As I caught up to him, he motioned that he had no control. The car was self-driving. He then turned it off and dropped back.

See USA Today post: Tesla was in full self-driving mode when it fatally hit Seattle-area motorcyclist: Police
These Tesla Self-driving Lidars don't detect motorcycles! I need a rear dashcam and a Tesla Alarm.
 
The problem started when they marketed "self-driving" cars as autopilot.
Autopilot in a plane requires at least two highly trained pilots, a host of air traffic controllers, and at least two handlers when rolling on the ground, backed by more computing power than god... all to operate "autonomously" in mostly empty sky.
... and the Tesla faithful think their idiot car can steer through a busy city without anyone monitoring it.
It's not the car's fault. Google gave up on the system as unworkable... then Elon buddy bought it and put it in his cars. This was less about "self-driving" cars and more about Musk trying to be a "disruptor."
 
Tesla Self-Driving means that after it clears the obstacle in the lane (me), it will not stop or call 911, but will drive itself home!
Self-driving or not, one should never rear-end anyone at routine traffic speed. Tesla Model 3s are cheaper, so many can afford them. Load them up with lazy options and now their drivers think they are the snoozing Kings of the Road.
Tesla is now a Chinese car, right? I tried my Chevy with adaptive cruise control. It sensed a motorcycle and kept its distance!
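For anyone curious what "kept its distance" actually means: adaptive cruise control is usually described as holding a time gap to the vehicle ahead rather than a fixed distance. Here is a toy sketch of that idea in Python, with made-up numbers and names (not GM's actual controller):

# Toy time-gap follower -- the basic idea behind adaptive cruise control.
# All numbers and names here are invented for illustration only.

def acc_target_speed(own_speed_mps: float, gap_m: float,
                     time_gap_s: float = 2.0, set_speed_mps: float = 28.0) -> float:
    """Command a speed that keeps the gap near own_speed * time_gap_s,
    never exceeding the driver's set speed."""
    desired_gap_m = own_speed_mps * time_gap_s
    # Simple proportional correction: gap too small -> slow down,
    # gap opening up -> speed back up toward the set speed.
    correction = 0.2 * (gap_m - desired_gap_m)
    return max(0.0, min(set_speed_mps, own_speed_mps + correction))

# Following a motorcycle at ~100 km/h (27.8 m/s):
print(round(acc_target_speed(27.8, 30.0), 1))  # 22.7 -- gap is tight, backs off
print(round(acc_target_speed(27.8, 70.0), 1))  # 28.0 -- gap is generous, holds set speed

The follower logic itself is simple once the sensor reports a range; the detection part is where motorcycles tend to get lost.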
 
Self-driving or not, one should never rear-end anyone at routine traffic speed.
And yet it is one of the most common types of collision.
Tesla is now a Chinese car, right?
The ones we get in Canada are assembled in China; American ones are assembled in the US... at least for now. From what I have heard, the Chinese-built ones have fewer problems (not none, just fewer) than the US-built ones. It doesn't help Tesla that the folks working at the US Tesla plants and service centres are very vocal about how bad the cars are... and it doesn't help when Elon buddy goes on X (formerly Twitter) and blows up at them, amplifying the media coverage tenfold.
 
I was going with traffic in the left lane of the 404 near Hwy 7, following the car ahead in the right tire track. A white Tesla Model 3 behind me passed me in my own lane. I barely had time to move to the right; the driver(?) hardly swerved. I get really nervous, then upset, with a car next to me in MY lane.
As I caught up to him, he motioned that he had no control. The car was self-driving. He then turned it off and dropped back.

See USA Today post: Tesla was in full self-driving mode when it fatally hit Seattle-area motorcyclist: Police
These Tesla Self-driving Lidars don't detect motorcycles! I need a rear dashcam and a Tesla Alarm.
Tesla ditched the actual distance-measuring sensors (radar and ultrasonics) long ago and went to pure camera image processing. It doesn't work well in many scenarios, but they don't GAF. It should never have been allowed on public roads. Beta-testing on the backs of dead people should be the end of Tesla, but so far they have dodged that axe.
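As a rough, purely hypothetical illustration (not Tesla's actual software) of why dropping the range sensors matters: with radar, the car can confirm a narrow target like a motorcycle from an independent distance measurement; camera-only, everything rides on the vision classifier's confidence. A minimal sketch with invented thresholds and names:

# Toy comparison of camera-only vs radar+camera gating for a braking decision.
# Thresholds, field names and numbers are all invented for illustration;
# this is not how any production system is implemented.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    vision_confidence: float        # camera classifier score, 0..1
    radar_range_m: Optional[float]  # measured distance, or None if no radar fitted

def should_brake_for(obj: Detection, vision_threshold: float = 0.8) -> bool:
    if obj.radar_range_m is None:
        # Camera-only: the whole decision rests on the classifier score.
        # A narrow, low-contrast target (a motorcycle tail light at night)
        # can sit below the threshold and simply be ignored.
        return obj.vision_confidence >= vision_threshold
    # Radar + camera: a solid range return lets the system act even when
    # the camera is only moderately sure something is there.
    return obj.vision_confidence >= 0.4 and obj.radar_range_m < 60.0

# A dim motorcycle seen only by the camera gets ignored...
print(should_brake_for(Detection(vision_confidence=0.6, radar_range_m=None)))  # False
# ...the same weak camera score plus a radar return triggers braking.
print(should_brake_for(Detection(vision_confidence=0.6, radar_range_m=35.0)))  # True

The point of the toy is only that removing the independent range measurement leaves a single point of failure, which lines up with the kind of failures being described in this thread.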
 
The problem started when they marketed "self-driving" cars as autopilot.
Autopilot in a plane requires at least two highly trained pilots, a host of air traffic controllers, and at least two handlers when rolling on the ground, backed by more computing power than god... all to operate "autonomously" in mostly empty sky.
... and the Tesla faithful think their idiot car can steer through a busy city without anyone monitoring it.
It's not the car's fault. Google gave up on the system as unworkable... then Elon buddy bought it and put it in his cars. This was less about "self-driving" cars and more about Musk trying to be a "disruptor."
A friend bought a Tesla and it came with a free one-month trial of self-driving. It was interesting; it drove better than most humans. Do I trust it? Nope.
 
In my experience, 9 out of 10 times when a car tries to get me to race from a stop light, it's a Tesla.
No one tries to get a 300 cc Honda to race them off the light 😏 but I've also learned to pretend a little, and that gives me a nice gap to pull into ahead of the second car (we filter from the bike lanes).
I think all the Tesla drivers like the rush and like to show it off.
The Tesla Model 3 Performance does zero to ticketing speed in under 3 seconds (60 mph, which is just under 100 km/h... the highest legal speed here). There is simply no point for a bike to challenge it. I think that aside from the fuel savings, the crazy acceleration is the secret sauce for EVs.
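For a sense of scale, a quick back-of-the-envelope number (rounded, and assuming a constant pull, which a real launch isn't):

# Rough average acceleration for a 0-60 mph run in about 3 seconds.
mph_to_mps = 0.44704
delta_v = 60 * mph_to_mps        # ~26.8 m/s
avg_accel = delta_v / 3.0        # ~8.9 m/s^2
print(round(avg_accel, 1), "m/s^2 =", round(avg_accel / 9.81, 2), "g")
# -> 8.9 m/s^2 = 0.91 g, roughly sport-bike-launch territory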
 
This "Tesla is a self-driving car" concept has been around now for about 8 (or more) years now and there doesn't seem to have been a lot of progress towards a really competent and foolproof system. Back then (2016) I wrote the article below - which I updated in August of 2022 for the vintage club (CVMG) newsletter.

"The Tragedy of the “Autonomous” Car - that Wasn't – and Still Isn't.

It would seem to be an automatic response. A small child chases a ball out onto the road in front of you as you drive along. Depending on the time and distance available, you brake to a halt in time or swerve to avoid a collision. With a "self-driving" car such as a Tesla using the vehicle's "Autopilot" system, cameras detect the child and the car brakes rapidly to a halt before striking the child.

In the near future (some car makers said) there will be driverless (autonomous) "robot" cars and trucks on our roads. But if Isaac Asimov's first ethical Rule for Robots, "An autonomous machine may not injure a human being or, through inaction, allow a human to be harmed," is programmed into a car, then in the same situation as above, how does the car choose between protecting the person or persons being carried in it and the possibility of injuring a smaller human being? What ethical choice should a "robot car" be programmed to make?

Certainly, in the more distant future, when all cars and trucks are robot-controlled so as to completely avoid colliding with others of the same type and size, there may be significant benefits to robot cars. But to me there will always be what bureaucracy calls "vulnerable road users" (pedestrians, bicyclists and motorcyclists) who will not be recognized by the robot cars. Some software people indicate that such ethical programming for every conceivable situation is currently impractical. Further, our old bikes do not have the electronics that would allow them to announce their presence to these future "robot automobiles".

On May 7, 2016, on a Florida highway, a 2015 Tesla Model S, operating on the company's Beta-stage "Autopilot" system, crashed at high speed into the trailer of a crossing tractor-trailer, passing under the trailer, which sheared off the top of the Tesla. The car continued for 700 feet along the highway, gradually going into and out of a fenced field, travelled 200 feet more, then hit and sheared off a utility pole before finally coming to a halt. The driver of the Tesla was killed, presumably when the upper part of the Tesla body was torn off by the deck of the trailer. The $140,000 Tesla has computer, radar, camera and wireless sensing systems which are supposed to warn the driver of a possible collision and then, if the driver does not respond, take control of the car, take avoidance action and brake it to a halt.
The accident happened on May 7th, but an investigation was not started by the US National Highway Traffic Safety Administration (NHTSA) for about 10 days, until Tesla notified them. Public notice only came when media reports began to appear on July 1.

What appears to have occurred was this:

The Tesla was doing 85 mph on unsigned Florida SR 500 (a four-lane, divided-with-median highway with many unmarked intersecting or access roads, paved or gravel, controlled in some cases by Yield signs) when it came over a hill and down to where the tractor-trailer was making a legal crossing turn into a side road. The radar send/detect system, mounted below the front bumper of the Tesla, did not detect the trailer because it could see under it. The camera system, mounted higher up at the top of the windshield, seems to have failed to detect it as well, which the Tesla firm claimed was due to the lack of contrast between the white trailer and the Florida sky. However, Mobileye, the company that makes camera-based computer-vision systems for autonomous driving, has stated: "We have read the account of what happened in this case. Today's collision avoidance technology, or Automatic Emergency Braking (AEB), is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020." So the camera was not designed to see a "crossing" problem at all.
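To make the Mobileye point concrete: a rear-end AEB check only looks at how fast you are closing on a target ahead of you travelling in the same direction. A crossing vehicle shows up mostly as lateral motion, so that check never fires. A toy sketch of the logic in Python, with invented thresholds (nobody's actual code):

# Toy 2016-era AEB trigger: brake only for a slower vehicle ahead in the
# same lane. Thresholds and field names are invented for illustration.

from dataclasses import dataclass

@dataclass
class Target:
    range_m: float        # distance straight ahead
    closing_mps: float    # how fast we are approaching it longitudinally
    lateral_mps: float    # how fast it is moving across our path

def rear_end_aeb_should_brake(t: Target, ttc_threshold_s: float = 2.0) -> bool:
    # Ignore anything moving across the lane rather than along it --
    # exactly the "laterally crossing vehicle" case Mobileye describes.
    if abs(t.lateral_mps) > 2.0:
        return False
    if t.closing_mps <= 0:
        return False
    time_to_collision = t.range_m / t.closing_mps
    return time_to_collision < ttc_threshold_s

# Slow car ahead in our own lane: the system brakes.
print(rear_end_aeb_should_brake(Target(range_m=40, closing_mps=25, lateral_mps=0.2)))  # True
# Trailer crossing the highway: filtered out as a lateral target, no braking.
print(rear_end_aeb_should_brake(Target(range_m=40, closing_mps=38, lateral_mps=6.0)))  # False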

But what of the driver, who, according to Tesla's instructions, should always have his hands on the steering wheel and be paying attention to the road? Tesla also tells their car purchasers that "Autopilot" may not detect "stationary vehicles or obstacles, bicycles, or pedestrians." The NHTSA in the US indicates that the Tesla S ranks at only level 2 or 3 out of the 5 levels of technology required to be a fully "self-driving" car. Florida State police found a portable DVD player in the wrecked Tesla just after the crash. The truck driver involved in the accident went over to the Tesla and has stated that he could hear sound coming from the wreck. It was, he reported, a Harry Potter movie playing. It would seem that the Tesla driver may have been distracted, at the very least, from watching the road.
Incidentally, about a month before this fatal accident, the driver had posted a YouTube video showing how his Tesla had “reacted” to save him from a highway accident – with a white truck. Apparently he was listening to an audio book at the time of that encounter."

Fast forwarding to August 2022: in the wake of several fatal night-time crashes in which Tesla cars failed to avoid running into motorcycles travelling in the same direction on the road ahead, the state of California is now considering revoking the permit that allows the manufacturer to designate the Tesla as a "self-driving automobile." Apparently the US federal traffic safety authorities are investigating 38 "self-driving" accidents involving Tesla cars which have resulted in 19 fatalities.

I have not closely followed what the US (or Canadian) authorities have been doing about the "Tesla blindness" issue and other self-driving vehicle problems over the last couple of years. But when I drive and ride these days, I try to keep road positions that leave at least some possible "escape route" available. Tough to do in much of today's traffic.

AFJ
 
I went for a ride Saturday and met some guys at Tim Hortons. As we were about to leave, a minivan started backing out of the spot across from where we were parked. As it crept closer to us, it didn't look like it was going to stop. We had to run up to it and bang on the van to get it to stop, and by then it was only a couple of inches away from one of the bikes. If that had been a Tesla, I'm pretty sure it would have just kept going and mowed down all of the bikes.
 
