Tesla owners have been wowed by their cars’ new abilities, but some say they have also been alarmed and frustrated by the accompanying flaws. One moment drivers find themselves praising the cars’ skills; the next they’re grabbing the wheel to avoid crashing or breaking the law.
“Full self-driving” is a suite of driver-assist features that Tesla hopes can one day enable cars to drive themselves. (It’s not fully autonomous today, but that hasn’t stopped Tesla from calling it “full self-driving,” which has angered some self-driving experts.) Other automakers like Mercedes-Benz, GM, Ford and Volvo offer cars with similar features that can change lanes, parallel park, identify speed limit signs and brake for pedestrians. But Tesla has gone further with “full self-driving,” theoretically enabling people to plug in a destination and have the car drive them there. Tesla CEO Elon Musk has spoken of cars in the future driving themselves across the country, and of traffic fatalities possibly being reduced by 99%.
But the company has only managed to slowly roll out “full self-driving” to roughly a thousand “beta testers,” who are still required to occasionally intervene. Meanwhile the cost of the “full self-driving” option has risen to $10,000. And while the feature is a major step, it still has significant issues. Last week Tesla recalled a version of “full self-driving” within hours of its release because drivers were reporting false forward-collision warnings and unwarranted automatic emergency braking. The issue was addressed in a new version released the next day, and Tesla owners said they were impressed by how quickly the company responded.
The software is inconsistent at best, according to interviews with owners of Teslas equipped with “full self-driving,” as well as a review of more than 50 videos posted on social media by members of the public who have been using versions of it since it was rolled out to about 1,000 owners in early October. The videos are believed to be authentic because of the presence of details typical of “full self-driving” use, the complexity of manipulating such a video and the social media histories of the video creators, who are generally Tesla enthusiasts. Tesla did not dispute the authenticity of the videos.
Tesla’s “full self-driving” may excel in a scenario one day and fail at it the next. Turn signals sometimes go on and off at random. “Full self-driving” has been seen ignoring “road closed” signs, attempting to steer around them or even into them. Sometimes it brakes unexpectedly, even when the road ahead appears clear to drivers.
Teslas in “full self-driving” mode sometimes plot a course directly into other fixed objects, including poles and rocks, videos appear to show.
The technology has also shined at times, however, in one case identifying a cyclist ahead even before the human driver reported seeing the person. And drivers say the technology is generally improving.
“It drove like a 9-year-old who had only driven in [Grand Theft Auto] before, and got behind the wheel,” said John Bernal, who owns a Tesla Model 3, of when he first got “full self-driving” early this year. “Now I feel like I’m driving with my grandma. Sometimes it might make a mistake, like, ‘no grandma, that’s a one-way, sorry.'”
Tesla did not respond to a request for comment and generally does not engage with the professional news media. It warns drivers that the technology “may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road.” Drivers are told to be prepared to act immediately, especially around blind corners, intersections and narrow situations.
Some Tesla drivers say they’re concerned the feature’s inconsistent behavior can be annoying, and seem rude, to other drivers. Videos posted online show it’s common for cars in “full self-driving” to drive down the middle of unmarked residential streets, in no apparent rush to move over for oncoming traffic.
“It waits until like the last second to get over,” Matt Lisko said in a recording of a recent drive he posted on YouTube. “I’m sure they were like, what is this person doing?”
The cars also appear to befuddle other drivers in situations such as being slow to take their turn at a four-way stop.
“We’re trying. We’re sorry!” one auto reviewer, Kyle Conner, said to his camera as he sat behind the wheel of a Tesla slowly pulling through a four-way stop. “Everyone’s rolling their eyes at us. We just pissed off about 12 people right there.”
In at least one case, a Tesla appeared to jump ahead of a waiting driver at a four-way stop. In another video, a Tesla using “full self-driving” attempted to pull around a vehicle in front of it that was waiting its turn at a four-way stop.
“Full self-driving” has been shown to sometimes stop twice at Chicago stop signs: once before it enters the intersection and again before it pulls all the way through. The technology seems to be confused by Chicago’s practice of posting stop signs both where drivers should stop and on the far side of the intersection, and it treats each sign as a separate stop.
Teslas with “full self-driving” often stop farther behind stop signs than typical drivers do, drawing criticism from their owners. The cars then slowly creep up to make their turn, and accelerate quickly once they’ve turned onto high-speed roads. Many drivers love the acceleration, but it’s so pronounced that some have worried about tires slipping or wearing out quickly.
In several videos “full self-driving” has lingered behind a vehicle double-parked in the street, seeming unsure whether it should go around the car or truck. At least three drivers have documented in YouTube videos “full self-driving” trying to pull around vehicles it shouldn’t pass, including, in one case, a vehicle waiting at a stop sign with its turn signal on.
Kim Paquette, one of the first non-Tesla employees to test “full self-driving” when it was rolled out to a select group a year ago, says she uses the feature for nearly all of her driving in her Tesla Model 3. She was frustrated when she recently had to drive a loaner car that didn’t have the technology she’s grown used to. Paquette said she can sometimes drive the 85 miles from her home to her job at Boston’s airport without having to intervene to correct a mistake.
Paquette can type an address into the screen on her Model 3, or hit a button and use Tesla’s voice recognition to tell the car her destination. Then she pulls down twice on a stalk on the steering wheel to activate “full self-driving.” The car lets out a chime, a blue steering wheel logo lights up on her screen, and the car starts taking her where she wants to go.
In some ways a system like this can seem like magic. But that magic is still flawed in both minor and serious ways.
Paquette has been frustrated, for instance, with her car’s tendency to drive in the parking lane on one-way streets in Newport, Rhode Island, where she lives.
“I just want it not to make that mistake and it’s been doing that for a year,” Paquette said. Some videos she’s made of herself using the feature show her car trying to pull in front of cars it shouldn’t, forcing her to intervene. And, Paquette told CNN Business, she tends not to use the system on trips that involve left turns with limited visibility, which she knows it struggles with.
One of the inconsistencies in “full self-driving” is how it handles pedestrians. Several drivers have had to slam on the brakes to prevent the car from hitting people in crosswalks, videos appear to show.
But in most cases it is overly cautious around pedestrians, drivers say. Paquette recalled a recent drive in which she was cruising down a street as a person got out of a parked car. She said her car stopped four car lengths behind the parked vehicle and its exiting driver. To Paquette, it seemed clear the person was going to walk to the adjacent sidewalk rather than cross in front of her; the car could have been cautious without leaving such a large gap, she felt.
She’s noticed that “full self-driving” struggles to sense social cues, including being waved through a four-way stop by another driver, or knowing what a pedestrian will do next. Paquette said she regularly takes manual control of the car to prevent it from making the wrong decision or irritating other drivers.
“If someone is standing on the corner, are they just standing on the corner or waiting to cross the street?” she said. “It’s a student driver for sure. It’s like teaching a 15-year-old.”
Tesla isn’t alone in struggling to get its cars to recognize social cues. Machines work best in predictable environments that lack complexity, and reading human intent on busy streets has been a challenge for all autonomous vehicle developers.
Human drivers communicate with pedestrians and other drivers through hand signals, horns and flashing headlights, but there is as yet no similar system for autonomous vehicles to talk to each other, much less to understand the signals people use. Some companies have experimented with displaying messages on vehicles, like “waiting for you to cross,” or mounting a light bar atop the windshield that flashes differently depending on whether the car is stopping.