Tesla owners. autonomous buffs, car accident lawyers--any opinions? | FerrariChat

Discussion in 'Technology' started by bitzman, Jan 1, 2022.


  1. bitzman

    bitzman F1 Rookie
    BANNED

    Feb 15, 2008
    3,287
    Ontario, CA
    Full Name:
    wallace wyss
    Self Driving: A Key Decision Yet to Be Made


    We all know that Elon Musk sells an option on Teslas called Autopilot, and that it has the potential to someday operate the car fully autonomously.
    Of course many of his competitors have similar systems under development, but what all the automakers know is that there is an iceberg in the way of its full implementation, an iceberg bigger than the one that sank the Titanic.
    That iceberg is The Decision.
    I'm talking about a court decision on a case that hasn't happened yet, about an accident that hasn't happened yet. It will most likely involve a Tesla, but it could be Cadillac or some other automaker with a system close to robot driving that's already being used, provided a human driver is ready to take the wheel.
    The occurrence leading up to the court case hasn't happened yet either. But it will be a case with a scenario like this: said vehicle is proceeding along a two-lane road behind a garbage truck, the kind of dump truck that hangs a fully loaded dumpster by chains in the air behind it.
    Then the chain holding the dumpster snaps, and the car, at that instant on Full Self Driving, has to decide. It knows it's only 50 ft. behind the dump truck doing 70 mph. It knows, even with the brakes full on, it needs 200 ft. to avoid impact. Its cameras look to the left and see an old lady, 70-ish, who has no chance in hell of scampering out of the way should the car go there to avoid hitting the dumpster. The car's cameras then look to the right--there are five pre-schoolers in line behind their minder. Their little legs can't get them out of the way if the car goes right to avoid hitting the dumpster.
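    For what it's worth, the distances in that scenario roughly check out. Here's a back-of-the-envelope sketch (the ~0.9 g braking figure and quarter-second system latency are my own assumptions, not anything from Tesla):

    ```python
    MPH_TO_FPS = 5280 / 3600   # 1 mph = ~1.467 ft/s
    G_FPS2 = 32.174            # gravitational acceleration in ft/s^2

    def stopping_distance_ft(speed_mph, decel_g=0.9, latency_s=0.25):
        """Latency travel plus kinematic braking distance v^2 / (2*a)."""
        v = speed_mph * MPH_TO_FPS
        return v * latency_s + v ** 2 / (2 * decel_g * G_FPS2)

    # At 70 mph the car needs over 200 ft to stop -- far more than the
    # 50 ft gap to the dumpster in the scenario.
    print(round(stopping_distance_ft(70)))  # ~208 ft
    ```

    So even with near-instant reaction and hard braking, a 50 ft gap at 70 mph leaves no option of stopping in time; the car really would be forced into the left/right choice.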
    So it makes a decision and hits a person or persons, fulfilling its mission of preserving the car. One death could spark a national debate, but if there are multiple deaths it will stand more of a chance of becoming a test case that goes all the way to the Supreme Court.
    One issue will be whom to blame. If the car still had a driver, who had decided to elect full self driving, most of the blame will fall on them for not recognizing an emergency and reclaiming control of the wheel. But if the driver is in the back seat sleeping, or the back-seat passenger is an Uber/Lyft-type customer with no intention of taking the wheel, then whom do you blame?
    Or what if the full self driving car is at that moment driverless, operating fully remote, summoned somewhere by cell phone by a customer who wants to be taxied somewhere? Do you blame this customer for the accident merely because they ordered the car--a car they haven't ever laid eyes on?
    I predict--depending again on the number and ages of the deceased--that when this trial takes place, the whole nation will be waiting for the decision. Because if the financial liability falls upon the automaker, then Tesla, among others, won't want to go to FSD with no one at the wheel.
    Yet Tesla's goal is to see fully autonomous cars allowed in the US, to make it possible for someone who wants a taxi-like service to summon a fully robotic car without a driver. If rentals without a human driver become available, the Uber/Lyft-type companies will make much more money than they do now paying drivers. Tesla has described how full-self-driving owner rentals would work. They plan to keep track of the driving habits of all the people who own Teslas. They might decide that if you have a squeaky-clean record, you qualify for a program where you can offer your Tesla to robot-service ride-share users when you're not using the car. I don't know how much money you would make, but say your car is driven 500 miles by robo-taxi customers the same day you are toiling at the office 9 to 5. You come out of work and there it is, waiting for you, proud it just made enough to pay a good chunk of the next car payment.
    It's not such a totally foreign concept--it's how Airbnb works. You own a vacation house. Vacationers book that house online, stay a few days, and their rental fees allow you to make your next house payment. Maybe three days a month covers the whole payment.
    Now for me, as far as cars go, it wouldn't work. My car is not always squeaky clean; the back seat might be full of books or art. It's not ready to rent. But for those who want their vacation house or car to pay for itself, this is a solution.

    Now Tesla and other automakers are champing at the bit and see all these worries as negligible. This feature, if it gets green-lighted, will let many more people buy cars, since the car will pay for itself by working just a few days a month.
    Sounds great, right? But you forgot: that court decision hasn't been made yet. The horrific accident hasn't happened yet. After it does happen, if the court rules the car owner legally responsible--even if he or she isn't there--it will kill off the idea of fully autonomous cars being offered by their owners as rentals. If the court rules the rental firm coordinating the daily rentals responsible, those firms will go under. If the court rules the automaker responsible, they will drop back to a lower level of automation requiring a driver behind the wheel, and lose that share of the market that wants to sit back and let the robot drive.

    So ironically, here we are in 2022 with the technology to go fully autonomous, no driver necessary, prevented from making it available in the U.S. by an accident that hasn't happened yet. When it does, we'll need a weighty decision on who takes the blame when it all goes wrong.
     
  2. Bas

    Bas Four Time F1 World Champ

    Mar 24, 2008
    41,438
    ESP
    Full Name:
    Bas
    If you want a self driving car, get a taxi/uber.

    I will admit, self driving cars are very useful in situations such as a traffic jam.

    IMO ''self driving" creates lazy drivers that rely completely on the car saving them. Yes, Tesla (and I'm sure others) requires your hand to be on the wheel at all times so you can intervene when needed...but you can get devices that you strap to your wheel that make the system think your hand is on the wheel. And people use them. Why? Because humans are inherently lazy. If they observe the system handle everything 5 or 10 times, by the 11th they'll have their face buried deep in their iPad watching Netflix...utterly unaware when they do need to intervene.
     
  3. LVP488

    LVP488 F1 Rookie

    Jan 21, 2017
    4,874
    France
    Today the self-driving feature does not actually work (that's why Tesla is the only one to offer it commercially, with appropriate disclaimers).

    However, someday it will work - then the question asked is interesting.

    In the case of an unavoidable accident caused by an external cause, I believe the self driving car should never try avoiding it at the expense of other innocent people.
    So the user of the self-driving car should be aware of that policy (which could make him a victim of the external cause, though there is a responsible owner of that external cause) and accept it. Or one could imagine an option where the user of the car rejects this policy BUT accepts the responsibility--after all, if you kill someone who is on the sidewalk to avoid being hit by a runaway truck, you are accountable for it.

    Then the manufacturer of the self-driving car would be accountable only for accidents it actually causes (like the ones Tesla currently gets away with by using its disclaimers).
     
  4. Innovativethinker

    Innovativethinker F1 Veteran
    Silver Subscribed

    Aug 8, 2009
    8,668
    So Cal
    Full Name:
    Mark Smith
    That scenario (same issue) was brought up by me on this board six years ago, and it has been a topic of discussion for two decades.

    Certainly someone will have to program that in.

    However, the system could also ask you to “set up” the driving parameters, which may include the driver answering that question: does the driver sacrifice themselves, or do we take out others?

    That said, if it hasn’t happened in real life, it probably won’t happen. If it has happened then you can perhaps research the case.

    How would you set up the driving parameters?
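    The kind of parameter setup suggested above could look something like this purely hypothetical sketch (every name here is my own invention for illustration, not any real vendor's API):

    ```python
    from dataclasses import dataclass
    from enum import Enum

    class CollisionPriority(Enum):
        """Who the planner protects first when every option causes harm."""
        PROTECT_OCCUPANTS = "occupants"
        PROTECT_PEDESTRIANS = "pedestrians"
        MINIMIZE_TOTAL_HARM = "minimize_total"

    @dataclass
    class DrivingParameters:
        priority: CollisionPriority
        owner_accepts_liability: bool  # the trade-off raised in this thread

        def validate(self):
            # Choosing a policy that can endanger bystanders should force
            # the owner to explicitly accept responsibility for that choice.
            if (self.priority is CollisionPriority.PROTECT_OCCUPANTS
                    and not self.owner_accepts_liability):
                raise ValueError("occupant-first policy requires accepting liability")
            return self

    params = DrivingParameters(CollisionPriority.MINIMIZE_TOTAL_HARM, False).validate()
    print(params.priority.value)  # minimize_total
    ```

    The point of the sketch is that the hard part isn't the code--it's that whichever box the owner ticks amounts to a liability declaration, which is exactly the legal question the thread starter raised.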
     
  5. Tegethoff

    Tegethoff Formula Junior

    Jul 19, 2014
    301
    Los Feliz
    Full Name:
    Adam
  6. bitzman

    bitzman F1 Rookie
    BANNED

    Feb 15, 2008
    3,287
    Ontario, CA
    Full Name:
    wallace wyss
    I think the core value of the AI software giving the autonomous car its marching orders will be to protect the body shell from impacting solid objects that would wreck the car or kill the driver and/or passengers. So before I could be confident it would protect pedestrians, I'd have to know whether it will still crash the car into a solid object rather than hit unprotected pedestrians. I'd like to see videos of tests with robot pedestrians where it chooses to hit the solid object every time rather than take out a pedestrian.
     
  7. BJK

    BJK F1 Rookie

    Jul 18, 2014
    4,792
    CT
    These are new articles (8/22)

    Seems that Autopilot has a problem detecting motorcycles from behind .... and children. :confused: Hey, c'mon, nothing's perfect. :eek:

    >>> Just for the record, I am Tesla fan, but NOT a fan of 'AUTOPILOT' (worst name ever for something that DOES NOT WORK! ..... yet )

    ..... and Ralph Nader is still at it! Doin' what he does. :eek: If Ralph Nader has your product in his sights, you better have a lot of good lawyers on retainer. :D

    Ralph Nader urges regulators to recall Tesla’s ‘manslaughtering’ Full Self-Driving vehicles
    Nader called FSD ‘one of the most dangerous and irresponsible actions by a car company in decades’

    https://www.theverge.com/2022/8/10/23299973/ralph-nader-tesla-fsd-recall-nhtsa-autopilot-crash

    Tesla Investigated For Two Motorcyclists Killed In Autopilot Crashes
    https://insideevs.com/news/603364/tesla-probe-motorcyclist-crash-autopilot-higwhway-night/

    Tesla’s self-driving technology fails to detect children in the road, group claims
    https://www.theguardian.com/technology/2022/aug/09/tesla-self-driving-technology-safety-children

    .
     
  8. bitzman

    bitzman F1 Rookie
    BANNED

    Feb 15, 2008
    3,287
    Ontario, CA
    Full Name:
    wallace wyss
    Now it might come back to Elon personally in the form of criminal charges...could be the Trial of the Century.

    here's the story from a Tesla fan site called Teslarati

    Tesla self-driving claims trigger US criminal investigation: report


    By Simon Alvarez

    Posted on October 27, 2022

    A simple look at Tesla’s official pages for Autopilot and Full Self-Driving would show that the company is very clear in warning customers and would-be vehicle buyers that the advanced driver-assist systems do not in any way make cars autonomous in their present state.

    Yet despite this, recent reports have suggested that Tesla is under criminal investigation in the United States over claims that the company’s electric vehicles are able to drive themselves.






    The information was shared with Reuters by several people familiar with the matter. The publication’s sources claimed that the US Department of Justice had launched the previously undisclosed probe last year following over a dozen crashes that involved Tesla Autopilot. Some of the crashes were reportedly fatal.

    But while Tesla CEO Elon Musk has frequently estimated that the company’s vehicles would eventually be able to drive themselves without a human behind the wheel, Autopilot’s numerous warnings before and while using the system might make it a bit challenging to pin blame on the electric vehicle maker.

    Reuters’ sources, for one, noted that Autopilot and FSD warnings on Tesla’s official website, which indicate that the advanced driver-assist system’s present capabilities “do not make the vehicle autonomous,” might complicate any case that the US Department of Justice might wish to bring.

    Tesla has not issued a comment about the reported DOJ probe, and CEO Elon Musk has not issued a statement on the matter either.

    While there have been accidents involving Tesla Autopilot in the past, Elon Musk highlighted in an interview with Automotive News in 2020 that Autopilot problems typically stem from customers using the system incorrectly. This could very well be a fair assessment from the CEO, seeing as aftermarket companies even promote defeat devices that are designed to “trick” Autopilot into thinking that drivers are paying attention to the road even if they’re not.

     
