Not Ferrari but Tesla - I believe some people here drive them. Bob Z.
I'm sorry but you gotta be a dumb azz to sit in a car going 65 MPH and trust your life to a computer. Darwinism
As opposed to trusting the typical poorly trained, discourteous, cell-phone-talking human that the states typically give licenses out to these days? I'll take a Commodore 64 over these people.
Currently, we are killing almost 40,000 people EVERY year in car "accidents" in the USA alone, along with about 2.3 million injured - that is every year. Can't even imagine what the numbers are for the rest of the world; but I imagine they're probably even worse. And keep in mind, this is happening in relatively safe modern cars. If today's typical moronic driver were driving cars from just 20 years ago, there would be 100,000 deaths per year. It is too bad we don't regulate driving like we do flying. You can be sure the government will move to mandatory autonomous cars as soon as the technology is certified. And it will be.
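A quick back-of-the-envelope check of those figures (the 40,000 deaths and 2.3 million injuries are the numbers cited above; everything else is simple arithmetic):

```python
# Rough arithmetic on the US crash figures cited in the post above.
deaths_per_year = 40_000       # figure cited in the post
injuries_per_year = 2_300_000  # figure cited in the post

deaths_per_day = deaths_per_year / 365
injuries_per_death = injuries_per_year / deaths_per_year

print(f"~{deaths_per_day:.0f} deaths per day")  # ~110
print(f"~{injuries_per_death:.0f} injuries per fatality")
```

That works out to roughly 110 deaths every single day, which is the scale the poster is pointing at.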
So you're saying that you would trust a computer to drive the car far better, and far safer than the average Human can? Hang on though, isn't that exactly what the driver in this unfortunate incident did? - Trust the computer to drive the car better and safer than he could himself? And wasn't the outcome that the (in your opinion) superior computer driver completely failed to recognise the danger ahead, resulting in a fatal accident? I can't help feeling that had the driver put less trust in the technology on board his car (technology that makes the Commodore 64 look like a simple on/off light switch!), and trusted his own natural instinct for self preservation, then he would still be around today to tell others just how the technology failed!
Read my post, just prior to yours. You may be right in this particular case. But overall it is irrelevant. Technology is going to win, because people just don't care about being good drivers.
If someone does something stupid and kills themselves driving their car, they only have themselves to blame. If someone does something stupid like letting a computer drive their car at 65 MPH and it ends up killing them, they only have themselves to blame.
So far the data seem to indicate a computer is safer than a human at driving, and the delta will get bigger as computers become better. https://www.wired.com/2016/02/googles-self-driving-car-may-caused-first-crash/ I can't think of any of my friends and family who has driven 1.3m miles and not caused an accident.
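To put that 1.3M-mile figure in context, here is a rough comparison against the human fatality baseline. The ~1.2 deaths per 100 million vehicle miles is an approximate published US figure and is an assumption here, not something from this thread:

```python
# Rough comparison: autonomous test mileage vs. the human fatality baseline.
# ASSUMPTION: roughly 1.2 deaths per 100 million vehicle miles for US human
# drivers (an approximate mid-2010s figure, not taken from this thread).
human_deaths_per_mile = 1.2 / 100_000_000
autonomous_miles = 1_300_000  # Google fleet mileage cited in the post

# How many fatalities human drivers would be "expected" to cause over the
# same mileage:
expected_deaths = human_deaths_per_mile * autonomous_miles
print(f"{expected_deaths:.3f} expected fatalities over {autonomous_miles:,} miles")
# The result is a small fraction of one death -- which is the statistical
# caveat: 1.3M miles is far too few to distinguish autonomous from human
# fatality rates with real confidence.
```

In other words, the early data is encouraging but the sample is still tiny relative to the rarity of fatal crashes.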
Here is the problem with computers driving cars: a computer cannot evaluate RISK. It can only evaluate what's happening. A human can evaluate a truck ahead making a turn from the left-hand lane that MIGHT pull in front of them. We make these kinds of decisions every day, and most we don't even think about. We slow down a bit, move to a different lane, avoid some guy we think is driving like a nut, see there's water from a broken pipe on the road, or some kids playing with a dog on the side of the road who might jump into the street without looking. The computer only reacts when it senses something has already gone wrong. I actually have no problem with self-driving cars under 25 MPH, like bumper-to-bumper traffic. But at speeds above that, the computer cannot anticipate a possible problem like a human can, and probably never will. Humans sense when something is wrong or fishy. Computers only analyze data.
They should spend the $$ on letting a computer decide how we should eat, since over 850,000 per year die of heart disease/diabetes. Interesting that car accidents comprise 26% of all accidental deaths, yet mandatory autonomous driving is seen as the answer to the "problem". "WALL-E" world is just around the corner...
This is an outdated view of what's possible with even today's hardware and software. A computer can absolutely make all of those judgments and much more. Technology is advancing steadily to the point where it will do all of that far better and far more reliably than any human.

Sensors are improving. Self-driving cars will see and hear everything around them at a "superhuman" level and process that data thousands of times a second, recognizing situations as early as, if not earlier than, any human driver.

Software is improving. The continual progress in AI and neural networks means that self-driving cars will be "taught" and can continue to learn how to make the right decisions, not unlike how actual people learn. However, unlike humans, self-driving cars will benefit from the experience of every mile driven by every other car in the network, and will never "forget" it due to fatigue or any other human factor. For example, your car in particular will not have had to drive on black ice to know how to detect it and make the proper maneuvers. And in that moment, it will see it sooner than you, react to it faster than you, and navigate through it more skillfully than you.

Human drivers are not perfect. Therefore, self-driving cars do not need to be perfect to add value. They simply need to be as good as or better than their human counterparts. Self-driving cars will absolutely get there, and much sooner than you think.
1. Like it or not, this is all 100% true. 2. The current Tesla cars equipped with various driving aids are not self-driving cars. 3. The driving software of the future will even be intelligent enough to choose the proper forum in which to post non-Ferrari discussions.
What will happen to motorcycles in the new driving era? Fully autonomous or also outlawed? What will happen to all of the vehicles currently owned that become outlawed? What will the automobile industry of the near future look like?
To add:
they don't drive while drunk
they don't drive while sleepy
they don't drive while dealing with 3 kids screaming from the back seat
they don't drive while eating
they don't drive while texting
they don't speed trying to impress a girl
they don't drive 80 MPH in a 30 zone racing another car
etc.
There was a fatality near Akron Ohio where the car could not tell the sky from the solid white semi truck that it ran into.
And I think from what I have read, 25 to 30% of those killed are a result of distracted driving. If a computer can help, I'm all for it; the facts speak for how well people manage. And the truck driver that pulled out in front of this car will be held liable in some fashion; failure to yield is what it is.
I think it all boils down to poor driver training and the refusal to take accountability for anything. I'm not sure what our friends over in Europe think about autonomous cars, but they are much better drivers because they don't just hand out licenses to whoever walks into the DMV. Also, to the best of my knowledge, it is illegal to use your phone while driving throughout most of Europe, although I think Bluetooth might be permitted. If we adopted a driving education system akin to Europe's and adopted some of their driving laws, I suspect our crash rates would be much lower. But many people in the US seem to be born thinking that driving is their right, and therefore never take it seriously because they don't think it can be taken away from them. Just my two cents.
Who knows? I would hope the government would impose tougher training laws for people driving non-autonomous vehicles. I would love to ride a motorcycle - but not around today's idiotic drivers. Just too dangerous. Unfortunately, the future does not look too exciting for enthusiasts.
+1 +2 +3 Flying is known to be much safer than driving, but so many people have a fear of flying. Why? Because you're putting your life in the hands of the pilot and the computers. If something bad happens, there's nothing you can do. When you are driving, you feel like you are in "control", and it feels safer, even though truthfully there are so many factors out of your control (mostly other drivers, but also deer, weather, etc.). Self-driving cars are exactly the same. They've already been shown to be much safer than human-driven cars, but 1 story of an accident comes up and everybody loses their minds. Also, people don't like change and don't trust new technologies. As well they shouldn't. Whenever a new version of iOS comes out, tons of people jump to download it immediately. And they end up with frustrating bugs and have to wait for the next release. Self-driving cars are the same... although the risk is much greater, of course. This guy should not have entrusted his life to such a new technology. Tesla warns drivers not to, and to be prepared to take over. I'll wait several years before I get a self-driving car, just like I always wait for the bug fixes before downloading a new version of iOS. Of course it sucks to be that 1 guy who was killed by his computer, but it's better than the 100s killed by other drivers, etc.
Actually, I think The Mayor is correct, if you take the short term view. Eventually, a computer will know every car on the road, if it is being operated by a human or not, that human's complete driving record, driving tendencies, past mistakes, etc. It will sense minute variation in track, speed, where hands are on the steering wheel and will be able to "intuit" most (and probably eventually all) things that a person can. However, that is a long ways down the road (no pun intended). In the short term the integration will not be nearly that complete, and the engaged careful driver will still be able to contribute positively to the road situation.
Anyone wishing for self-driving cars should consider just taking the bus. Seriously. You can sit in the back and click away at your wireless device and leave the future of actual driving to enthusiasts. Tesla should and will be sued big time for their flawed approach to autonomous driving. It raises the concern that any of the ones you see on the road are in ignorant/autonomous mode.
Doesn't work as well, buses don't go from your point A to your point B. And most studies show that buses seriously impede traffic on local urban routes. They tend to effectively block one full lane. (Stop, start, stop, start.) A transit strike I read about actually aided traffic flow in a major city. (NYC?) They don't particularly impede on multilane limited access arteries, though.
It was a metaphor. A driverless car is not a car but merely a conveyance, and it has no place in the consciousness of the automotive driving enthusiast. I could easily have said take a train, taxi, shuttle, etc. Ferraris are driver's cars, so the concept is kind of creepy.