Along For the Ride

Guest Post by Eric Peters

A guy was killed by his auto-piloted Tesla last week (news story here) and Uncle is looking into it. Of course he won’t do anything about it. Because “if it saves even one life” is very selectively applied. It applies only when whatever the danger happens to be is something Uncle wants to use as an excuse to impose yet another mandate.

Never to rescind one.

See, for instance, air bags.

Or – lately – cars that drive themselves.

Uncle very much wants such cars and so is prepared to do nothing about their potential – and now actual – lethality.

Because, you see, the point is not that cars that drive themselves are “safe” (they’re not, bear with me) though that is much talked up (like air bags, which also aren’t “safe”) and used as the pretext for force-feeding them to us. Note that. We are never given the choice. Never offered whatever the thing in question is and allowed to weigh the pros and cons and then choose for ourselves. Free people are not merely allowed such latitude, they are entitled to it. It is not bequeathed, conditionally, by political parents in a remote bureaucracy but respected as an inviolable moral principle.

In a free society, that is.

But we are not free except to do as we are told.

Back to this Tesla thing.

The “driver” (who wasn’t) in the recent lethal incident was reportedly doing something else besides paying attention to what was in front of him. In this case, a big rig making a left turn in his path of travel. The Tesla’s autopilot did not grok the big rig and drove right into it.

Right under it, actually.

According to Tesla, “autopilot is getting better all the time but it is not perfect and still requires the driver to remain alert.” (Italics added.)

Really?

Then why bother with autopilot?

Isn’t the touted benefit of cars that drive themselves this idea that “drivers” no longer have to remain alert? That they can take a siesta or play Candy Crush or watch a DVD or do some work on their laptops? If they have to remain alert, they cannot do those things, too.

Remaining alert means keeping one’s eyes on the road at all times – not occasionally. It means being prepared to react to changing conditions.

Like a tractor trailer turning left in front of you.   

The imbecility of all this makes my teeth ache.

Vehicular autopilot is often likened to autopilot in commercial airplanes, but the parallel doesn’t parse. Airplanes don’t just take off and fly wherever, however. Their flight plans are filed in advance and strictly adhered to; their course, speed and altitude are strictly monitored the entire time. Spur-of-the-moment deviation is not allowed. The airspace is controlled at all times to keep one airplane away from another. Pilots have very little latitude to control their aircraft’s flight path.

And that’s what we’re really getting at here.

Control.

Autopilot in cars makes sense only if all cars are similarly under control: if you had to file a “flight plan” before you went anywhere, and your course were monitored and subject to control the entire time, then it might be possible to avert incidents such as the one described above. The auto-driving Tesla would have known about the big rig’s intention to turn long before the turn ever occurred, and accommodations could have been made by each vehicle’s computer brain.

Lovely, if you don’t mind the idea of no longer being in control of your vehicle.

Ever.

For automated cars to be “safe,” it is necessary – mandatory – that the caprice of human drivers be taken entirely out of the picture. Else, the vagaries of human imperfection will lead to accidents – and that is not “safe” and so Uncle will step in.

And human imperfection behind the wheel continues to worsen as less and less is expected of these human drivers as drivers – as even former basic competences, like the ability to parallel park a car, are no longer required because technology can handle that now. Every incremental dumbing-down – starting with ABS back in the 1980s and running through to the present day’s cars that automatically brake, that can come to a complete stop without the “driver” even touching the brake pedal – has been at least tacitly an effort to get the driver out of the driver’s seat.

To render him a passenger.

It is possible that Tesla and Google and the rest of the juggernaut don’t consciously grok the fact that what they are pushing requires the driver to become a passenger. You can’t, on the one hand, fit cars with systems that invite the driver to stop driving – and at the same time expect him to “remain alert.”

You’re either a driver.

Or you’re not.

What’s it going to be?


15 Comments
Peaceout
July 2, 2016 4:53 pm

Auto pilot cars ES. Tesla ES. Elon fuckin’ Musk ES.

Anonymous
July 2, 2016 5:09 pm

So maybe it wasn’t an accident, maybe the car just decided it didn’t like him and whacked him?

Those things don’t have any sense of self or self preservation so it wouldn’t have worried about getting itself killed as well.

RT Rider
July 2, 2016 6:41 pm

1. Anyone who trusts a computer to drive his car against oncoming traffic is a fool.
2. Anyone allowing a computer to drive his car while watching a movie, and gets killed, deserves the Darwin award, posthumously.
3. Elon Musk is perfect for the times, a grifter and parasite who has made himself worthy of P.T. Barnum’s maxim.
4. It’s been evident for years that we don’t live in a free society. Don’t pay your property taxes, lose your house. Don’t pay your income taxes, find yourself in jail.
5. In a free society, taxes are discretionary, as in consumption taxes. Don’t buy it, don’t pay tax.

Anonymous
July 2, 2016 6:49 pm

An airplane cannot take off or land “by itself.” The pilot must perform the takeoff, but he can engage the autopilot at 100′. The landing may be done on autopilot, but the human must set the automation up for the approach and subsequently configure the airplane for landing. The automation will not lower the flaps or gear, arm the autobrakes or auto spoilers, or set itself up to fly the approach. In any case the pilot should monitor the autopilot in all phases of flight, especially on final. This doesn’t include watching “Harry Potter.”

Brian Reilly
July 2, 2016 8:09 pm

Baaaah! Baaahhh! (Sheeptalk for those of you raised in the city.)

DavidC
July 2, 2016 10:21 pm

The dangerous thing about so-called auto-pilots on cars is that people who are marginal drivers are now deluded into thinking that skill doesn’t matter. It used to be that anybody who took their car for granted and was a marginal driver usually went through the windshield or otherwise perished in some form of accident. The overall skill level went up because there were no second chances. Now, even in a regular car, you see people applying make-up, reading maps or books or newspapers, texting, or otherwise acting like morons because cars have supposedly become safer. Can you imagine the fatalities once these computer gimmicks become more and more popular?

And, as a guy who will never use one of these things, if I or anyone in my family is ever in an accident caused by a moron using the auto-pilot, I will sue everyone from Musk to his company to the software maker to the computer maker to the camera maker, on down the line to the operator himself.

Watermelon Man
July 3, 2016 1:11 am

Pretty much anything can happen over in Levy County.

lysander
July 3, 2016 9:09 am

TPTB want to marginalize humans because psychopaths like them can deal with machines much easier than those annoying, pesky humans who have all those ‘needs’ that machines don’t have.

TPTB are trying to make self driving tractor-trailers so that they can eliminate 3 million decent paying jobs for blue-collar Americans.

Here’s a snippet from a Popular Science article;

“Like Olympic skiers racing in single file to reduce air resistance, two 18-wheeler trucks in Nevada recently proved that uncomfortably close convoys can save drivers fuel and money. The key, instead of bold Olympic athleticism, is robotic assistance. A computer-assisted truck was able to follow closely behind a human-driven truck perfectly, maintaining exactly 33 feet of distance between the vehicles. The promise is a future of safer, more fuel efficient, and more robotic trucking.”

Check out the rest at http://www.popsci.com/article/cars/robot-truck-convoy-tested-nevada

I wonder what TPTB plan on doing in the future with all the excess and otherwise useless humans that have been replaced by robots?
Keep us as pets? Exterminate us? This robotic world is still a long ways off, but TPTB are doing their best to make it reality.

susanna
  lysander
July 3, 2016 10:27 am

lysander,
I’m pretty sure the plan is to spray us with some lethal variant of what they are spraying us with now…anthrax-like crap. Then TSA gets to do cleanup operations. See the latest theory from Dave Hodges; I often thought the man was a kook, but then his prediction/theory/data comes true. It is crazy out there. Stay safe.
Self-driving cars are an absurd idea. And Elon gives me the (serious) creeps.

Anonymous
  susanna
July 3, 2016 10:38 am

http://bigstory.ap.org/article/81b31f2909c943b1a824a3ecb80fa40d/startup-wants-put-self-driving-big-rigs-us-highways

FWIW, this was predicted by The Simpsons and if The Simpsons predicted it, it will happen.

Stucky
  lysander
July 3, 2016 2:09 pm

The assumption in the popsci article is that the human driver will NEVER have an accident.

But, what if the human driver does … while doing 70 mph on a freeway, which is about 100 ft/sec … with a driverless truck JUST THIRTY-THREE FEET, or 1/3 of a second, behind him ……. which translates into One Giant Clusterfuk Accident.
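Stucky’s back-of-the-envelope numbers check out. A quick sketch of the arithmetic (the 70 mph speed is his illustrative figure; the 33-ft gap is from the Popular Science piece quoted above):

```python
# Following-distance arithmetic for a 33-ft platooning gap at highway speed.
MPH_TO_FPS = 5280 / 3600  # feet per second in one mile per hour

speed_mph = 70
speed_fps = speed_mph * MPH_TO_FPS  # ~102.7 ft/s, i.e. "about 100 ft/sec"

gap_ft = 33  # gap maintained in the Nevada convoy test

# Time for the trailing truck to cover the gap if the lead truck
# stops dead (worst case: a collision, not a braking maneuver).
time_to_close = gap_ft / speed_fps  # ~0.32 s, i.e. roughly 1/3 second

print(f"{speed_fps:.1f} ft/s; {time_to_close:.2f} s to cover the gap")
```

A third of a second is well under typical human reaction time, which is the commenter’s point: at that gap, only the computer can possibly respond in time.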

Stucky
July 3, 2016 11:21 am

Libtard asshole buys a Tesla. Libtard asshole dies while driving shitfuk car on autopilot.

Finally … a feel-good story on TBP!!!

jamesthewanderer
July 3, 2016 12:07 pm

Even autopilot with pre-planned routes can and will fail. You cannot automate your LIFE.

Imagine, you start out at 7:30 AM headed to work in the Nawth. When you started, a cold front just hit town, three hours earlier than expected by the weathermen. The pavement ten miles from home is now a sheet of ice, and two seconds before you found it a big rig went 90 degrees sideways across the road. Will the autopilot:
(1) Try to slam-stop you, sending you into the big rig at 40 mph sideways?
(2) Try to slow-stop you, sending you into the power pole beside the road at 50 mph?
(3) Try to turn 90 degrees and partially succeed, sending you into the river beside the road at 65 mph?
(4) Manage a perfect stop without damage or contact for your car?
I know what I believe; autopilots (beyond cruise control, which can be useful on dry, sunny Interstate highways out west) are a work of the devil.

Stucky
  jamesthewanderer
July 3, 2016 1:09 pm

Nice examples.

You’re basically saying the same thing as EP. That is, autopilot cars only work WHEN EVERYTHING (100%) IS PLANNED OUT AHEAD OF TIME. Yeah, that works well in real life. Sarc, off. The moment there is even the slightest deviation from the plan … all bets are off.

It will be a cold day in hell before I ever, ever sit my fatass in one of those deathtraps.

jamesthewanderer
July 3, 2016 2:22 pm

That’s just the mechanical side. Lavoy Finicum was ambushed when the police knew exactly where he was going on a quiet road, and died. Can you imagine the police and government knowing exactly when and where you are driving at all times? Can you imagine Google not gleefully SELLING them the information? Can you imagine how many “enemies of the people” would be “disappeared” in this scenario?
Driverless cars are bad enough; control-freak oppressive government is even worse. TOGETHER they will give us nightmares – unto death.