Software bugs and letting computers control too much

Just a matter of time…

1 Like

And none of the above addresses intentional code viruses.

Once you have a self driving car, it will become an IoT…

Another ethical issue has been bouncing about:

You are zooming along a winding mountain road in your self-driving car. As you round a sharp curve, you come upon a bus broken down in the middle of the road. Because of the rugged and remote location, your car had no prior warning. Between the rock wall on one side and a steep drop-off on the other, there is not enough room to go around the obstacle. You do not have enough time to stop, and your car knows it. Does it try its best to slow before a collision, or does it decide “the good of the many outweighs the need of the one,” namely you, and send you sailing off the cliff? Is it able to send information to “the collective” about the problem, or is your swan dive in vain? How many other cars meet the same fate before “word gets out?”

If the bus were properly equipped with V2V and a satellite uplink, as will presumably be mandated (and as your next-to-last question implies), it would already have warned everything in the area, thus negating this exercise…
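For concreteness, here is a minimal sketch of what such a V2V hazard broadcast might carry. The field names and JSON encoding are purely illustrative assumptions; real V2V systems (e.g. the DSRC Basic Safety Message) define their own binary formats, and nothing here reflects an actual standard.

```python
import json
import time

def make_hazard_message(vehicle_id, lat, lon, hazard_type):
    """Build a hypothetical V2V hazard broadcast (illustrative only)."""
    return json.dumps({
        "sender": vehicle_id,          # assumed identifier scheme
        "timestamp": time.time(),      # seconds since epoch
        "position": {"lat": lat, "lon": lon},
        "hazard": hazard_type,
    })

# The broken-down bus from the scenario announces itself:
msg = make_hazard_message("bus-42", 39.1, -106.4, "disabled_vehicle_blocking_road")
```

Any approaching car that receives the message before the blind curve could begin slowing long before its own sensors see the bus.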
:wink:

Now, back to Corvairs and “swing-axle jacking”

oh wait.
wrong thread…

You are zooming along a winding mountain road in your self-driving car. As you round a sharp curve, you come upon a bus broken down in the middle of the road. […]

Asimov was way ahead of the game on this one.

2 Likes

You will likely see a ban on any non-autonomous vehicle shortly after these self-driving cars take off. Discussions of such legislation are already appearing in industry journals.

Long term, perhaps, but I don’t think it will happen in my lifetime (assuming 60 or so more).

I would hope you're right, but I doubt it. I think early large-scale deployment of self-driving cars will produce many accidents involving non-self-driving cars, and those manual cars will be deemed at fault (after all, computers are perfect). It isn't a stretch at that point to claim that we need to eliminate manual vehicles for safety. I think this will occur first on major highways and gradually spread from there.

Sad thing, is the above scenario is already being discussed in transportation planning and engineering conferences…

Obviously, you run closer to these circles than I do, but I cannot imagine this scenario has NOT been kicked around, though perhaps not in any “official” capacity, in traffic planning and engineering conferences for at least one or two decades, and yet here we sit, with no clear pathway forward…

And I agree with you. The accelerated rate at which Americans have been willing to hand over their freedoms in recent years makes me think you're exactly right, and it will come faster than people like me think, let alone hope.

1 Like

I first sat in on a lecture on the law associated with the subject at an APA conference in the 1980s. But these are not the folks driving the technology. They are just the folks planning to use the government to regulate and control it, so the time frame is not up to them. Just as they try to regulate and control most aspects of our transportation system.

FYI, one of the hot topics for the last few years has been adding a mileage tax to vehicles to account for the improved fuel economy of modern cars and the resulting decrease in gas-tax revenue. I believe bills on this have already been discussed in several state legislatures, including Texas.
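The revenue erosion is simple arithmetic. Using assumed figures (a $0.20/gallon state tax and 12,000 miles per year; actual rates vary by state), doubling fuel economy halves the tax collected per car:

```python
def annual_gas_tax(miles_per_year, mpg, tax_per_gallon):
    # Gallons burned per year, times the per-gallon tax.
    return miles_per_year / mpg * tax_per_gallon

# Illustrative assumptions, not any state's actual rates.
MILES = 12_000
TAX = 0.20  # dollars per gallon

old_car = annual_gas_tax(MILES, 25, TAX)  # 25 mpg -> about $96/year
new_car = annual_gas_tax(MILES, 50, TAX)  # 50 mpg -> about $48/year
```

A per-mile tax would collect the same amount from both cars, which is precisely its appeal to legislatures.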

1 Like

From the Science article:
We found that participants in six Amazon Mechanical Turk studies approved of utilitarian AVs (that is, AVs that sacrifice their passengers for the greater good) and would like others to buy them, but they would themselves prefer to ride in AVs that protect their passengers at all costs.

1 Like

Yep, yet another version of NIMBY…

In the end, the truck driver will end up being charged for not clearing the lane before proceeding.

In the truck driver's defense, the Tesla driver wasn't paying attention, as the notices in an Autopilot-enabled car tell you to do.

I also believe that the car isn't at fault, because at a low angle the sun blocks out most, if not all, of the windscreen in most vehicles. At that angle I wouldn't have been able to see the truck either, even if I were in one of our other trucks.

If you can't see, you're supposed to slow down and proceed with caution. But most relevant to this discussion is the quote concerning the radar, which should have detected the truck even if the sun was blinding…

In a tweet, Tesla CEO Elon Musk said that the vehicle’s radar didn’t help in this case because it “tunes out what looks like an overhead road sign to avoid false braking events.”

The radar in question still wouldn't have detected the truck, based on its location on the car and its range. It would have seen the truck's wheels, and if the trailer had a skirt (for lack of a better term), the car would have stopped.

Wheels should be enough to cause the software to trigger a collision warning, but the car's maker seems to be saying this was a bug. What is a bug is that the camera's detection of sun glare didn't cause the car to slow until a clear path could be determined, just as a driver is supposed to do.
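The kind of filtering Musk described can be sketched as a toy classifier. The threshold, the feature set, and the logic below are guesses for illustration only, not Tesla's actual code; the point is that a stationary return sitting well above bumper height looks the same whether it is a bridge sign or the side of a crossing trailer.

```python
def classify_radar_return(height_m, moving):
    """Toy illustration of 'tuning out what looks like an overhead
    road sign'. Threshold and logic are assumptions, not Tesla's code."""
    OVERHEAD_THRESHOLD_M = 1.0  # assumed cutoff above the radar's mount height
    if not moving and height_m > OVERHEAD_THRESHOLD_M:
        return "ignore"    # treated as an overhead sign or bridge
    return "obstacle"      # low or moving: brake-worthy

# A trailer side ~1.2 m above the radar, nearly stationary across the
# car's path, gets filtered exactly like a sign would.
```

This is why suppressing false braking events and detecting high crossing trailers pull in opposite directions: both present as tall, stationary returns.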

No, it should have triggered Autopilot to disengage and forced the driver to take over.

1 Like

Do you know of any published specs on the radar unit in a Model S? I couldn't find anything official on its field of view, range, or limitations.

Apparently its limitation is seeing a tractor trailer.

4 Likes

I've seen this image in a number of articles (such as the source article). The green cone is the radar, the blue cone is the camera, and the yellow circles are the ultrasonic sensors. The very limited perspective available to the car suggests lane-keeping rather than the broader capability the term “autopilot” implies.

How far down the road it can see and the precise swept areas are not something I’ve seen myself either.
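Absent published numbers, coverage questions can at least be framed geometrically: a forward sensor is roughly a cone defined by a maximum range and a half-angle. The sketch below checks whether a point falls inside such a cone; the 150 m / 10° figures in the example are assumptions, not Model S specifications.

```python
import math

def in_sensor_cone(x, y, max_range_m, half_angle_deg):
    """Is a point (x metres forward, y metres left) inside a
    forward-facing sensor cone? Callers supply their own range and
    angle assumptions, since official figures aren't published."""
    dist = math.hypot(x, y)
    if dist == 0 or dist > max_range_m:
        return False
    bearing = math.degrees(math.atan2(y, x))  # 0 = straight ahead
    return abs(bearing) <= half_angle_deg

# With an assumed 150 m / 10-degree radar cone, a target 100 m ahead
# and 5 m to the side is visible; one 40 m to the side is not.
```

Even a generous cone like this shows why a lane-keeping sensor suite sees little of cross traffic until it is nearly in the car's own lane.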

Does anyone have any idea what the power output of these “radar” emitters is? What is considered a safe level? It would be cumulative for all the vehicles in a given area; imagine being on the highway with traffic stopped and everyone's car beaming everyone else's. I'm sure the output is pretty low, but I wonder how long before cancers are associated with people living or working near roads.
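Whatever the actual emitter power (automotive radar output is regulated and quite low), the exposure math is just the inverse-square law: power density falls off as 1/r², and contributions from multiple emitters add. The 1 W effective radiated power below is an illustrative assumption, not a measured Tesla figure.

```python
import math

def power_density_w_per_m2(eirp_watts, distance_m):
    # Free-space power density from a point source: S = EIRP / (4*pi*r^2).
    return eirp_watts / (4 * math.pi * distance_m ** 2)

# Assume a 1 W effective radiated power (illustrative figure only).
s_at_2m = power_density_w_per_m2(1.0, 2.0)  # roughly 0.02 W/m^2
# Power density from nearby cars simply sums; ten of them at the
# same distance gives ten times the density.
s_total = 10 * s_at_2m
```

Doubling the distance quarters the density, so exposure drops off quickly even in stopped traffic.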

Look at high-power transmission lines and cell phones, which are accused of causing all kinds of diseases. I'm guessing that with all the folks carrying phones in their back pockets, a pending epidemic of rectal cancer is looming in the future … along with the brain cancer from when it's not in your pocket. Faraday knickers may be the next big thing in safety wear.

I'm sure someone will find that the ultrasonic emitters confuse bats, dolphins, and some endangered insects, hence they're “bad,” negating any positive benefits electric cars have. Of course, people driving their own cars will be “deniers.”

Tort lawyers, environmental lawyers, and government regulators must see a future in which their fortunes are on autopilot: a major growth industry that will flourish like locusts of biblical proportions.

I wish my satire were not so likely to become true. Hopefully, the technology will at least improve to the point it can reliably see a tractor trailer.