“The accident served to galvanize public interest…in public transit. It also, of course, drew attention to its dangers”
~ FT, describing an 1830 railroad fatality
You are about to cross a street, using the designated crosswalk. Given that you have the right of way, do you:
- Proceed. You have the right of way.
- Look both ways. Then proceed.
- Proceed only when you are sure the oncoming traffic is actually slowing down.
- Continue texting while listening to your playlist.
Unless you’ve decided to completely unplug from the world around you, you know exactly why I’m asking this question.
Uber. Tempe, Arizona, March 18. It was the shot heard round the world: “Robot Cars Test Humans” was the latest headline on the tragic event.
“Test”? How about “Run Over”? That would be closer to the truth.
You may never approach a crosswalk the same way again. You should never approach a crosswalk the same way again. Unless, that is, your approach has consistently been to assume that no oncoming car will stop… until it actually stops.
There’s a Spanish proverb that should govern all such activities: “Among the safe courses, the safest of all is to doubt.” If it did, the world would be a much safer place to live and work.
Sadly, it does not. People routinely put themselves in harm’s way, making and relying on assumptions that are later proven wrong.
Uber: The Perfect Example
In another of a long list of ironies, the word “uber” comes from the German word meaning the biggest and best example. Uber’s first-of-its-kind event – a collision involving one of their driverless cars and a pedestrian – is likely to be one of those “uber” examples people will be talking about a century from now.
“The risk of introduction of new technology” was how the Financial Times framed the situation, citing in the lead the first rail passenger fatality on the new line between Liverpool and Manchester. In 1830.
Another similar uber example.
Nearly two centuries removed, the current event is stirring up all kinds of discussion and debate about automation, safety, field testing of new products, and the relative contribution of robots versus humans to reliability and safety. As well it should. Testing has been halted in Arizona; that, at least, was a decision.
You might have already engaged in the debate over a cup of coffee: Humans are lousy drivers, easily distracted, inclined to speeding, and sometimes drive under the influence. Robots don’t suffer from any of those problems.
If machines never failed in service, it would be game, set, match. Turn the keys over to R2D2, and take a comfortable back seat.
You might. But I am not about to buckle into seat 13B on the next flight to Keokuk, Iowa on the basis of that theory. I want to see two humans sitting up on the flight deck, as they say in the safety briefing, “in the unlikely event that…” the flight computer goes down.
But Wait, There’s More
Obviously there was some kind of computer glitch that prevented the Uber vehicle from detecting the presence of a human in the crosswalk. It sailed right through the intersection at 40 mph without blinking. Eventually the computer programmers and robotics engineers will figure out the cause and come up with the fix. You can do that with a machine. The fix will be a small step, at the cost of human blood and treasure, but at least some progress will be made.
It doesn’t always happen that way. Of late, when the National Transportation Safety Board has investigated similar kinds of failures where the human (not the computer) was at the wheel – e.g., the Midland and Philadelphia rail accidents – they just threw up their hands, as if to say, “When it’s people at the wheel, stuff just happens.”
Btw, I’m predicting the same finding in the recent Washington State derailment, where a train going way too fast came off the tracks and fell onto I-5. Stay tuned.
As to making progress, the industry safety experts will point to technology as the way forward. Spend more money on some new gadget to prevent these kinds of things from happening. In railroading, Positive Train Control is the fancy name for taking control away from the engineer and giving it to a computer.
It might be an improvement, but, once the train leaves the station, it’s not a guarantee.
What Are The Odds?
Back in 1966, when I was taking Driver Ed in high school, our instructor had a brake pedal on the passenger side of the car, just in case some sixteen-year-old student driver with a learner’s permit drove… like a sixteen-year-old driver with a learner’s permit. Ironically (sorry to use that word again, but this story’s full of them), there was a backup driver sitting behind the wheel of that Uber vehicle in Tempe. Called a “Safety Driver,” the human’s role was to be at the ready should the computer fail to detect a hazard or drive the vehicle safely.
Executing that role properly demands constant vigilance by the Safety Driver. No telling where or when the computer driving the vehicle might miss something. Just like the Driver Ed instructor. A former Safety Driver explained it perfectly: “The computer is fallible, so it’s the human who’s supposed to be perfect. It’s kind of the opposite of what you think about computers.”
The dash cam in the vehicle showed the Safety Driver looking down in the seconds before the collision. A robot would never do that.
So, the technology failed, and the backup system known as the Safety Driver failed at the same time. In statistics, when two failures are independent, there’s a simple formula for the odds of both happening at once: multiply their individual probabilities.
They are not zero. Nor will they ever be zero.
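The multiplication rule is simple enough to sketch in a few lines. The failure rates below are made-up, purely illustrative numbers (no real Uber or aviation data), chosen only to show why the product of two small probabilities is tiny but never zero:

```python
# A minimal sketch of the multiplication rule for independent failures.
# Both rates are hypothetical, for illustration only.

p_computer_miss = 1e-4   # assumed chance the automation misses a hazard
p_driver_lapse = 1e-2    # assumed chance the human backup is distracted

# If the two failures are truly independent, the chance of both
# happening at the same moment is the product of the two probabilities.
p_both_fail = p_computer_miss * p_driver_lapse

print(f"P(both fail together) = {p_both_fail:.0e}")
```

Note the catch: the formula assumes independence. If the automation’s reliability lulls the human into looking down at a phone, the two failures are no longer independent, and the real odds are worse than the product suggests.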
Equipment failure and human failure: two problems every operations leader on the planet is intimately familiar with. All the Six Sigma and Human Factors design work in the universe will never eliminate either.
Which means things will still go wrong, just less often and less severely, assuming that design is based on solid engineering and people management principles. That’s simply being honest about the way things really are. In matters of safety, there is no Zero Risk.
Of the things that matter to safety, risk is the most misunderstood and misapplied. If it were otherwise, no one would ever enter a crosswalk without first seeing the whites of the oncoming driver’s eyes.
That’s my rule in these situations, and yes, I know it’s becoming problematic. Half the year I live on a busy street in a city chock full of safe spaces known as Pedestrian Crosswalks. Guess what: if there’s a driverless car approaching, I’m not crossing. Heaven can wait.
Think otherwise, and you’re setting yourself up as a potential victim for what is known in Economics as Moral Hazard.
There are those who freak out whenever economics and safety are mentioned in the same sentence: “When people’s lives are at risk, don’t you dare bring up money!” That I understand. But if you want to really understand safety, there’s a lot to be learned from the dismal science, better known as economics. One very useful thing is to understand Moral Hazard.
In the aftermath of the sub-prime mortgage meltdown (check the return on your 401k plan, and you’ll know the year), the term Moral Hazard was used to describe the root cause of the problem: when people think there is no risk of losing money, they will put money into bad investments.
And did they ever! Some of the smartest people left in the room after the demise of Enron took their clients, companies, and in some cases, their own bank accounts down for the count, buying mortgages that would never be repaid. Even innocent people – like you and me, who owned nothing even remotely related – suffered. Collateral damage is what that’s called.
As to the hazard part, a hazard is a source of danger; something that can hurt you. In safety, it’s physical harm. For investing, the harm comes in the form of losing your hard-earned money. Both hurt, albeit in different places.
As to why “moral” was hung out front: that’s because people who meant well did their dead-level best to keep anyone from feeling any pain or suffering from their investment decisions. Guarantees made buying mortgages a “safe space.” No risk, no worries.
Just like a pedestrian crosswalk: No risk, no worries. Right?
The Last Word
Lost in all of the post-incident reporting, discussion, and debate is the “innocent victim,” who walked a bike across a pedestrian crosswalk at night with an oncoming car approaching at 40 mph. She bet her life that the driver would yield the right of way, and stop.
An uber example of the Moral Hazard of safe spaces, Zero Risk, and the value of that Spanish proverb: “Among the safe courses, the safest of all is to doubt.”