By Chris Stonor
Autonomous flight and self-driving vehicles may initially prove a psychological hurdle for the public to overcome. No pilot, no driver… come on?! Yet, once through that barrier, flyingmag.com asks a sensible question in its headline, “How will self-flying aircraft make ethical choices?”
This is an excellent, well-researched and extensive feature that examines the technical challenges autonomous flight must overcome, while charting the progress already made. To use a music analogy: not so much a three-minute pop song, more an intricate, long-form prog-rock track.
Daedalean AI is developing a sensor-based detect-and-avoid flight control system for self-flying eVTOLs (pic: Daedalean AI)
The article begins, “Let’s imagine the year is 2040. You’re a passenger in a small, autonomous, battery-powered air taxi with no pilot flying about 3,000 feet over Los Angeles at 125 mph. Air traffic is crowded with hundreds of other small, electric aircraft, flying and electronically coordinating with each other, allowing very little separation. Suddenly, an alarm goes off, warning about another passenger aircraft on a collision course. But amid heavy traffic, there are no safe options to change course and avoid the oncoming aircraft. In this scenario, how would a machine pilot know what action to take next?”
It continues, “For engineers and ethicists, this scenario is commonly known as a ‘trolley problem,’ an ethics thought experiment involving a fictional runaway trolley, with no brakes, heading toward a switch on the tracks. On one track, beyond the switch, are five people who will be killed unless the trolley changes tracks. On the switched track is one person who would be killed. You must decide whether to pull a lever and switch tracks. Should you stay the course and contribute to the deaths of five people or switch tracks, resulting in the death of just one?”
And adds, “The analogy is often used in teaching ADM [aeronautical decision-making]. Someday, aircraft controlled by artificial intelligence/machine learning (AI/ML) systems may face a similar quandary.”
Luuk van Dijk
Luuk van Dijk (no relation to Virgil), Founder and CEO of Daedalean AG, a Swiss company developing flight control systems for autonomous eVTOLs, is quoted in the piece. Among his comments, he says, “I don’t want to trivialise this problem away, but we should not have a system that has to choose every time it lands, who to kill.”
Areas covered include AI learning, the mysterious neural net, ethical piloting decisions, guidelines and ordering, as well as the merging of AI/ML with Air Traffic Control.
Well worth a look.
For more information