::: nBlog :::
We’ve recently put considerable effort into our Spime Enablement Services (SES) framework, which, among a few other interesting things, is now able to control our semi-autonomous drone Ohan-1. The local drone program handles avionics and real-time control, while being continuously updated by the SES master we call Sandbox. Mission planning and longer-term weather adaptation happen entirely on the cloud side, backed up locally to withstand connectivity glitches.
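The local/cloud split can be sketched roughly as follows. This is an illustrative sketch only, with hypothetical names (`MissionCache`, `current`, the 30-second staleness window): the drone keeps the latest cloud-issued plan cached and keeps flying it when the SES link drops.

```python
import time

class MissionCache:
    """Keeps the most recent mission plan received from the SES Sandbox.

    Hypothetical sketch: the real SES protocol and data model are not
    shown here, only the cache-and-fallback idea.
    """
    def __init__(self, default_plan):
        self.plan = default_plan
        self.last_update = time.monotonic()

    def update(self, plan):
        # Called whenever the cloud side pushes a fresh plan.
        self.plan = plan
        self.last_update = time.monotonic()

    def current(self, link_up, timeout_s=30.0):
        # While the link is up and the plan is fresh, we are in sync
        # with the cloud; otherwise we keep flying the cached plan.
        stale = (time.monotonic() - self.last_update) > timeout_s
        if not link_up or stale:
            return self.plan, "local-fallback"
        return self.plan, "cloud-synced"
```

The point of the design is that the drone never blocks on connectivity: the cached plan is always available, and the mode flag only tells the avionics how fresh it is.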
The Ohan-1 can take off autonomously and feed continuous video to the SES Sandbox, which in turn uses an adjacent machine learning service to recognize, for example, a car license plate. The SES Sandbox can then instruct the drone to follow a particular car and provide 360-degree imagery and relative positioning.
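The Sandbox-side loop amounts to: pass each frame to the recognition service, and when the target plate appears, relay a follow command to the drone. A minimal sketch, with all names hypothetical (`recognize_plates`, `send_command`, the command dictionary shape):

```python
def track_target(frames, recognize_plates, send_command, target_plate):
    """Watch a stream of frames for a target license plate.

    recognize_plates(frame) -> list of (plate_text, bbox) pairs,
    standing in for the adjacent ML service.
    send_command(cmd) relays an instruction back to the drone.
    Returns True once the target has been found and a follow
    command issued, False if the stream ends without a match.
    """
    for frame in frames:
        for plate, bbox in recognize_plates(frame):
            if plate == target_plate:
                # Ask the drone to orbit the vehicle for 360-degree
                # imagery and relative positioning.
                send_command({"action": "follow", "bbox": bbox})
                return True
    return False
```

In practice the recognizer and the command link are network services, but the control flow is the same: recognition happens in the cloud, and only compact commands go back down to the drone.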
While this is a fairly straightforward use of AI and machine learning, it has required a lot of fine tuning and elimination of false positives. In the end, the machine learning framework is almost entirely customized for this application.
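One common way to suppress false positives in a recognition stream (a hypothetical sketch, not our exact tuning) is to require the same plate to be seen in several consecutive frames, above a confidence threshold, before accepting it as real. The names and thresholds below are illustrative:

```python
from collections import defaultdict

def confirmed_detections(frame_results, min_conf=0.8, min_streak=3):
    """Filter a stream of noisy per-frame recognitions.

    frame_results: iterable of per-frame lists of (plate, confidence).
    Yields a plate once it has been recognized with confidence >=
    min_conf in min_streak consecutive frames. Low-confidence or
    one-off recognitions never surface.
    """
    streaks = defaultdict(int)
    for detections in frame_results:
        seen = {plate for plate, conf in detections if conf >= min_conf}
        # Any plate missing from this frame loses its streak.
        for plate in list(streaks):
            if plate not in seen:
                streaks[plate] = 0
        for plate in seen:
            streaks[plate] += 1
            if streaks[plate] == min_streak:
                yield plate
```

The trade-off is latency versus precision: a longer required streak cuts spurious matches at the cost of reacting a few frames later.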
Technological successes like this easily make us think that many seemingly mundane tasks like controlling sovereign airspace will soon be handled by AI, robots, or even smart missiles, all much cheaper than human-controlled fighter aircraft. However, humans still outperform AI by far in unexpected and changing situations, and reconnaissance and intercept flights are prime examples of exactly those.
It is true that guided missiles are now very fast, cheap, and almost impossible to counter, and can be used to enforce borders during wartime operations. During a crisis or peacetime, however, it’s a different story. The missile has one goal: to ensure the destruction of the enemy object. Unlike a human pilot, it does not assess the situation or conduct international diplomacy when it meets the foreign object. Firing a missile at a target, which may also be a lost civilian aircraft, is immediately an act of war. Drones, especially when dependent on a wideband comms link back to the command center, have similar issues when the connection is disrupted, accidentally or intentionally.
Given the complexity of the intercept situation and the difficulty of obtaining AI training data for it, I foresee that it will take a very long time before we have autonomous AI interceptor aircraft. Before that, AI has to learn a lot about human unpredictability and diplomacy. The latter is still what makes us uniquely human. AI will, though, complement many things around us.