Sensor Technology Facilitates AI in Unmanned Aircraft Systems

Publish Date:
March 19, 2025

Among the many lessons the conflict in Ukraine has taught us is the importance of unmanned systems in modern warfare. Even before the conflict, it was clear that unmanned systems were force multipliers. However, the remarkable videos of unmanned aerial vehicles and boats delivering munitions, attacking armored vehicles and even major surface ships, as well as the deep strikes made by larger UAVs, have highlighted the importance of these systems. Whether the application is autonomous or remotely operated, artificial intelligence (AI) is playing a role in their success.

But AI demands very high performance from advanced processors to provide the computational horsepower to get the job done in real time. Its importance to modern sensor-based systems cannot be overstated.

Sensors Play a Crucial Role

As sensor technology continues to advance with improvements in resolution, sensitivity, miniaturization, and energy efficiency, AI-enabled systems are rapidly gaining access to increasingly rich and detailed information. Sensors aid in a variety of critical functions that lead to more actionable intelligence through sophisticated machine learning:

·      Data acquisition: Raw sensor data provides the essential inputs that AI systems process and learn from (e.g., cameras, microphones, environmental sensors, accelerometers).

·      Context awareness: AI systems use sensor data to understand the context in which they're operating, thus generating more appropriate and adaptive responses.

·      Environmental assessment: Sensors generate rich information about the surrounding environment, which AI-based systems use to build awareness of their surroundings. With that awareness, a system can respond to and manage interaction, navigation, and avoidance, which is especially critical in autonomous systems.

·      Feedback mechanisms: When sensors provide feedback on actions taken, AI systems can apply better reasoning to reinforce learning and implement corrective behaviors.
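The four roles above can be sketched as a single processing loop. The following is an illustrative Python sketch, not code from any real UAS framework; the sensor, model, and context objects are stand-ins invented for this example:

```python
import random

class SensorFeed:
    """Stand-in for a real sensor driver (camera, IMU, RF receiver, etc.)."""
    def read(self):
        # Data acquisition: raw samples are the inputs the AI system works from.
        return {"signal": random.random(), "altitude_m": 120.0}

class PerceptionModel:
    """Stand-in for a trained network; returns a (label, confidence) pair."""
    def infer(self, sample):
        return ("vehicle", 0.6 + 0.4 * sample["signal"])

def control_step(feed, model, context):
    sample = feed.read()                   # data acquisition
    label, conf = model.infer(sample)      # environmental assessment
    # Context awareness: the same detection means different things
    # depending on the mission phase the platform is in.
    action = "track" if context["phase"] == "surveil" else "avoid"
    # Feedback mechanism: record outcomes so later training can
    # reinforce good behavior and correct errors.
    context["history"].append((label, conf, action))
    return label, conf, action

context = {"phase": "surveil", "history": []}
feed, model = SensorFeed(), PerceptionModel()
label, conf, action = control_step(feed, model, context)
print(label, action, len(context["history"]))  # vehicle track 1
```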

Applying AI in the Sensor Domain

Integrated embedded computing hardware in sensor payloads and vehicle control systems is an integral part of enabling AI in today’s military systems. In the area of unmanned aircraft systems (UAS), AI is opening an even greater expanse of new possibilities as the technology advances, helping to create efficient, computationally intensive data processing pipelines.

There are numerous opportunities to apply AI-based computing techniques in every sensor modality, including radar, EO/IR, electronic warfare, and SIGINT. All of these require pattern recognition at some point in the collection-to-exploitation chain, and pattern recognition is where AI becomes especially valuable.

Computational hardware for deep learning tends to be power hungry, however, and generates heat. UAS platforms are often extremely SWaP sensitive, so mitigating power demands and thermally intensive hardware needs consideration.

AI-oriented hardware is proving extremely useful in this situation: lower-power devices, such as GPGPUs and specialty system-on-chip parts designed specifically to serve as deep learning engines, optimize the heat profile and power needs of UAS platforms. In addition, other low-power components are rapidly becoming available for deployment on severely SWaP-constrained platforms.

Implementing A Deep Learning Strategy

There are a few aspects that should be addressed when applying a deep learning methodology to UAS.

Training Data. Deep learning does not rely on explicit coding; rather, the networks are trained. As a result, deep learning requires huge amounts of tagged training data, and the quality of that data matters as well – as in all computing, garbage in results in garbage out.

Deep learning can only work with what it has been trained on. For instance, if a network is trained to recognize horses, cows, and pigs, and the system is then presented with images of a giraffe and a hippopotamus, it may tag them as a horse and a pig, but most likely with low probability.

To overcome this, the two low-probability identifications could be recorded for later analysis, and the network updated either through explicit additional training or by gradual training based on real-world data collection.
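This record-and-review step can be sketched as a simple confidence gate. The Python below is a minimal illustration under assumed names and an assumed threshold value; in practice the threshold would be tuned per mission and sensor:

```python
REVIEW_THRESHOLD = 0.75  # assumed value, tuned per mission in practice

def triage(detections, threshold=REVIEW_THRESHOLD):
    """Split (label, confidence) pairs into high-confidence detections
    that can be acted on now and a low-confidence queue recorded for
    analyst review and later retraining of the network."""
    accepted, review_queue = [], []
    for label, confidence in detections:
        if confidence >= threshold:
            accepted.append((label, confidence))
        else:
            review_queue.append((label, confidence))
    return accepted, review_queue

# A network trained on horses, cows, and pigs, shown a giraffe and a
# hippopotamus, might emit low-confidence "horse" and "pig" labels:
detections = [("cow", 0.93), ("horse", 0.41), ("pig", 0.37)]
accepted, review_queue = triage(detections)
print(accepted)      # acted on immediately
print(review_queue)  # logged for later labeling and network updates
```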

Human Supervision. The role of user oversight or intervention often depends on the mission and the criticality of the information being processed. Typically, the question is not whether Human-in-the-Loop (HITL) is needed, but to what degree. For UAS, this is a critical piece of the AI infrastructure.

For real-time processing, HITL is absolutely mission-specific: some sensor missions could be 100% autonomous, while others will require human monitoring and involvement. The question becomes: what actions can reasonably be taken without the risk of faulty or flawed decisions? The answer tends to shift based on available inputs and defined levels of criticality.
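One way to picture this is as a gate keyed on criticality and confidence: low-criticality actions backed by strong evidence run autonomously, while anything high-stakes or uncertain is deferred to an operator. The function, criticality scale, and thresholds below are all assumptions made for illustration:

```python
def decide(action, criticality, confidence,
           auto_max_criticality=1, min_confidence=0.9):
    """Return 'execute' for actions safe to take autonomously,
    otherwise 'refer_to_operator'. Criticality is an assumed
    0 (routine) to 3 (lethal/irreversible) scale."""
    if criticality <= auto_max_criticality and confidence >= min_confidence:
        return "execute"
    return "refer_to_operator"

print(decide("adjust_gimbal",   criticality=0, confidence=0.95))  # execute
print(decide("release_payload", criticality=3, confidence=0.99))  # refer_to_operator
print(decide("adjust_gimbal",   criticality=0, confidence=0.50))  # refer_to_operator
```

Note that high confidence alone is not enough: a high-criticality action is referred to the operator regardless of how certain the network is.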

For continuous training of the network, human intervention will almost always be necessary. Although the network may be able to identify a feature of interest (e.g., a novel waveform), and over time be automatically trained to identify it, the network is unlikely to classify the waveform or determine its purpose or importance without human study, classification, and further training. Humans will still need to direct and steer the algorithms toward deeper learning, building on the data continually being input.

AI in Future Unmanned Platforms

While the AI and deep learning revolution may be in its early days, it’s clear that deep learning-based AI will find application in interesting and surprising ways. As AI becomes more sophisticated, the priorities for data-based implementations may shift. Newer, more powerful, and less power-hungry processors will emerge, and even more clever and novel applications will follow as these very powerful techniques are applied more broadly to embedded computing.

For now, though, whether the sensor processing supports surveillance or is part of the vehicle control system, it requires high-performance computing close to the sensor. This computing must work in a space, weight, and power (SWaP)-constrained environment and stand up to harsh environmental conditions such as thermal extremes, shock and vibration, and contaminants such as dust, rain, salt fog, and various vehicle fluids. It takes skill and craftsmanship to design a platform that delivers the computational performance needed while standing up to challenging environmental conditions and SWaP constraints.
