
In The News

Human Machine Teaming May be the Future of Drones in the Defense Industry

Originally published to DroneLife

Next-Level Human-Machine Teaming

By: Dawn Zoldi (Colonel, USAF Ret.)

Since the beginning of warfare, combatants have sought ways to extend their reach while removing themselves further and further from the battlefield. Distance can mean the difference between life and death for a soldier, sailor, airman, Marine or Guardian.

Robots extend human reach and possibilities.

For this reason, the U.S. Department of Defense (DoD) has embraced the concept of human-machine teaming (HMT) and continues to invest in multi-domain autonomous platforms, including drones, to aid the warfighter. Those drones must be secure, reliable, interoperable and, ideally, future-proof. These requirements extend to all components, including the operating system (OS) and autopilot. Two companies driving the next generation of NDAA-compliant sUAS, XTEND and ModalAI, have partnered to provide the required elements for reliable, secure, next-level human-machine collaboration on the battlefield.

Designing For Success

Autonomous drones play a significant role in today’s conflicts. From a distance, drones provide intelligence, surveillance and reconnaissance (ISR) to warriors, relaying crucial information about enemy locations, whether over the hill or in a remote cave. They augment their human operators, delivering critical goods to dangerous locations (e.g., blood to the front lines), detecting dangers (e.g., hidden landmines) and rapidly resupplying units. In the right scenarios, drones can conduct life-saving missions.

These systems operate using clear, repeatable rules based on unambiguous sensed data. In accordance with current DoD policy, a human remotely guides them through the operational task life cycle. This ensures that commanders and operators can exercise appropriate levels of human judgment in any given mission. It also requires that military organizations responsibly design, develop, deploy and use these advanced capabilities.

Responsible design includes ensuring that such systems remain safe, secure and effective so as to avoid unintended consequences. This demands incorporating appropriate safeguards, and humans in the mix, to mitigate risks of serious failures or incidents.

As a result, experts predict that the DoD’s biggest investments in research and development (R&D) will flow into improvements in HMT and machine intelligence. But effective HMT is about much more than just the automated platform; it’s about the entire system.

The Whole and Its Parts

On the civil side, drone regulations distinguish between the uncrewed aircraft (UA) itself and the uncrewed aircraft system (UAS). The UA, according to these regs, “means an aircraft operated without the possibility of direct human intervention from within or on the aircraft.” A small UAS includes not only the UA itself, but “its associated elements (including communication links and the components that control the small unmanned aircraft) that are required for the safe and efficient operation…”

So, too, HMT is about much more than just partnering a person with the right machine. It’s about the components in and on that machine and how humans interact with them. An entirely new R&D discipline has arisen around the concept of systemic human-robot interaction (HRI). Its focus areas, relevant to military autonomous systems, include, among other things, how humans and machines communicate.

In machines, operating systems (OS) provide the key to this communication and to effective HMT. An OS manages all other applications and programs on a machine’s computer and enables applications to interact with the computer’s hardware through a designated application programming interface (API). The OS manages hardware resources (e.g., CPU, memory), runs applications to enable user interactions and provides a user interface, usually a graphical user interface (GUI), through which the user interacts with the computer.

For drones and robots, several other critical systems come into play, such as a core autopilot and an on-board companion computer that integrate a flight controller, CPU, video encoder, GPU, Neural Processing Unit (NPU) and electronic speed controllers (ESCs).

To tackle current design challenges, the Defense Innovation Unit’s (DIU) Blue UAS Framework sources the type of safe, secure and reliable components that DoD requires for its drones. DIU on-ramps commercial off-the-shelf (COTS) systems through its Cleared List and, as part of its Foundry, engages with companies to modify their tech to meet DoD standards. As part of these efforts, DIU matches plug-and-play small drone components with systems on its Cleared List. Having a “Blue” component integrated into one’s system increases the chances of successful employment within DoD channels.

Two autonomous drone companies, XTEND, creator of the immersive XTEND Operating System (XOS), and ModalAI, a California-based startup that builds the VOXL® family of autopilots, have joined forces to take HMT to the next level.


A Winning Combination

XTEND was founded in 2018 by Aviv Shapira (CEO) and Rubi Liani (CTO), who originally planned to develop a mixed-reality game with drones. Liani had previously founded Israel’s drone racing league, and Shapira brought significant augmented reality/virtual reality (AR/VR) experience to the table.

The co-founders discovered their game use case for drones could also be applied to the defense industry. In short order, they secured a contract with the Israel Defense Forces to provide revolutionary human-guided autonomous machine systems to enable any operator to perform extremely accurate maneuvers and actions, in any environment, with minimal training.

In just five years, XTEND has grown to over 100 employees, with offices in the U.S. and Singapore in addition to its headquarters and an R&D center in Israel. It produces three models of unique human-guided drone systems, built in the thousands: the Griffon counter-UAS drone (its first offering); the Wolverine, a multi-mission workhorse that can be outfitted with a claw or other tools, and the related Wolverine ISR lightweight outdoor drone; and the XTENDER micro tactical ISR drone, which can operate in tight, GPS-denied spaces. The company’s keystone product, the XOS, powers all of its drones.

The XOS provides a unified core across all of these platforms, offering advanced capabilities that include:

  • AR GFX SDK – Enables adding real-time augmented-reality 3D graphics from various external data sources via the SKYLORD™ API/SDK.
  • Robust, distributed OS architecture – Allows integration with multiple aerial and non-aerial platforms with minimal configuration.
  • Dynamic payload API – Enables adding physical payloads with variable configurations and data connectivity, transforming SKYLORD™ platforms into highly adaptable tool sets for variable situations.
  • ML-based dynamic sensor fusion – Proprietary, machine-learning-based sensor fusion that allows SKYLORD™ drones to operate with great spatial accuracy in complex, dynamic environments under variable lighting conditions.

XOS works alongside XTEND’s autonomous drones and handles the full pipeline from the human operator, through his or her mission decision, to the final action. One operator can use all of these drones together and switch between them. For example, soldiers have used the Wolverine to carry an XTENDER drone to the door.

According to Shapira, “XTEND tries to enable humans by providing robots that can act autonomously in life-threatening situations.” He continued, “These tools enable users to complete more complicated tasks by combining human discretion and machine autonomy. In a military use case, you can send a drone to complete a task instead of a human. This saves lives.”

XTEND sought a computing platform powerful and secure enough to be worthy of their human-guided drone ecosystem of products. They discovered ModalAI, and the company’s VOXL 2 autopilot, through their connections with DIU.


ModalAI announced its VOXL 2, with more artificial intelligence (AI) computing capability than any other similar product globally. DIU partially funded the development of VOXL 2 to advance domestic autopilot capabilities, as part of its Blue UAS Framework 2.0.

Weighing only 16 grams and powered by the Qualcomm® Flight RB5 5G platform, VOXL 2 integrates a PX4 real-time flight controller, a state-of-the-art CPU, video encoder, GPU and Neural Processing Unit (NPU), with ModalAI’s open VOXL SDK.

The VOXL SDK comes complete with the autonomous behaviors required to fly safely and reliably beyond visual line of sight (BVLOS) and to avoid obstacles. The included mapping and planning software plans a route for a given desired trajectory, mapping and navigating around obstacles to achieve the best path. A Collision Prevention parameter sets the minimum allowed approach distance. The technology also provides robust support for 4G/5G-based BVLOS flight.
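As a rough illustration of how that minimum approach distance might be set in practice, the sketch below uses MAVSDK-Python to write PX4’s collision-prevention distance parameter (CP_DIST) on a PX4-based flight controller such as the one VOXL runs. The UDP endpoint and the two-meter value are illustrative assumptions for a bench setup, not VOXL SDK specifics.

# Minimal sketch: set PX4's collision-prevention minimum approach distance.
# Assumes a PX4-based autopilot reachable over MAVLink at the endpoint below;
# the endpoint and distance value are illustrative, not VOXL SDK defaults.
import asyncio
from mavsdk import System

async def set_min_approach_distance(distance_m: float) -> None:
    drone = System()
    await drone.connect(system_address="udp://:14540")  # illustrative endpoint

    # Wait until the autopilot is discovered before touching parameters.
    async for state in drone.core.connection_state():
        if state.is_connected:
            break

    # CP_DIST is PX4's collision-prevention parameter for the minimum
    # allowed distance to obstacles, in meters.
    await drone.param.set_param_float("CP_DIST", distance_m)

if __name__ == "__main__":
    asyncio.run(set_min_approach_distance(2.0))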

Liani recalled, “At XTEND, we are focused on software. We didn’t know how to minimize drone hardware into a 16g autopilot. With Raspberry Pi we need to create the carrier board; with Jetson we need to build the drivers. VOXL gave us everything we needed, including the cameras, because ModalAI designed it to simply be popped into a drone. It was a perfect fit.”

Other considerations that XTEND weighed in entering this partnership included that VOXL 2 is NDAA-compliant, has the right size, weight and power (SWaP), offers pre-integrated algorithms and GPS-denied capabilities designed specifically for drones, and could easily integrate into XTEND’s products. ModalAI’s support and communications efforts factored in as well; the company provides an active forum for its users.

Chad Sweet, CEO of ModalAI, explained, “Hardware is hard. At ModalAI, we’ve already done the difficult work for you. We’ve thoughtfully engineered autonomous capabilities into a small package which is ready to integrate with your software. This can cut your development time in half.”

Elevating Performance

Now that XTEND’s XOS runs on VOXL 2, it can handle an unprecedented amount of onboard processing. This enables localization, GPS-denied navigation and swarming with XTEND’s drones. Conversely, because XOS is compatible with VOXL, other OEMs already in the ModalAI ecosystem can add XOS to their fleets.

This combination of best-in-class components not only reduces training time for operators, it also reduces the barrier to entry for any organization that wants to employ autonomous drones in dangerous environments…and take human-machine teaming to greater heights.

Liani said, “The interaction between the human and the machine is very important. With XTEND’s XOS and ModalAI’s VOXL 2, you can put the human in the loop and let the machine do the dangerous work. In terms of productivity, the resultant equation is 1+1=3.”

ModalAI Launches 11g VOXL® 2 Mini to Advance the Industry Towards the Smallest AI Drones

SAN DIEGO--ModalAI today unveiled the powerful new VOXL 2 Mini, ushering in a new era for smaller, smarter, and safer drone autopilots. The newly designed VOXL 2 Mini features autonomy, communications, and power management, providing a 30% reduction in area compared to VOXL 2, to only 42mm x 42mm. At only 11g, VOXL 2 Mini fits all the technology of the Blue UAS Framework 2.0 VOXL 2 into a delightfully compact size while still delivering an impressively large repository of autonomous AI and computing capabilities with the VOXL SDK. VOXL 2 Mini also features an industry-standard 30.5mm x 30.5mm frame mount, offering autonomy to any drone. The new VOXL 2 Mini is available to order beginning today starting from $1,169.99 at www.modalai.com/voxl-2-mini

“The response to VOXL 2 has been incredible, and we are thrilled for customers to experience the same powerful computing in an even smaller design,” said Chad Sweet, CEO and co-founder of ModalAI, Inc. “Weight is the dominant factor when designing a drone as they need to fight gravity in flight. VOXL 2 Mini is only 11 grams and delivers the same performance as its VOXL 2 predecessor in a smaller form factor, ideal for the next generation of smaller, smarter, and safer drones.”

VOXL 2 Mini is a breakthrough in autonomous UAS components

VOXL 2 Mini is powered by the same technology as VOXL 2 and features the powerful Qualcomm QRB5165 with 8 cores up to 3.091GHz, four MIPI-CSI image sensor inputs, and pre-configured accessories for WiFi, 4G/5G, and Microhard connectivity. This computing-dense autonomous stack is as small as an Oreo cookie and includes a new 5.8g VOXL ESC Mini with integrated power management system (APM) and closed- or open-loop RPM control with feedback. VOXL 2 Mini leverages the same image sensors as VOXL 2 for state-of-the-art computer vision performance. Altogether, the power from VOXL 2, coupled with extensive perception sensors and the new power management system, delivers obstacle avoidance, obstacle detection, and GPS-denied navigation.

VOXL SDK brings powerful performance to VOXL 2 Mini

The VOXL SDK supports developers looking to create, validate, test, and integrate their autonomous flight software into small UAV solutions. VOXL 2 Mini’s open software and compatibility with popular development tools such as PX4, ROS 2, OpenCV, OpenCL, and TensorFlow Lite make it an ideal development platform for industry problem solvers.

The combination of developer-friendly, open software and VOXL 2 Mini’s hardware design, with its 30.5mm x 30.5mm industry-standard mount, makes it easy to upgrade even the smallest UAV designs.
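To give a concrete sense of the developer workflow those tools enable, the sketch below runs a TensorFlow Lite model on a single camera frame grabbed with OpenCV. The model file, camera index, and uint8 input type are illustrative assumptions; on a VOXL board the frames would normally come from the VOXL SDK’s own camera services rather than a generic OpenCV capture.

# Minimal sketch: one-frame TensorFlow Lite inference with OpenCV.
# "detector.tflite" and the camera index are hypothetical placeholders.
import cv2
import numpy as np
import tflite_runtime.interpreter as tflite  # or tf.lite.Interpreter on desktop

interpreter = tflite.Interpreter(model_path="detector.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
_, height, width, _ = input_details[0]["shape"]  # model's expected input size

cap = cv2.VideoCapture(0)  # illustrative camera index
ok, frame = cap.read()
cap.release()
if ok:
    # Resize and batch the frame to match the model input (assumes a uint8 model).
    resized = cv2.resize(frame, (width, height))
    tensor = np.expand_dims(resized, axis=0).astype(np.uint8)
    interpreter.set_tensor(input_details[0]["index"], tensor)
    interpreter.invoke()
    detections = interpreter.get_tensor(output_details[0]["index"])
    print("raw output tensor shape:", detections.shape)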

About ModalAI, Inc.

ModalAI® accelerates development of smaller, smarter and safer drones and robots with SWAP-optimized Blue UAS Framework autopilots built in the U.S.A. From home and business security to retail and government applications, the company’s highly-integrated AI-powered modules empower a variety of industries to utilize aerial and ground autonomous navigation systems that communicate on 4G and 5G cellular networks.

Based in San Diego, California, ModalAI was formed by former Qualcomm Technologies, Inc. R&D leadership in 2018 and builds on their prior research and development in the drone and robotic markets. ModalAI’s VOXL product line helps manufacturers and independent builders get to market quickly and affordably. For more information, visit www.modalai.com.

Contacts

Lauren Young
lauren.young@modalai.com

Innervating Smaller, Smarter, Safer…and Bluer Robots and Drones

ModalAI and Doodle Labs recently announced the compatibility of the Helix Smart Radio and the VOXL 2 autopilot. This means developers can now leverage the Blue UAS Framework-approved combined tech of both companies to fully innervate their drones and robots. The DIU Blue UAS Framework focuses on best-in-class, secure-supply-chain, interoperable UAS components and software. This year, DIU publicly launched the Blue UAS Framework 2.0 effort.