For much of the relatively short history of civil and commercial drone flight, safety protocols have been fairly straightforward and easy to implement. Most operations are mandated to take place within the pilot’s line of sight, which enables potential collision hazards and obstacles to be spotted and reacted to in a timely manner. These hazards include other drones, manned aircraft, birds, and non-flying obstacles such as trees, buildings and power lines.
However, as the industry has grown, two emerging factors have meant that we are pushing up against the limits of this way of operating. The first is the raw increase in numbers – drone technology is more accessible and popular than ever, and the number of aircraft in the skies continues to increase year after year.
The second is the need to move beyond the current VLOS (visual line of sight) paradigm. In order to make commercial drone operations more scalable, and thus more profitable, companies need to be free to fly BVLOS (beyond visual line of sight). While this will enhance almost every sector of the drone industry, from industrial inspection through to cargo delivery, it will also require a considerable leap in the complexity of safety equipment and procedures.
Going beyond traditional collision awareness
Many manned aircraft use technologies such as ADS-B (automatic dependent surveillance–broadcast) and TCAS (traffic collision avoidance system) to broadcast information including position, heading and identification that can be picked up and used to reduce the risk of mid-air collisions. These cooperative solutions work especially well at higher altitudes, where there are fewer obstacles. Aircraft operating at these altitudes are also typically required by law to utilize them.
However, most commercial drones work at very low altitudes, where there are not only more obstacles, but also smaller aircraft that are not mandated to use one of the aforementioned situational awareness technologies. Combined with the lack of onboard pilot to manually observe and react to potential issues, this makes the situation more complicated for BVLOS drones.
One method of mitigating risk is to use either ground-based human observers or radar systems along the drone’s flight path. However, for truly long-distance drone operations, the installation of these solutions at the level required to provide complete coverage is clearly not practical.
For these reasons, specialized detect-and-avoid (DAA) solutions, also known as sense-and-avoid (SAA), are being developed.
Different approaches to detect-and-avoid
A DAA system uses onboard sensors to gather information about the environment surrounding a drone, and to spot potential hazards in a timely manner. This information is processed and may either be sent back to the drone’s remote pilot so that a manual evasive maneuver or course correction can be performed, or used to issue commands to the autopilot so that the situation can be resolved autonomously. The detection function can potentially be carried out by a number of imaging technologies.
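The two response paths described above — alerting the remote pilot versus commanding the autopilot directly — can be sketched as a simple decision loop. This is a minimal illustration only: the class names, thresholds and time-to-conflict logic are assumptions for the example, not any particular DAA product's behavior.

```python
from dataclasses import dataclass

# Hypothetical simplified DAA decision loop. Thresholds and names are
# illustrative assumptions, not a real system's logic.

@dataclass
class Track:
    range_m: float           # distance to detected object (metres)
    closing_speed_ms: float  # positive = converging, negative = diverging

def time_to_conflict(track: Track) -> float:
    """Rough time until the object reaches the drone, in seconds."""
    if track.closing_speed_ms <= 0:
        return float("inf")  # diverging track: no conflict
    return track.range_m / track.closing_speed_ms

def daa_decision(track: Track, alert_s: float = 30.0,
                 avoid_s: float = 10.0) -> str:
    """Return 'monitor', 'alert_pilot' or 'autonomous_avoid'."""
    ttc = time_to_conflict(track)
    if ttc < avoid_s:
        return "autonomous_avoid"  # too late to keep a human in the loop
    if ttc < alert_s:
        return "alert_pilot"       # hand off to the remote pilot
    return "monitor"
```

The key design point is the split between the two response paths: with ample warning time the remote pilot can be kept in the loop, but below some minimum reaction window the system must act autonomously.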
Onboard radar sensors in the K-band can provide detection at long ranges. However, radar systems are quite bulky, and require significant power for reliable detection at greater distances. They are thus likely to have too high a SWaP (size, weight and power) footprint for many small commercial UAVs, but may be suitable for larger drones.
LiDAR scanners provide greater resolution than radar, and allow very accurate measurement of the distance and speed of other objects in the vicinity of the drone. However, they have a relatively narrow field of view, meaning that more scanner units will be required to provide complete coverage. As they are also quite large and heavy, this also makes them unsuitable for drones with low SWaP budgets. LiDAR scanners are also relatively expensive.
Electro-optical cameras, backed by image processing, provide a third option. They are the lightest and most compact option, and can provide good accuracy and detection ranges, especially at low speeds. Many UAV platforms are already equipped with an onboard camera with sufficient resolution to handle detection. However, the performance of these sensors can be significantly affected by weather conditions and low light, and they are essentially useless for night-time operations.
The amount of coverage required for a DAA system depends on the jurisdiction – some require complete 360-degree coverage, while others only mandate that the aircraft must monitor in the forward direction. Many civil aviation authorities have used standards published by ASTM International and other bodies to determine acceptable separation distances between aircraft.
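A separation requirement is typically evaluated by projecting both aircraft forward and checking the closest point of approach (CPA). The sketch below computes the horizontal CPA between two constant-velocity 2-D tracks; the coordinates, velocities and any separation threshold you would compare against are illustrative assumptions, not values from any published standard.

```python
import math

def cpa_distance(p1, v1, p2, v2):
    """Closest point of approach between two constant-velocity 2-D tracks.
    p1, p2: (x, y) positions in metres; v1, v2: (vx, vy) velocities in m/s.
    Returns (min_distance_m, time_of_cpa_s)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
    dv2 = dvx * dvx + dvy * dvy
    if dv2 == 0:                  # identical velocities: separation is constant
        return math.hypot(dx, dy), 0.0
    # Time minimizing separation, clamped to look only forward in time
    t = max(0.0, -(dx * dvx + dy * dvy) / dv2)
    cx, cy = dx + dvx * t, dy + dvy * t
    return math.hypot(cx, cy), t

# Example: two head-on tracks 1 km apart with 50 m of lateral offset
dist, t = cpa_distance((0, 0), (20, 0), (1000, 50), (-20, 0))
```

Here the tracks pass 50 m apart after 25 seconds; a DAA system would compare that CPA distance against the separation minimum mandated in its jurisdiction.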
Sensor and data fusion for versatile DAA
DAA systems may thus have to be optimized for specific drone operations and environments in order to account for the advantages and disadvantages of different methodologies. They may also use fusion algorithms to combine information from multiple sensors and sources, in conjunction with sophisticated AI (artificial intelligence) and machine learning techniques.
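One of the simplest fusion techniques such a system might use is inverse-variance weighting, which combines independent measurements of the same quantity so that more trustworthy sensors count for more. The sensor values and variances below are invented for illustration; real fusion stacks are considerably more sophisticated (e.g. Kalman filters over full state vectors).

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of independent range estimates.
    estimates: list of (value, variance) pairs from different sensors.
    Returns (fused_value, fused_variance)."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for (v, _), w in zip(estimates, weights)) / total
    return value, 1.0 / total

# e.g. radar: 102 m (variance 25), LiDAR: 100 m (variance 1),
# camera: 95 m (variance 100) -- all values are illustrative
fused, var = fuse_estimates([(102.0, 25.0), (100.0, 1.0), (95.0, 100.0)])
```

The fused estimate sits closest to the low-variance LiDAR reading, and its variance is lower than that of any single sensor — the basic reason fusion pays off.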
A complete DAA system would also make use of ADS-B and TCAS data from cooperative aircraft, as well as information from a UTM (unmanned traffic management) supplier. UTM is a concept that is currently in its infancy, but is likely to become a crucial part of integrated drone operations in the near future. It describes a suite of services that will be used to manage drone operations at scale in low-altitude airspace, in order to maintain safe separation of all manned and unmanned aircraft. These services will include automated flight authorizations, strategic deconfliction, and real-time notifications concerning weather, airspace restrictions, emergencies and other crucial information.
The United States Federal Aviation Administration (FAA) is currently developing a new airborne collision avoidance system for all aircraft that is designed to replace the existing TCAS system. ACAS X will collate information from a variety of sources, including both cooperative transponder data and non-cooperative sensors, and use this information to build a probabilistic model of the aircraft and its position within the environment. This model can then be used to determine the best course of action for any particular situation.
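The core idea — scoring each candidate action against a probability distribution over what the intruder might do, then picking the cheapest — can be illustrated with a toy expected-cost calculation. To be clear, the states, advisories and penalty values below are invented for this sketch and bear no relation to the actual ACAS X logic tables.

```python
# Toy illustration of cost-minimizing action selection over a belief
# distribution, in the spirit of ACAS X's probabilistic approach.
# All states, actions and costs are invented for illustration only.

def best_action(belief, actions, cost):
    """belief: {state: probability}; cost(action, state) -> penalty.
    Return the action with the lowest expected cost."""
    def expected_cost(a):
        return sum(p * cost(a, s) for s, p in belief.items())
    return min(actions, key=expected_cost)

# Belief over whether an intruder will climb, stay level or descend
belief = {"climb": 0.2, "level": 0.5, "descend": 0.3}

# Penalize advisories that steer our aircraft toward the intruder's
# predicted vertical behaviour (hypothetical penalty values)
penalties = {
    ("climb",   "climb"): 100, ("climb",   "level"): 10, ("climb",   "descend"): 0,
    ("hold",    "climb"):  20, ("hold",    "level"): 50, ("hold",    "descend"): 20,
    ("descend", "climb"):   0, ("descend", "level"): 10, ("descend", "descend"): 100,
}

advisory = best_action(belief, ["climb", "hold", "descend"],
                       lambda a, s: penalties[(a, s)])
```

With this belief, climbing has the lowest expected penalty, so it would be issued as the advisory — the same "probabilistic model in, best course of action out" shape described above, at vastly reduced fidelity.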
ACAS X has already been shown to deliver significant safety improvements over TCAS, and unlike TCAS it is designed for use with unmanned aerial systems as well as manned aviation. Two variants of ACAS X will deal specifically with the requirements of UAS – ACAS Xu for larger fixed-wing drones, and ACAS sXu for smaller platforms.
Robust communications for certifiable DAA solutions
Elsight has recently partnered with Sagetech to provide a robust CNPC (command and non-payload communications) link for a new range of ACAS X-based DAA systems. The Halo connectivity platform has been integrated into Sagetech’s products in order to provide a foolproof always-on connection that informs the remote pilot of alerts and guidance information produced by the ACAS X system, and also transmits the pilot’s control inputs back up to the aircraft.
Halo is a network-agnostic cellular connectivity solution that can harness up to four unique connections from multiple carriers, using powerful AI-based cellular bonding technology to provide reliability and redundancy for safety-critical UAV applications. The combination of Halo with these ACAS X-based DAA systems creates a highly reliable and low-SWaP solution that is ideal for manufacturers seeking to gain airworthiness and type certification for their BVLOS drone platforms.
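Conceptually, bonding multiple cellular links means continuously assessing each link's health and spreading safety-critical traffic across the best of them for redundancy. The sketch below shows one naive policy of this kind; Halo's actual bonding algorithm is proprietary, and the carriers, metrics and thresholds here are purely illustrative assumptions.

```python
from dataclasses import dataclass

# Illustrative sketch of redundant transmission over multiple cellular
# links. This is NOT Halo's algorithm; metrics and policy are invented.

@dataclass
class Link:
    carrier: str
    latency_ms: float
    loss_rate: float  # fraction of packets lost, 0.0 - 1.0

def pick_links(links, max_loss=0.05, copies=2):
    """Choose up to `copies` healthy links, lowest latency first, so
    safety-critical packets can be duplicated across carriers."""
    healthy = [l for l in links if l.loss_rate <= max_loss]
    healthy.sort(key=lambda l: l.latency_ms)
    return healthy[:copies]

links = [
    Link("carrier_a", 45.0, 0.01),
    Link("carrier_b", 30.0, 0.12),  # lowest latency, but too lossy
    Link("carrier_c", 60.0, 0.02),
    Link("carrier_d", 35.0, 0.00),
]
chosen = pick_links(links)
```

Duplicating each CNPC packet over two independent carriers means a single network outage cannot sever the link between pilot and aircraft — the redundancy property that certification authorities look for in a BVLOS communications solution.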
To find out more about how Halo could be the ideal fit for your safety-critical UAV platforms or equipment, please get in touch.