The Bigger Picture: A Helicopter Crash Investigation with Drones
Written by Eloise McMinn Mitchell   


This article appears in the September-October 2021 issue of Evidence Technology Magazine.

CRASH SCENE INVESTIGATIONS are a complicated matter. Finding out what happened at the scene of an accident requires careful analysis of all the factors around the site. This can be even more difficult in situations where there are no eyewitness accounts to fill in the gaps of what occurred.

This was exactly the case when a helicopter crashed near Pioche, Nevada. The helicopter had gone down in an isolated, unpopulated area several hours' drive from Las Vegas. The crash site was mountainous and covered in shrubbery and desert plant life, making it even more difficult to access and assess. The decision was made to use UAVs, or drones, to get the bigger picture.

This image demonstrates the difficult terrain and remote nature of the crash site. Image courtesy Sundance Media Group.

Drones have rapidly become a proven asset in accident investigations thanks to their flexibility. Images captured by drones can be used for professional reconstructions of a scene, creating a 3D digital twin. This can then be analyzed, with screenshots used to identify debris and marks made by the crashing helicopter, and to measure distances. In a case like this, where access is limited, drones are ideal for recreating the scene without missing any details, as the model can be examined from multiple angles without anyone needing to scramble over rocks. In addition, with today’s cameras, the ground distance represented by a single pixel (known as the Ground Sample Distance, or GSD) can be less than 1 centimeter, meaning that an entirely image-based reconstruction does not compromise on the accuracy of capturing the site. It is a digital twin that faithfully captures the scene.
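The GSD mentioned above follows from simple camera geometry. The sketch below uses the standard photogrammetric formula; the sensor values in the example are hypothetical illustrations, not the specifications of the camera flown on this project:

```python
def ground_sample_distance_cm(sensor_width_mm, image_width_px,
                              focal_length_mm, altitude_m):
    """Ground distance covered by one pixel, in centimeters.

    Standard formula: GSD = (sensor width x flight altitude)
                            / (focal length x image width).
    """
    return (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)

# Hypothetical example: a 1-inch-class sensor (13.2 mm wide, 5472 px
# across) with a 10.6 mm lens flown at 30 m gives a sub-centimeter GSD.
print(round(ground_sample_distance_cm(13.2, 5472, 10.6, 30), 2))  # 0.68
```

Flying lower or using a longer lens shrinks the GSD further, at the cost of covering less ground per image.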

So Sundance Media Group (SMG), which has experience investigating accidents with drones, was called to this helicopter crash to analyze what happened and how, with the goal of reporting back to the owners and insurers.

Gathering Data from the Scene

To create an accurate 3D model with drone imagery, the data collection needs to be done in a way that will render correct results. First, the SMG team needed to access the site. They drove three hours from Las Vegas to the crash area and began assessing the site. The debris from the crash was scattered across the mountainside at an altitude of 2,054 meters (6,741 feet). Reaching it was only possible with a four-wheel-drive vehicle, and even that could not access the entire location. This reinforced the need for a portable drone, since even getting to the site was difficult, let alone transporting specialized equipment.

The team of three people climbed to the highest point of the crash site to ensure they had a clear view of the whole area. At a basic level, their workflow looked something like this:

  1. Assess the scene
  2. Set up Ground Control Points (GCPs) for the site
  3. Plan the flights
  4. Fly

A GCP can be identified from above and then processed in photogrammetry software to ensure coordinate accuracy in the final results. Photo courtesy Sundance Media Group.

Once they had set up an area where they could take off, the team laid out seven GCP disks. GCPs are a critical part of ensuring accuracy in drone mapping. GCPs are points on the ground for which the team knows the exact coordinates. These pinpointed locations are used as a reference in processing to ensure that the overall 3D model is accurate and geolocated. This provides a credible resource for forensic investigations that can even be used in court.

To prepare the flight plans, the team used pre-loaded Google Earth overhead images to get an idea of the layout of the site. Creating a flight plan correctly means that the images captured will have enough overlap between them — which is vital for an accurate reconstruction without distortions. They were mapping 70 acres (about 28 hectares) and used two Autel EVO II Pro drones. These are rotary drones, meaning they have rotor blades like a helicopter. This type of drone has a shorter flight time than a fixed-wing drone (with wings like an airplane), yet it is more maneuverable and can take off more easily as it has vertical take-off. The Autel also has 360-degree obstacle avoidance, which is very helpful in avoiding breaking expensive equipment.
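The overlap requirement mentioned above comes down to simple arithmetic: the higher the required overlap between consecutive images, the closer together the camera must trigger. A minimal sketch of that relationship, with the footprint and overlap values chosen purely as illustrations rather than SMG's actual flight settings:

```python
def trigger_spacing_m(footprint_m, overlap_pct):
    """Distance between consecutive photo centers for a given
    forward (or side) overlap percentage."""
    return footprint_m * (1.0 - overlap_pct / 100.0)

# Illustrative: if each image covers 40 m of ground along-track,
# 75% forward overlap means triggering a photo every 10 m.
print(trigger_spacing_m(40, 75))  # 10.0
```

Flight-planning apps typically perform this calculation automatically from the chosen altitude and overlap, but it shows why a well-planned grid leaves no gaps in coverage.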


The drone pilot used a tablet and a real-time kinematic (RTK) device to ensure geolocational accuracy and to oversee the imagery collected by the drone. Photo courtesy Sundance Media Group.

One drone flew a north/south corridor flight, and the other flew an east/west corridor. After the long, straight corridor flights were completed, the team flew an additional 45 minutes to capture precise details. The total flight time was 2.5 hours, kept as short as possible: as time passes, the sun moves and the shadows on the land shift, which can complicate data processing if the light quality or position changes significantly. This is another reason effective flight planning is key. When every detail matters in an investigation, mitigating complications like this is critical to getting the best results.

Maintaining safety and following drone regulations is also crucial, as ignoring these rules can create risks for other people or aircraft in the area. For this reason, the drones were fitted with specialized FoxFury D3060 lights to make sure the pilots could always see them. The pilots also maintained a visual line of sight (VLOS) with the drones at all times to ensure there was no chance of flying into an unexpected obstacle. Today, most drones have sensors and systems to help avoid crashes and make the pilots’ lives easier. This improved functionality enhances their value as a tool for forensic investigators, as it cuts down on the amount of training required.

Processing the Scene

Once the data was collected, it was processed in specialized desktop software called PIX4Dmapper that is designed for photogrammetry — the science of measuring from images. In the software, the team could generate a flat 2D map of the scene, also known as an orthomosaic, as well as 3D models or point clouds. The SMG team gathered over 6,000 images of the site, which generated 13 million 3D points in a point cloud — all of which were accurately geolocated thanks to using GCPs. The processing in the software took place over 12 hours, rendering a high-quality, detailed point cloud.


The 3D model viewed in the photogrammetry software (PIX4Dmapper); the green and blue rayCloud lets users inspect individual photos as well as the 3D model at will. Image courtesy Sundance Media Group.

Processing in photogrammetry software also meant the team could use the outputs in more specialized tools, running a computer-aided search of the scene for debris and objects by color. This was especially useful here because the crashed helicopter had recently been wrapped in grey vinyl for display at a trade show. That color was tough to spot on the scene itself but much easier to find in the 3D model.
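The color-based search can be pictured as filtering points by their distance from a target color in RGB space. The snippet below is an illustrative sketch of that idea using NumPy, not the actual tool SMG used; the grey target value and the tolerance are assumptions:

```python
import numpy as np

def color_matches(points_rgb, target_rgb, tolerance=30.0):
    """Return indices of points whose RGB color lies within a
    Euclidean distance `tolerance` of the target color."""
    diff = np.asarray(points_rgb, dtype=float) - np.asarray(target_rgb, dtype=float)
    return np.where(np.linalg.norm(diff, axis=1) <= tolerance)[0]

# Toy point cloud: two grey-ish points among vegetation green and rock brown.
cloud = [(128, 128, 130), (40, 90, 35), (120, 125, 128), (140, 110, 80)]
print(color_matches(cloud, target_rgb=(128, 128, 128)))  # [0 2]
```

In practice the search would run over millions of colorized 3D points, flagging candidate debris locations for a human to review.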

Mapping the Details

The overall impact of this project was that it provided an accurate resource both for investigating the helicopter crash and for collecting the debris. The GSD was between 0.2 cm and 0.4 cm, which is remarkably precise. That meant investigators could examine points on the ground or patches of shrubbery in detail to check for every clue about the impact of the crash. The investigators also received a 3.5-GB GeoTIFF, viewable in Google Earth, from SMG to use as a general overview resource alongside the detailed models produced by the photogrammetry software.


Identifying debris is possible with high-resolution imagery, meaning users can zoom in to look for details. Photo courtesy Sundance Media Group.

Hours, and possibly days, of work were saved by using drones in this investigation, as the proven accuracy of the model meant investigators did not have to walk all over the mountainside to find out what happened. Debris found with the color-filtration search tools included pieces of the fuselage as small as 2.5 cm (1 inch) long, as well as stickers, toolboxes, and oil cans. The accurate, shareable digital twin gave investigators an understanding of both the cause of the crash (the helicopter caught a cable that wrapped around the tail rotor and tail boom) and the spread of the crash site. Locating missing parts of the fuselage also helped fill in the gaps of what happened.
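The link between GSD and the smallest findable debris is direct: an object's size divided by the GSD gives the number of pixels it spans in the imagery. A quick check with the figures reported above:

```python
def pixels_spanned(object_size_cm, gsd_cm):
    """How many pixels an object of a given size covers at a given GSD."""
    return object_size_cm / gsd_cm

# A 2.5 cm fuselage fragment at the reported 0.2-0.4 cm GSD range:
print(pixels_spanned(2.5, 0.4))  # 6.25 pixels at the coarsest resolution
print(pixels_spanned(2.5, 0.2))  # 12.5 pixels at the finest
```

Even at the coarser end of the range, a 2.5 cm fragment covers several pixels, which is why such small debris remained detectable in the model.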

Overall, the use of drones in this project made work easier for investigators and inspectors alike. The specialized skills of the SMG team ensured that data collection was successful and that a useful resource, an exact replica of the scene on the day it was flown, was provided to the relevant stakeholders, who could not miss a detail.

Note: This story was originally shared by SMG and can be found on their website.

About the Author

Eloise McMinn Mitchell is the content writer at Pix4D, working on sharing and creating content about using drones and photogrammetry in all industries ranging from public safety to surveying to agriculture. Her background is in journalism and social media management, with a degree from the University of York. Pix4D is a leading provider of specialized photogrammetry tools and is based in Switzerland with six offices around the world, including the USA, China, and Japan.
