UAV‐assisted real‐time evidence detection in outdoor crime scene investigations

Abstract

Nowadays, a plethora of unmanned aerial vehicle (UAV) designs that vary significantly in size, shape, operating flight altitude, and flight range have been developed to provide multidimensional capabilities across a wide range of military and civil applications. In the field of forensic and police applications, drones are increasingly being used instead of helicopters to assist field officers in searching for vulnerable missing persons or targeting criminals in crime hotspots, as well as to provide high-quality data for the documentation and reconstruction of the forensic scene or to facilitate evidence detection. This paper examines the contribution of UAVs to real-time evidence detection in outdoor crime scene investigations. The project innovates by providing a quantitative comparative analysis of UAV-based and traditional search methods through the simulation of a crime scene investigation for evidence detection. The first experimental phase tested the usefulness of UAVs as a forensic detection tool by posing the dilemma of humans versus drones. The second phase examined the ability of the drone to reproduce the obtained performance results in different terrains, while the third phase tested the accuracy of detection by subjecting the drone-recorded videos to computer vision techniques. The experimental results indicate that drone deployment in evidence detection can provide increased accuracy and speed of detection over a range of terrain types. Additionally, it was found that real-time object detection based on computer vision techniques could be the key enabler of drone-based investigations if interoperability between drones and these techniques is achieved.


| Study's scope
The present paper aims to examine the contribution of UAV technology in evidence detection in outdoor crime scene investigations.
Specifically, the study tested the efficacy of drones in real-time object detection at a simulated outdoor crime scene, in a case where humans may fail. The efficacy in terms of accuracy and speed of detection was examined by directly comparing the drone's performance results with those obtained from a field team. It should be noted that when referring to UAV-assisted real-time object detection in this study, this means that the detection was performed solely by the drone operator and only by watching the drone-based live video feed as it was displayed on the mobile device's screen. In addition, the study tested whether the performance results acquired by the drone deployment can be reproduced in different terrains and whether computer vision techniques can enhance the drone detection capabilities.
In addition, recent research [2, 32–39] has underlined the usefulness of UAVs as a forensic detection tool and as a source of high-quality data for the documentation and reconstruction of the forensic scene. The existing literature focuses mainly on the application of UAV-based aerial photography for the detection of clandestine burials in the field of forensic archeology [33, 34, 37, 40] or for documentation purposes in crime scene or accident investigations [32, 35, 36, 38, 39]. Going one step further, Rocke et al. [41] added a Geoforensic Search Strategy (GSS) perspective to drone deployment, assessing the likelihood of detecting a buried target from the observation of general ground conditions using advances in remotely sensed aerial imagery.
The recent literature also emphasizes real-time object and/or human detection and tracking from UAV-sourced photography and videography, based on image processing and computer vision techniques, albeit not from a forensic perspective (as, for example, in [42–44]), or highlights the usefulness of remote sensing in forensic investigations (as, for example, in [45–48]).
It should be noted that the present research project innovates by quantifying the effectiveness of drones through a direct comparison with humans in a simulated crime scene investigation for evidence detection. Only Urbanova et al. [39] attempted to investigate drone capabilities, and only by relying on drone-recorded videos (i.e., "passive real-time viewing"), since technical issues related to Wi-Fi connectivity prevented them from examining the potential of drones for real-time survey, while Sharma et al. [38] presented a qualitative comparative analysis of UAVs and traditional search methods. Both studies described the contribution of UAVs to evidence detection in crime scene investigations as beneficial.

| Design of experiment
The study consisted of three experimental phases. The first phase tested the usefulness of UAVs as a forensic detection tool by posing the dilemma of humans versus drones; both the drone operator, watching the live video from the drone, and the field team had to detect as many items as possible in the shortest possible time. For that purpose, randomly selected objects were scattered over gradually increasing areas per scenario executed. Accuracy, measured as the success rate of detection, was the primary performance criterion, while speed, measured as the time required for a full scan of the area of interest, was the secondary objective. During the first phase, 16 scenarios were implemented, 4 of which focused on items that the field team had difficulty detecting.
The second phase examined the ability of a drone to reproduce the obtained performance results in different terrains; a total of 4 already implemented scenarios were conducted in two new terrains, which differed in color and morphological characteristics.
Lastly, the third phase tested the accuracy of detection by subjecting the drone-recorded videos to computer vision techniques. For that purpose, the analysis was based on the videos acquired from the first phase, while some additional scenarios were carried out in order to investigate the software-enhanced detection capabilities even when the drone flew faster and the objects were smaller.
In summary, the second and third phases tested the following hypotheses:
• UAVs can offer reliable detection capabilities over a range of terrain/vegetation types.
• Computer vision techniques can enhance the drone's detection capabilities.
It should be highlighted that this phase does not concern real-time detection, since it was neither feasible to obtain the IP address of the drone camera nor to incorporate image processing tools into the drone's software.
Phases I and III were implemented in May 2019 during morning or afternoon hours. The weather was mostly sunny or partly cloudy with winds up to 18 kph, and therefore, the drone operation was not affected since DJI SPARK™ can withstand wind speeds of up to 28 kph [49]. Phase II was conducted in June 2019 under similar weather and daylight conditions.
Each round (i.e., scenario) of the experiment was prepared by scattering the items in the area of interest. Both the field team and the drone operator waited in the adjacent parking site without a direct view of the sports pitch, in order to avoid prior knowledge of the objects' positions. In addition, both the field team and the drone operator were aware of the scanning patterns to be followed before entering the scene, but neither knew the number of objects included in each scenario.
Timing started when the field team or drone entered the scene and stopped when it left the scene. The time required to upload the CSV files containing the drone's flight plan through the Litchi website (as mentioned in Section 2.6) was not counted, since this process takes only 1-2 min and the preparation can be done prior to arrival on scene, as was the case here. Lastly, it should be noted that when a searcher of the field team detected an object, he simply had to raise his hand and continue the search without stopping to collect it.

| Study area
The experiment was conducted in the approved areas for outdoor drone flying of the Defence Academy of the United Kingdom in Shrivenham after receiving permission from the Defence Academy Site Security. All study areas were free of spatial constraints, such as trees, which may impede the drone's accessibility capabilities or limit visibility due to vegetation cover, while 4G signal and/or Wi-Fi networks were available at all times.
Specifically, the flights for Phases I and III were undertaken at a sports pitch, which was covered with dense, green, and short grass (approximately 5-6 cm tall). The field had a well-groomed appearance characterized by a smooth and even cut, without a remarkable amount of grass clippings from lawn mowing or dead grass spots. The extent of the area used to imple-

| Objects
The objects used in this study were 2 mm thick foam pads, chosen to avoid direct detection by the field team from a distance, since the flat surface of the first phase's study area is not the common case; in real life, uneven surfaces and/or the presence of natural or artificial barriers can limit visibility.
The shape of the items was decided to be square for reasons of convenience, while the size was determined after running some trials with the field team in order to define the threshold below which humans might have difficulty in detection; in this way, it was possible to test whether drone deployment could effectively contribute to evidence detection. Accordingly, 5 cm × 5 cm was determined to be the appropriate size for the study.
In addition, it was decided that the objects would be randomly selected from 8 predefined colors: red, blue, and yellow (primary); green, purple, and orange (secondary); plus black and white. Each color corresponded to 10 objects out of the total 80 used in the multi-colored scenarios of Phase I (i.e., 12.5%). It should be highlighted that only the color that the field team had difficulty detecting (i.e., black) was used for the last 4 single-colored scenarios of the first phase. Lastly, the number of items used per multi-colored scenario of Phase I was randomly selected, ranging between 5 and 10, while each single-colored scenario contained 10 (black) objects.

| Field team
The field team consisted of two military officers with accumulated experience in Counter-IED activities and aircraft accident investigations.

| Aircraft and payloads
The unmanned aerial vehicle used in the experimental phase was a DJI SPARK™, which is a low-cost drone that incorporates all the

| Command and control element and communication data link
The SPARK™ remote controller was paired with the drone, and by using its advanced Wi-Fi signal transmission system, it was possible to operate both the aircraft and the gimbal camera at a maximum distance of 2 km [49]. In addition, the controller was attached and wirelessly connected to a Samsung Galaxy S8+ mobile phone, which was used to display the live video stream.

| Launch and recovery element
The equipment needed to take off and land the DJI SPARK™ was a circular launch pad, as the drone can ascend and descend vertically.

| Human element
The drone operator was a Research Fellow in Imaging and Autonomous Systems at the Centre for Electronic Warfare, Information and Cyber of Cranfield University, and had previously performed relevant tasks in similar research projects.

| Microsoft (MS) Excel
Waypoints for the full scan of the area of interest were generated automatically by a custom, in-house developed MS Excel spreadsheet using VBA macros for spherical geometry calculations.
Moreover, MS Excel was used to randomly select the number and the color of the objects used in each scenario, as well as their position in the study area.
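As a rough illustration of such waypoint generation (not the study's actual VBA implementation), the sketch below lays out a boustrophedon strip pattern over a rectangular area; the origin coordinates, track separation, and the flat-earth metres-to-degrees conversion are illustrative assumptions.

```python
import math

def strip_waypoints(origin_lat, origin_lon, width_m, height_m, track_sep_m):
    """Generate a boustrophedon (strip-search) waypoint list over a
    width_m x height_m rectangle anchored at its south-west corner.
    Metre offsets are converted to degrees with a local flat-earth
    approximation standing in for the spreadsheet's spherical geometry."""
    m_per_deg_lat = 111_320.0                          # approx. metres per degree of latitude
    m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(origin_lat))
    waypoints = []
    n_tracks = int(width_m // track_sep_m) + 1
    for i in range(n_tracks):
        x = i * track_sep_m                            # eastward offset of this track
        ys = (0.0, height_m) if i % 2 == 0 else (height_m, 0.0)  # alternate direction
        for y in ys:
            waypoints.append((origin_lat + y / m_per_deg_lat,
                              origin_lon + x / m_per_deg_lon))
    return waypoints

# Example: a 30 m x 50 m pitch scanned with 5 m track separation.
wps = strip_waypoints(51.6, -1.64, 30.0, 50.0, 5.0)
print(len(wps), "waypoints")
```

Halving the track separation doubles the number of tracks (and roughly the flight time) while increasing the overlap between adjacent strips.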

| Litchi
The Litchi website was used to upload the flight plan (as CSV files created in MS Excel) and to save it so that it would be available in the phone application via the cloud, while the Litchi Android application was used to fly DJI drones such as the SPARK™ autonomously.
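Exporting a waypoint list to such a CSV file can be sketched as follows; note that the exact column schema Litchi expects should be taken from its Mission Hub documentation, so the minimal three-column header used here is an assumption for illustration only.

```python
import csv

def write_flight_plan(path, waypoints, altitude_m=10.0):
    """Write (lat, lon) waypoints to a CSV flight plan.
    The three-column header is an illustrative assumption; real Litchi
    missions use a richer schema with heading, curve size, gimbal, etc."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["latitude", "longitude", "altitude(m)"])
        for lat, lon in waypoints:
            writer.writerow([f"{lat:.7f}", f"{lon:.7f}", altitude_m])

write_flight_plan("mission.csv", [(51.6, -1.64), (51.6005, -1.64)])
```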

| DJI GO 4 android application
DJI drones cannot take off in Authorization Zones (e.g., military zones), such as the Defence Academy of the UK, and users are required to unlock the flight restriction through their DJI-verified account [50]. Hence, the DJI GO 4 application was used to obtain permission to operate in the no-fly zone of the study area.

| MATLAB (version R2019a)
A color detection algorithm was created to recognize colors in the drone-sourced videos and thus to facilitate object detection.
The MATLAB code was run on the pre-recorded videos, since neither the IP address of the drone camera could be obtained nor could the drone itself execute the script, which prevented a real-time implementation of the algorithm. The in-house developed algorithm was able to identify red, green, blue, and white; orange was recognized in the case of simultaneous detection of red and yellow, while purple and black could be detected by increasing the sensitivity of blue color identification.
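The decision rules described above can be sketched as follows. This Python version only illustrates the logic (the actual algorithm was written in MATLAB), and all RGB threshold values are assumptions; the rule structure mirrors the description: orange is reported when the red and yellow detectors fire together, and a more sensitive blue test flags purple/black candidates.

```python
def classify_pixel(r, g, b):
    """Classify one RGB pixel (0-255 per channel) with simple threshold
    rules; all threshold values are illustrative assumptions."""
    red    = r > 200 and g < 200 and b < 100
    yellow = r > 200 and g > 130 and b < 100
    green  = g > 150 and r < 100 and b < 100
    blue   = b > 150 and r < 100 and g < 100
    white  = r > 200 and g > 200 and b > 200
    # A deliberately more sensitive blue test flags dark bluish tones.
    sensitive_blue = b > 60 and r < 150 and g < 100
    if red and yellow:
        return "orange"            # orange = red and yellow fire simultaneously
    if red:
        return "red"
    if yellow:
        return "yellow"
    if green:
        return "green"
    if blue:
        return "blue"
    if white:
        return "white"
    if sensitive_blue:
        return "purple/black candidate"
    return "unclassified"

print(classify_pixel(255, 165, 0))   # an orange-ish pixel trips both red and yellow
```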

| Implemented scenarios
During the first part of Phase I, randomly selected objects (as referred to in Section 2.3) were scattered over gradually increasing areas of the sports pitch (as mentioned in Section 2.2). The study area

| Search patterns
The field team decided to divide the area in half (Zone A and B).
Then, the searchers dealt with each zone individually by following a strip search pattern; after conducting a detailed search of their zones, they switched halves in order to ensure that the area would be double-checked (Figure 1).
The drone's search pattern was based on autonomous flight operation by following a path of predetermined waypoints, as shown in

| Implementation details
This section provides information about the sequence of the experiment steps and some parameters that were not included in the design of the experiment. Specifically:
• Phases I and III were implemented in May 2019 during morning or afternoon hours. The weather was mostly sunny or partly cloudy with winds up to 18 kph, and therefore, the drone operation was not affected, since the DJI SPARK™ can withstand wind speeds of up to 28 kph [49]. It should be noted that the daylight created minimal shadowing effects, as the objects were of "minimum" thickness (i.e., 2 mm, as mentioned in Section 2.3). Phase II was conducted in June 2019 under similar weather and daylight conditions.
• As each round (i.e., scenario) of the experiment was prepared by scattering the items in the area of interest, both the field team and the drone operator were in the adjacent parking site without having direct view to the sports pitch in order to avoid having prior knowledge of the objects' position.
• Both the field team and the drone operator were aware of the scanning patterns to be followed (as analyzed in Section 2.8) before entering the scene.
• When a searcher of the field team detected an object, he simply had to raise his hand and continue the search without having to stop to collect it.
• The success rate and the time required to fully search the area of interest were recorded for both the field team and the drone.
• For the field team, the number of objects identified per searcher was recorded, as well as the number of objects detected during the first scan of the area (i.e., before the searchers switched zones, as mentioned in Section 2.8).
• For the drone, the operator was asked to identify the color of the items, while the number of objects detected twice due to overlapping was also recorded.
• The first round of experiments was conducted by the field team.
Before these scenarios, the field team ran some trials in order to determine the size of objects that humans might have difficulty detecting, as explained in Section 2.3.
• After running the 12 multi-colored scenarios of Phase I, it was found that the field team had difficulty detecting black objects. Therefore, 4 additional scenarios containing black-colored items were then carried out. The field team and the drone performed exactly the same scenarios.

| Phase I
During this phase, the field team detected 70 out of the total 80 objects, achieving 87.5% accuracy, while the drone missed only one object, achieving a success rate of 99%, as shown in Table 1 and Figure 3. In addition, with regard to the speed of detection, the field team was faster by 10.7% in relatively small areas (up to 30 m × 45 m), while the drone was faster by 12.1% in areas larger than 30 m × 50 m, as shown in Table 1 and Figure 4.

FIGURE 4 Time for search for the field team and the drone
The efficiency of detection is presented in Figure 5 as a combined view of the accuracy and time for detection. Specifically, in this scatter chart, the success rate is plotted against the search time for each multi-colored scenario of Phase I for both the field team and the drone operator.
The second part of Phase I consisted of 4 single-colored scenarios, that is, scenarios with black objects only. Following a similar statistical approach to Part 1, Table 2 presents the accuracy and time for search for both the field team and the drone.

| Phase II
The first area tested was covered with red clay soil, while the second was covered with high grass, in contrast to the short-grass terrain of Phase I. The drone was able to reproduce the performance results obtained from the first phase, achieving 100% in object detection, as shown in Table 3.

TABLE 4 Efficiency (success rate) of MATLAB in object detection based on color

| Phase III
The results obtained from the use of MATLAB computer vision toolbox for color-based object detection are presented in Table 4. Unfortunately, this phase does not concern real-time detection, as explained in Section 2.
Moreover, it was found that MATLAB can detect objects (except black ones) of a quarter of the original size (i.e., 2.5 cm × 2.5 cm) even when the DJI SPARK™ flies four times faster than the project speed settings (i.e., 28 kph), as shown in Figure 6.
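This result is consistent with a back-of-the-envelope ground sampling distance (GSD) check. The numbers below (an ~82° horizontal field of view, 1920-pixel-wide video frames, and a 10 m flight altitude) are assumptions for illustration rather than the study's recorded settings.

```python
import math

def ground_sampling_distance(altitude_m, hfov_deg, image_width_px):
    """Approximate per-pixel ground size for a nadir-pointing camera."""
    footprint_m = 2 * altitude_m * math.tan(math.radians(hfov_deg) / 2)
    return footprint_m / image_width_px

gsd = ground_sampling_distance(10.0, 81.9, 1920)  # metres per pixel (assumed inputs)
pixels_on_target = 0.025 / gsd                    # a 2.5 cm quarter-size object
print(f"{gsd * 100:.2f} cm/px, ~{pixels_on_target:.1f} px across the target")
```

Under these assumptions the quarter-size object still spans a few pixels, which is enough for a color-thresholding detector to fire.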

| DISCUSSION
The use of UAVs in forensic applications for evidence detection tasks can be beneficial since these low-cost and easy-to-use platforms can multidimensionally help crime scene or accident investigations and assist in apprehension and prosecution of offenders.
Specifically, taking into account the results of Phase I (as presented in Section 3.1), it should be highlighted that drone deployment can achieve high detection rates of nearly 100%. However, it should be noted that the degree of the drone's accuracy depends on the flight settings. Additionally, it was found that the drone could search relatively large areas faster by more than 10%. Specifically, in areas larger than 30 m × 50 m, the drone reduced the time required for a full scan of the search area by 12.1% and 15.5% in the multi-colored (Table 1, Figure 4) and single-colored (Table 2) scenarios of Phase I, respectively.
However, from a resource point of view, it should be highlighted that a 10% reduction in the time for search is equivalent to an approximately 50% reduction in the consumed man-hours (Figure 7), as the field team consisted of 2 searchers compared to the sole drone operator.
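The resource argument can be checked with a short calculation; the one-hour baseline search time is an arbitrary illustrative value, while the 12.1% speed-up and the crew sizes come from the figures above.

```python
def man_hour_reduction(team_size, team_time, crew_size, crew_time):
    """Fractional reduction in person-hours when replacing a search team
    with a (smaller) drone crew."""
    team_hours = team_size * team_time
    crew_hours = crew_size * crew_time
    return (team_hours - crew_hours) / team_hours

# Two searchers take 1.0 h; one drone operator is 12.1% faster (Phase I figure).
reduction = man_hour_reduction(2, 1.0, 1, 1.0 * (1 - 0.121))
print(f"{reduction:.1%}")  # just over half the person-hours are saved
```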
By synthesizing the aforementioned findings, it can be deduced that the drone is ultimately more efficient for areas larger than 1,500 m² (i.e., a 30 m × 50 m area), since it achieved superior performance both in terms of accuracy and speed of detection compared to the field team.
As for the second phase, the reproducibility of the drone's performance results demonstrates the robustness of the proposed search method. In particular, the fact that the drone achieved high detection rates in three different terrains (i.e., short grass, high grass, and red clay soil), as shown in Tables 1-3, provides reasonable support that drone deployment can ensure reliable detection capabilities. Furthermore, it is worth noting that the drone's speed of detection remained the same, as its search width does not depend on the terrain type. This proved significant compared to the human approach, in which the searcher's swath size is reduced when, for example, the ground is covered by high grass rather than asphalt [53], as illustrated in Figure 8.
Moreover, decreasing the degree of overlap by increasing the track separation distance in the drone's scanning pattern will reduce the time for search, but possibly at the expense of accuracy. Another way to achieve the same result is the use of cameras with a larger field of view (e.g., fisheye lens cameras), but it must always be ensured that the potential gains are not offset by radial distortion [33,54].
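The trade-off between track separation and overlap can be sketched numerically: for a nadir-pointing camera, the lateral overlap between adjacent strips follows directly from the ground swath width and the spacing. The field of view, altitude, and spacing values below are illustrative assumptions.

```python
import math

def lateral_overlap(altitude_m, hfov_deg, track_sep_m):
    """Fraction of the camera's ground swath shared by adjacent tracks.
    Returns 0 when the spacing exceeds the swath width (coverage gaps)."""
    swath_m = 2 * altitude_m * math.tan(math.radians(hfov_deg) / 2)
    return max(0.0, (swath_m - track_sep_m) / swath_m)

# Wider spacing means fewer tracks (faster scan) but less overlap.
for sep in (5.0, 10.0, 15.0):
    print(f"{sep:4.1f} m spacing -> {lateral_overlap(10.0, 81.9, sep):.2f} overlap")
```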
Regarding Phase III, it should be highlighted that computer vision techniques can enhance the drone detection capabilities.
Specifically, the fact that MATLAB ensured 100% accuracy in detection (Table 4), even when the drone flew faster and the objects were smaller (Figure 6), provides strong support that real-time object detection based on computer vision techniques can be the key enabler of drone-based forensic investigations, as is already the case, for example, in the field of autonomous driving systems [55,56].
However, it should be noted that color-based detection by using computer vision techniques can result in increased rates of false alarms when, for example, a shadow is detected as a black object. The rate of false alarms depends on the ground complexity.
Specifically, a terrain of red clay soil with green vegetation, white

| Conclusions
The current research project examined the usefulness of UAVs in real-time evidence detection in outdoor crime scene investigations.
Based on the obtained results, this project provides reasonable support that drone deployment as a forensic detection tool offers:
• Increased accuracy in detection compared to the traditional human approach. Specifically, the drone can ensure detection rates of nearly 100%.
• Increased speed of detection in relatively large areas, since the drone requires less time to fully search these areas of interest compared to the traditional human approach.
• Reliable detection capabilities, since the drone can achieve high detection rates over a range of terrain types.
• Enhanced detection capabilities through computer vision techniques. If interoperability between drones and computer vision techniques is achieved, UAV-based real-time evidence detection will be more consistent with real-life investigations.

| Future work
The following ideas could be tested to further investigate the usefulness of UAVs as a forensic detection tool:

| Acknowledgments
Furthermore, I want to wholeheartedly thank the two searchers, Maj N.P. and Capt A.B., for walking so many kilometers without losing their smile during the search phase in order to support my project.
Lastly, I wish to convey my grateful thanks to my family for their understanding and patience throughout my studies.