Flapping‐Wing Dynamics as a Natural Detector of Wind Direction
DOI: 10.1002/aisy.202000174

Flapping-wing unmanned aerial vehicles have potential advantages, such as consuming less energy by leveraging the force of the wind. Since the flapping movements of the soft wings contain information about the wind, measuring the movement of each part of the wings allows these vehicles to distinguish the direction of the wind. To confirm this prediction, herein, the detection of wind flow from the flapping-wing motion of a bird robot is presented, using flexible strain sensors integrated on its wing and a physical reservoir computing analysis. In the presence of different wind directions, the movement of the flapping wings is measured using the flexible strain sensors, and the current wind direction is detected by capitalizing on the intrinsic wing dynamics. The detection accuracy achieved with the embedded flexible strain sensors is high, comparable to that obtained with a high-speed camera recording from a fixed position in the environment. These results indicate that flapping-wing unmanned aerial vehicles can recognize wind direction by exploiting the natural dynamics of their wings.

Flapping-wing unmanned aerial vehicles (UAVs) [1][2][3][4] may potentially emit less noise and consume less energy than fixed-wing and multirotor UAVs, as they leverage the force of the wind. [1,2] For this leveraging to take place, flapping-wing UAVs must recognize the direction of the wind stream. Chirarattananon et al. proposed a method of estimating and compensating for wind disturbance for a micro flapping-wing UAV. [4] Escareño et al. presented a trajectory control method for a quadrotor under wind disturbances using a control strategy based on sliding-mode and adaptive control techniques. [5] Nakata et al. proposed a method for detecting nearby ground or walls by perceiving modulations of the self-induced airflow patterns of a UAV. [6] This method was validated by developing a quadcopter that measured the fluid velocities around it using stereo particle image velocimetry. These robots recognize the wind direction from the data of sensors in their trunks or sensors installed in the room, and they indirectly estimate their positions and postures from those data. A robot can also recognize wind direction using airflow sensors. Zhao and Zhu presented a multifunctional electronic skin inspired by the thermosensation of the human sensory system. [7] Wang et al. developed an ultrasensitive and flexible all-textile airflow sensor based on fabric with in situ grown carbon nanotubes. [8] These sensors were attached to a surface and measured the airflow directly. If such sensors were attached to flapping-wing UAVs, it would be difficult to measure the wind direction because the local airflow caused by the flapping wings would dominate the measurement. It is also important to note that the flapping wings of the UAVs change their shape in a complex manner when receiving the wind stream. A flapping-wing UAV can therefore directly recognize the direction of the wind stream by measuring the movement of its wings and using the measured data.
To illustrate this idea, a framework called physical reservoir computing (PRC) is used, which allows one to exploit natural physical dynamics as a computational resource.
Reservoir computing (RC) is a machine-learning framework that deals with time series data and exploits high-dimensional nonlinear dynamical systems, often referred to as reservoirs, as a computational resource. [9,10] A conventional RC system consists of a randomly connected recurrent neural network, and the learning is implemented in the readout part, which directly connects the reservoir nodes and output nodes. [11] In this framework, the input signals are fed into the reservoir, which acts as a temporal and finite kernel to project the low-dimensional input into a high-dimensional dynamical system that facilitates input separability. The framework is suitable not only for real-time signal processing at edge devices (such as sensory devices) [12] but also for conventional classification tasks. [13]
A PRC can be regarded as a natural extension of an RC that exploits complex physical dynamics as a reservoir. [14] In soft robotics, it has been demonstrated that the rich dynamics of a soft body act as a successful reservoir, which can emulate nonlinear dynamical systems and embed closed-loop control. [15,16] The first physical reservoir system was proposed by Fernando and Sojakka. It exploited the dynamics of a water surface as a reservoir, which they called a "liquid brain," and implemented pattern recognition tasks. [17] In this article, in the spirit of that liquid-brain approach, it is shown that the flapping dynamics of a wing act as a natural reservoir that can detect wind direction.
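The RC pipeline described above — a fixed, randomly connected recurrent network whose readout alone is trained — can be sketched with a minimal echo state network. The network sizes, spectral radius, ridge regularization, and the delayed-copy task below are illustrative assumptions, not the parameters used in this work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reservoir: a randomly connected recurrent network (illustrative sizes).
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def run_reservoir(u):
    """Project a 1D input series into the high-dimensional reservoir state."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Only the linear readout is trained (ridge regression), here on a simple
# memory task: emulate a 3-step-delayed copy of the input.
u = rng.uniform(-1, 1, 500)
target = np.roll(u, 3)          # output = input delayed by 3 timesteps
S = run_reservoir(u)[50:]       # discard the initial transient
y = target[50:]
ridge = 1e-6
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)
pred = S @ W_out
print("training MSE:", np.mean((pred - y) ** 2))
```

Only `W_out` is learned; the reservoir weights `W` and `W_in` stay fixed, which is what makes it natural to replace the simulated network with a physical system such as a flapping wing.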
Therefore, a method is proposed for recognizing the wind direction based on wing movement, using flexible strain sensors to measure the shape of the wing and a PRC approach to recognize the wind stream from the sensor data (Figure 1).
The movement of the soft flapping wings must be measured to use the flapping dynamics as a natural reservoir. Hard sensors measuring the wings would disturb the movements and might reduce their complexity. In other words, flexible sensors are essential if the flapping dynamics are to act as a natural reservoir for detecting wind direction. Therefore, flexible strain sensors are developed and used to measure the wing movement.
These flexible strain sensors are directly fabricated on a polyimide film, which is used for the wings of the bird robot. The detailed fabrication process is shown in Figure S1, Supporting Information, and the Experimental Section. Wing-bending information is measured by the resistance change of the electrical contacts between laser-induced graphene (LIG) filaments caused by the tensile strain when the wing is moved (Figure 2a). The carbonized conductive film formed on the polyimide by laser scanning consists of stacked layers of multilayer graphene, as confirmed by Raman spectroscopy (Figure S2, Supporting Information) and scanning electron microscopy (SEM) images (Figure S3, Supporting Information), the results of which are in good agreement with other reports. [18] The SEM images show that the distance between LIG layers increases under tensile strain, where roughly a 4 μm increase is observed at certain points, as shown in Figure S3, Supporting Information. The fundamental characteristics of the resistance change under applied strain are then investigated. Strains are controlled by the bending direction and the distance used to hold the sensor sheet, as shown in Figure 2b,c. The resistance change ratio, ΔR/R0, where ΔR represents the change between the resistance in the bent state (R1) and the flat state (R0), is evaluated for different thicknesses of the parylene passivation layer. Under compressive strain, the sensor is almost insensitive under any bending condition, regardless of the parylene thickness (Figure 2d). This is because the LIG is formed at high density with the layers in electrical contact with each other, so the contact resistance remains constant under compressive strain (Figure S3a, Supporting Information). It should be noted that there is a small resistance change or drift during repeated experiments. This small fluctuation needs to be studied further to achieve more stable sensor operation for practical applications.
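The resistance change ratio ΔR/R0 used throughout the characterization is a simple normalized quantity; a minimal sketch (the numeric values are illustrative, not measured data):

```python
def resistance_change_ratio(r_bent, r_flat):
    """ΔR/R0 with ΔR = R1 - R0 (bent-state minus flat-state resistance)."""
    return (r_bent - r_flat) / r_flat

# Example: resistance rising from 1000 Ω flat to 1020 Ω under tensile strain.
print(resistance_change_ratio(1020.0, 1000.0))  # 0.02, i.e. a 2% change
```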
However, when tensile strain is applied, each LIG layer has less electrical contact than in the flat state, as shown in Figure 2a. Following this mechanism, the resistance increases when the bending distance decreases (Figure 2e and Figure S4, Supporting Information). In particular, a thinner parylene layer produces a larger resistance change, corresponding to a higher sensitivity, because a higher tensile strain is applied to the LIG layers. By this strain engineering, thicker parylene layers yield lower strain sensitivity under the same bending condition because of the smaller strain in the LIG film (Figure S4, Supporting Information). Using a 100 nm parylene strain sensor, the detection limit is about 8 cm in distance, which corresponds to about a 2.3 cm bending radius. For the bird robot application, the speed of response and recovery is another important factor. Based on the real-time experimental results, the response and recovery times are ≈0.025 and ≈0.018 s, respectively (Figure S5, Supporting Information). This speed is similar to or better than that of other previously reported studies, whereas the sensitivity is similar to or lower than theirs (Table S1, Supporting Information).
Hysteresis behavior is also important for wing motion detection. Figure 2f,g show the hysteresis properties under compressive and tensile strains. Under the tensile strain condition, the maximum difference in the resistance change ratio is ≈0.56% at a distance of 9 cm (Figure 2f), whereas almost no difference between the forward and backward strain applications is observed under compressive strain. To confirm whether this hysteresis under tensile conditions affects the motion detection of the wing, a sensor directly formed on a polyimide film wing is assembled with a commercially available bird robot. Under bird robot operation, the maximum motion of the wing is about 14 Hz (Figure 2h). For the motion detection, a high-speed camera is also used while the motion is detected using the strain sensor. The results clearly show that the strain sensor can precisely detect the wing motion without a significant difference from the high-speed camera detection (Figure 2i). However, it should be noted that a small time delay of the strain sensor output compared to the camera output is observed. This is most likely due to the difference in the observed position of the wing movement: to detect an output signal using the camera, contrast change is used, whereas the sensor output reflects the specific bending of the flexible wing. The results show that the hysteresis of the flexible strain sensor is negligible in detecting the wing motion.
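A wing-beat rate such as the ≈14 Hz reported above can be pulled out of a periodic sensor trace with a simple spectral peak search. This is a sketch only: a synthetic 14 Hz signal stands in for the measured strain trace, and the sampling rate is an assumption.

```python
import numpy as np

fs = 1000.0                         # assumed sampling rate, Hz
t = np.arange(0, 2.0, 1 / fs)
# Synthetic stand-in for a strain-sensor trace: 14 Hz flapping plus noise.
signal = (np.sin(2 * np.pi * 14 * t)
          + 0.2 * np.random.default_rng(1).normal(size=t.size))

# Dominant frequency = location of the largest magnitude in the spectrum.
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
f_flap = freqs[spectrum.argmax()]
print(f"dominant flapping frequency: {f_flap:.1f} Hz")
```

With a 2 s window the frequency resolution is 0.5 Hz, ample for distinguishing wing-beat rates of this order.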
Environmental dependences of the strain sensor, such as on temperature and humidity, are investigated because the bird robot may be used under different temperature and humidity levels. Figure S6, Supporting Information, shows the resistance change ratio at different temperatures and humidities. The resistance slightly decreases at higher temperatures with a sensitivity of −0.05%/°C; however, such small changes may be ignored for this application. For the humidity response, because the parylene layer is coated over the LIG film, almost no resistance change is observed (Figure S6b, Supporting Information). Based on these results, this strain sensor can be used for the bird robot application without large errors caused by changes in the environmental conditions.

A robot using our method predicts the direction of the wind stream y as the objective variable from the explanatory variable x, which is obtained from the data of the sensors attached to the flapping wings of the robot. Figure 3a shows the arrangement of the sensors attached to the wing. The time series of the sensor data s includes a vector of the sensor data s(t*) at discrete time t*.
The variable x should clearly reflect the effects of the wind on the movement of the wings. A short time series of the sensor data before the current time can reflect these effects; therefore, this time series is used as x. The movement of the flapping wings is a cyclic motion. The robot recognizes and outputs the direction of the wind stream from this time series at every cycle. The time when the sensor reaches its peak is used as the end of a cycle. To measure the wind with the flapping wings, the wind is blown for 20 s and stopped for the same amount of time; this process is applied from multiple directions. Each direction is tested once. Note that a flapping-wing UAV moves its wings at a high frequency, and the sensors attached to the wings reaching their peak provide the end of the cycle. Thus, a robot is capable of detecting a change of wind direction in a short amount of time if it recognizes the direction only when each cycle ends. The data for the variable x are sampled as follows (Figure 3b). First, the peak time, when the value of a sensor reaches its peak, is extracted. Sensor 1 is used as a reference to extract the peak and to define the peak time. Second, the ith state vector is calculated using a constant N as

v_i = [s(t_i), s(t_i − Δt), s(t_i − 2Δt), …, s(t_i − (N − 1)Δt)]  (1)

where s(t) are the sensor values at time t, Δt is the sampling time of the sensors, called a timestep, and t_i is the ith peak time.

Figure 2. a) Schematic of the strain-sensing mechanism of LIG on a polyimide film; photos of the experiments applying b) tensile strain and c) compressive strain; resistance change ratio of the strain sensor under d) compressive and e) tensile strains at different parylene thicknesses; hysteresis behavior of the strain sensor under f) compressive and g) tensile strain as a function of distance; h) photos of the bird robot motion; and i) output signals of the bird wing motion using the strain sensor and the high-speed camera.
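The sampling of Equation (1) — a window of the N most recent sensor vectors ending at each peak of the reference sensor — can be sketched as follows. The peak-detection rule, array shapes, and the synthetic three-sensor trace are illustrative assumptions.

```python
import numpy as np

def build_states(s, peaks, N):
    """v_i = [s(t_i), s(t_i - dt), ..., s(t_i - (N-1)dt)] for each peak t_i.

    s     : (T, n_sensors) array of sensor samples at timestep dt
    peaks : indices of the peaks of the reference sensor (Sensor 1)
    N     : window length in timesteps
    """
    v = []
    for t_i in peaks:
        if t_i - (N - 1) >= 0:
            # Stack the N most recent sensor vectors, newest first, flattened.
            v.append(s[t_i - N + 1 : t_i + 1][::-1].ravel())
    return np.array(v)

# Toy example: three phase-shifted sensors; peaks found as local maxima
# of the reference channel (Sensor 1).
rng = np.random.default_rng(0)
T = 200
s = (np.sin(2 * np.pi * 14 * np.arange(T)[:, None] / 100 + [0.0, 0.5, 1.0])
     + 0.01 * rng.normal(size=(T, 3)))
ref = s[:, 0]
peaks = [i for i in range(1, T - 1) if ref[i - 1] < ref[i] > ref[i + 1]]
V = build_states(s, peaks, N=20)
print(V.shape)   # (number of usable peaks, N * n_sensors)
```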
Finally, x at t_i, denoted x_i, is sampled using a constant M to account for the delay effects of the flapping wing as a dynamical system:

x_i = [v_i, v_{i−1}, …, v_{i−(M−1)}]  (2)

Note that in the preprocessing of the data, the sensor data are regressed as a linear function of time. The regressed results are subtracted from the data to remove the effects of sensor drift, and the data are standardized such that the mean value and standard deviation are zero and one, respectively.
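The preprocessing step — removing a linear drift by regressing each channel against time, then standardizing to zero mean and unit standard deviation — might look like the following sketch (the function name and the synthetic drifting trace are illustrative):

```python
import numpy as np

def detrend_and_standardize(s):
    """Remove a fitted linear trend per sensor channel, then standardize.

    s : (T, n_sensors) raw sensor time series
    """
    T = s.shape[0]
    t = np.arange(T)
    out = np.empty_like(s, dtype=float)
    for j in range(s.shape[1]):
        slope, intercept = np.polyfit(t, s[:, j], 1)   # linear drift fit
        residual = s[:, j] - (slope * t + intercept)   # drift removed
        out[:, j] = (residual - residual.mean()) / residual.std()
    return out

# Toy check: a drifting sinusoid comes back with zero mean and unit std.
t = np.arange(1000)
raw = (np.sin(2 * np.pi * t / 70) + 0.002 * t)[:, None]
clean = detrend_and_standardize(raw)
print(clean.mean(), clean.std())
```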
The class y is predicted from x using the softmax function. That is, the probability p_j that a given x belongs to the jth class is calculated as

p_j = exp(A_j x + B_j) / Σ_{k=1}^{6} exp(A_k x + B_k)  (3)

where A and B are a matrix and a vector, respectively, and A_j and B_j denote their jth row and element (j = 1, …, 6). The robot selects the class with the highest probability as its prediction. Both A and B are updated using the cross entropy of the predicted classes and the actual classes on a training data set.
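A softmax readout trained by minimizing cross-entropy can be sketched as a plain gradient-descent loop. The learning rate, epoch count, and toy clusters standing in for the wing features x are illustrative assumptions.

```python
import numpy as np

def train_softmax(X, y, n_classes, lr=0.1, epochs=500):
    """Fit p_j = softmax(X @ A.T + B)_j by minimizing cross-entropy."""
    n, d = X.shape
    A = np.zeros((n_classes, d))
    B = np.zeros(n_classes)
    Y = np.eye(n_classes)[y]                          # one-hot targets
    for _ in range(epochs):
        logits = X @ A.T + B
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        P = np.exp(logits)
        P /= P.sum(axis=1, keepdims=True)
        grad = (P - Y) / n                            # cross-entropy gradient
        A -= lr * grad.T @ X
        B -= lr * grad.sum(axis=0)
    return A, B

def predict(X, A, B):
    return (X @ A.T + B).argmax(axis=1)   # class with the highest probability

# Toy check on three separable 2D clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.3, (50, 2)) for c in ([0, 0], [3, 0], [0, 3])])
y = np.repeat([0, 1, 2], 50)
A, B = train_softmax(X, y, 3)
print("train accuracy:", (predict(X, A, B) == y).mean())
```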
Samples of x and y, which are obtained from a measured time series of sensor data, are divided into a training data set for learning and a test data set for evaluating the learning. A test data set and a training data set are built, respectively, from the samples x_i and y_i in class K such that |t_i − t_K,ini| < t_ini and |t_i − t_K,end| < t_end, where t_K,ini and t_K,end are the times when the wind of the Kth direction starts and finishes blowing, respectively, and t_ini and t_end are constants. Different numbers of training samples in each class may decrease the prediction performance. Therefore, to equalize the number of samples in each class and improve the performance, the minimum number of samples included in any class is found, and that number of samples is randomly extracted from each class.
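Equalizing the per-class sample counts by randomly downsampling every class to the minimum class count might be done as follows (function name and toy labels are illustrative):

```python
import numpy as np

def balance_classes(X, y, rng):
    """Randomly keep the same number of samples from every class."""
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()                  # minimum samples in any class
    keep = np.concatenate([
        rng.choice(np.flatnonzero(y == c), n_min, replace=False)
        for c in classes
    ])
    return X[keep], y[keep]

rng = np.random.default_rng(0)
y = np.array([0] * 30 + [1] * 12 + [2] * 20)
X = np.arange(len(y))[:, None]
Xb, yb = balance_classes(X, y, rng)
print(np.bincount(yb))   # every class reduced to 12 samples
```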
To obtain statistical results from one time series, a training data set is repeatedly built and the class is predicted while decreasing N from 77 to 58 by one each time. The prediction of each class is recorded, along with its ratio and the accuracy of the prediction. Note that the minimum number of samples between peaks is 77. In all conditions, the following settings are used: M = 10, t_ini = 9 s, and t_end = 9 s.
One condition is tested in which one classifier predicts six classes of wind (multiclass classification), and another in which five classifiers are trained using different samples in different periods and each classifier predicts whether the wind is blowing or not (two-class classification). The combinations of sensors used for the predictions are also varied to investigate which sensors contribute to the prediction performance. In addition, classifiers are built using the data of the high-speed camera instead of the data of the flexible strain sensors, with the same data parameters. Figure 3c shows the numbering of the wind-direction classes. Classes 1–6 correspond to the wind streams from above, below, the side, and diagonally from the front and behind, respectively. Figure 3d shows the time series of the output of Sensor 1. The other sensor outputs are shown in Figure S7, Supporting Information. As can be seen from the plots, the waveforms of the outputs before and after the wind starts blowing are similar. Figure 3e shows the ratios of predictions for each class. These graphs indicate that the actual class is predicted for most of the test samples, and that the accuracy of one class differs from that of another (compare the graphs of Class 1 and Class 2 in the multiclass classification). The graphs also suggest that the accuracy under the two-class classification condition is higher than under the multiclass classification condition. The mean accuracies of the multiclass classification condition and the two-class classification conditions are 0.915 and 0.966, respectively, which are significantly high values. Note that the accuracy is lower when using a large M, such as M = 64, or a smaller one, such as M = 2 (Figure S8, Supporting Information).
The mean accuracies of the multiclass classification condition and the two-class classification conditions using the data of the high-speed camera are 0.959 and 0.965, respectively, which are also significantly high values. These results highlight two important points. First, they strongly confirm that the wing dynamics are actually capable of distinguishing wind direction. Second, they demonstrate that our embedded flexible strain sensors are indeed capable of classifying wind directions at a level similar to a high-speed camera recording from a fixed position in the room. The second point is especially important if we aim to use the embedded flexible strain sensors for real robots flying around a dynamic environment. Figure 4 shows the results of examining the factors that contribute to the prediction accuracy of Figure 3. Figure 4a shows the scores of the first and second principal components, obtained using principal component analysis on the samples of each class. This graph indicates that the distribution of samples in one class is clearly separated from that in another class, for example, Class 2 (orange) and Class 4 (green). This result suggests that the movement of the flapping wing as a dynamical system itself naturally facilitates the input separation needed to classify the direction of the wind stream. Figure 4b shows the accuracy of the prediction for each class under conditions in which different sensors are used (i.e., the sensor conditions). These graphs indicate that the accuracy increases in one sensor condition and decreases in another. For example, the accuracy in the condition in which all sensors are used and Class 1 (blue) is predicted in the multiclass classification (0.382) is smaller than in the condition in which only Sensor 1 and Sensor 2 are used (0.643).
In contrast, the results for Class 6 (brown) show the opposite trend (0.977 vs 0.869). The graphs also indicate that, among the sensors, Sensor 1 contributes the most to the accuracy. For example, the mean accuracy (black) of the condition where only Sensors 2 and 3 are used is smaller than that of the condition where all the sensors are used, both in the multiclass classification condition (0.768 vs 0.867) and in the two-class classification condition (0.858 vs 0.943). The graphs finally suggest that Sensors 2 and 3 also contribute to the prediction accuracy under some conditions. For example, in the condition in which all the sensors are used and Class 5 (violet) is predicted, the accuracy in the multiclass classification condition (0.936) is larger than when only Sensor 1 is used (0.837). These results indicate that the direction of the wind stream can be identified from the movement of the flapping wing using the flexible strain sensors and PRC.
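The class-separability check above — projecting the samples onto the first two principal components — can be sketched with a small numpy-only PCA. The two toy clusters standing in for the per-class feature vectors x are illustrative assumptions.

```python
import numpy as np

def pca_2d(X):
    """Project samples onto the first two principal components."""
    Xc = X - X.mean(axis=0)
    # Right singular vectors of the centered data are the principal axes.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:2].T

# Toy stand-in for two wind-direction classes: shifted 4D clusters.
rng = np.random.default_rng(0)
class_a = rng.normal([0, 0, 0, 0], 0.2, (40, 4))
class_b = rng.normal([2, 2, 0, 0], 0.2, (40, 4))
scores = pca_2d(np.vstack([class_a, class_b]))
# A large gap along PC1 means the classes occupy distinct regions.
gap = abs(scores[40:, 0].mean() - scores[:40, 0].mean())
print("PC1 class separation:", gap)
```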
In this article, the movement of the flapping wings under different wind directions is measured using flexible strain sensors, with the aim of predicting the current wind direction by capitalizing on the intrinsic wing dynamics. As a result, the prediction accuracy of the multiclass classification condition is generally high. The distribution of samples in one class is clearly separated from that in another class in the high-dimensional space. All sensors attached to the wing contribute to the prediction. This article aims to show the potential of the airflow dynamics of the flapping-wing soft robot; an analysis of the airflow rate dependence, including the limits of detectable airflow rates, is still required to control the bird robot under wind. Although many challenges remain, our results indicate that flapping-wing UAVs can recognize wind direction by exploiting the natural dynamics of their wings.

Experimental Section
Fabrication: A flexible strain sensor was directly formed onto a polyimide film-based wing. Figure S1, Supporting Information, shows the fabrication process. A CO2 laser (VLS2.30, UNIVERSAL Laser System) at 10 W power was scanned over the polyimide film (50 μm thick) to form graphene films; see Figure S1, Supporting Information (2). This graphene formation process and the detailed material characteristics were reported elsewhere. [19,20] After the formation of the resistive LIG strain sensor material, Ag electrodes serving as interconnection layers were screen printed; see Figure S1, Supporting Information (3). A parylene C passivation layer was then deposited using a parylene coater (PDS2010 Labcoater 2, SCS); see Figure S1, Supporting Information (4). The thickness of this film was 100 nm. Finally, the polyimide film was cut to be assembled onto the bird robot (BionicBird 56 618, Kyosho).
Bird Motion Measurement: All strain sensors were connected to a memory HiCorder (MR6000, HIOKI) to measure the electrical response of the strain sensor simultaneously. To confirm the bird wing motion, a high-speed camera (VW-9000, Keyence) was set on the side of the experimental setup, and its position was fixed. The bird was measured using a camera unit (VW-300M) from its side, obtaining a grayscale image at