Field‐based individual plant phenotyping of herbaceous species by unmanned aerial vehicle

Abstract Recent advances in unmanned aerial vehicles (UAVs) and image processing have made high-throughput field phenotyping possible at the plot/canopy level in mass-grown experiments. Such techniques are now expected to be extended to individual-level phenotyping in singly grown experiments. We identified two main challenges in phenotyping individual plants grown singly: segmenting plants from weedy backgrounds and estimating complex traits that are difficult to measure manually. In this study, we propose a methodological framework for field-based individual plant phenotyping by UAV. Two contributions have been developed: weed elimination for individual plant segmentation, and the extraction of complex traits (volume and outline). The framework demonstrated its utility in the phenotyping of Helianthus tuberosus (Jerusalem artichoke), an herbaceous perennial plant species. The proposed framework can be applied to both small- and large-scale phenotyping experiments.


| INTRODUCTION
Plant phenotyping involves the comprehensive measurement of the physical and biochemical traits of plant genotypes under specific environmental conditions and provides essential information for the plant sciences. Recent advances in technical and analytical methods have made high-throughput field phenotyping possible (Furbank & Tester, 2011; Houle et al., 2010; Tardieu et al., 2017; Tripodi et al., 2018). Proximal sensing through the use of unmanned aerial vehicles (UAVs) is among the most promising and popular techniques for field phenotyping owing to its rapidity, nondestructiveness, cost-effectiveness, and information density (Chapman et al., 2014; Maes & Steppe, 2019; Sankaran et al., 2015; Yang et al., 2017). UAV sensing platforms developed for agriculture can also be used in genetics, ecology, forestry, and environmental science (Carrasco-Escobar et al., 2019; Christie et al., 2016; Zhang et al., 2016). However, further methodological development is necessary for their use to become common in other fields of plant science (Minervini et al., 2015; Roth et al., 2018).
One crucial technique that remains to be addressed is the development of UAV-based individual plant phenotyping (IPP). […] Thirdly, by combining it with local environmental data collected by field Internet of Things (IoT) devices, UAV-based IPP can be a novel tool for examining fine-scale genotype-environment interactions of individual plants in the field. However, despite these great potential contributions of UAV-based IPP to plant research, few attempts have been made to develop IPP, except for several studies of individual tree phenotyping (mostly focusing on tree height) (Díaz-Varela et al., 2015; Fujimoto et al., 2019; Mu et al., 2018; Zarco-Tejada et al., 2014).
One of the challenges for IPP under field conditions is the segmentation of individual plants from weedy backgrounds during image analysis. For instance, experiments that grow single plants at relatively low density in the field can promote the germination and growth of weeds. Even when such small, low-density weeds do not impede the development of the focal species, they pose a significant technical problem when segmenting the boundaries of each target plant in an image, because the textural and reflectance properties of weeds and target plants are often similar. Moreover, it is not realistic to manually remove all weeds in a large-scale field experiment. Therefore, to develop UAV-based IPP, it is necessary to devise a technique to segment each plant of the target species even among weeds.
Here, we present a methodological framework for UAV-based IPP and demonstrate its utility in the phenotyping of Helianthus tuberosus L. (Jerusalem artichoke), an herbaceous perennial plant species. First, we developed a WEIPS (weed elimination for individual plant segmentation) method to segment each plant of the focal species in images with weeds. To evaluate its reliability, we compared areas of individual plants segmented by WEIPS with those delineated manually. Second, we tested the versatility of our framework by comparing individual plant heights estimated from images with those measured by hand. Finally, we illustrate the broader application of our framework by showing that it detects significant phenotypic variation among source populations of Helianthus tuberosus in various traits that are difficult to measure manually and require extensive labor, such as height, volume, and outline.

| Growth and measurement of Helianthus tuberosus
Helianthus tuberosus L. (Jerusalem artichoke) is native to North America (Swanton et al., 1992). Because it produces large quantities of edible tubers, H. tuberosus was an essential crop for native North Americans before European contact (Kays & Nottingham, 2007).
The species has only been weakly domesticated, so high levels of genetic diversity exist among individuals and populations in physiological, morphological, and life-history traits (Kays & Kultur, 2005;Puttha et al., 2012;Swanton et al., 1992). Also, it has become naturalized and invasive in many regions of the world (Tesio et al., 2012;Weber & Gut, 2004).
We purchased seed tubers of H. tuberosus from three private farms in Tochigi, Chiba, and Gunma prefectures, Japan. […] The planting beds were covered with plastic mulch film (60 cm width). Because this experiment had different research purposes, some plants were paired or grouped; these plants did not affect the growth of the focal individuals grown singly and were omitted from the subsequent analyses. For more details, see our previous study (Fukano et al., 2019).
We measured individual plant height and stem diameter five times during plant development (13 May; 1, 13, 30 June; and 14 July) using a ruler and a caliper, respectively. UAV imaging was conducted on nine dates (16, 31 May; 4, 12, 16, 29 June; 3, 7, 10 July).

| Three-dimensional reconstruction and plot segmentation

FIGURE 1 The whole process for field-based individual plant phenotyping by UAV. Step 1: imaging by UAV, three-dimensional reconstruction, and plot segmentation. Step 2: WEIPS (weed elimination for individual plant segmentation). Step 3: phenotyping.

[…] Third, several camera parameters and the point cloud are refined through bundle adjustment, an iterative optimization method for solving nonlinear equations, here subject to the minimization of reprojection error (see Hartley & Zisserman, 2004, for details). Finally, multi-view stereo (MVS) generates a dense point cloud based on the set of multi-view images and the camera parameters estimated in the structure-from-motion (SfM) step. In our analysis, we used the default intrinsic parameters provided by Pix4Dmapper as initial values. An orthomosaic image of the whole field was then generated from the digital surface model (DSM) based on the dense point cloud (Figure 1, step 1).

We then extract the plot images manually. First, using the "fishnet" function of ArcGIS 10.5 software (ESRI), a net of adjacent rectangular cells is generated according to user-input numbers of rows and columns inside the predefined field boundary. Then, the plot ID is semiautomatically recorded in an attribute table.
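The plot-gridding step relies on ArcGIS's "fishnet" function; a minimal open-source re-implementation of that gridding might look as follows (the function name and row-by-row numbering scheme here are ours, not part of the original pipeline):

```python
import numpy as np

def fishnet(bounds, n_rows, n_cols):
    """Split a rectangular field boundary into a grid of plot cells,
    mimicking the ArcGIS "fishnet" step (hypothetical re-implementation).

    bounds: (xmin, ymin, xmax, ymax) of the field in map units.
    Returns a list of (plot_id, (xmin, ymin, xmax, ymax)) tuples,
    numbered row by row.
    """
    xmin, ymin, xmax, ymax = bounds
    xs = np.linspace(xmin, xmax, n_cols + 1)
    ys = np.linspace(ymin, ymax, n_rows + 1)
    cells = []
    plot_id = 0
    for r in range(n_rows):
        for c in range(n_cols):
            cells.append((plot_id, (xs[c], ys[r], xs[c + 1], ys[r + 1])))
            plot_id += 1
    return cells

# A 10 m x 4 m field cut into 2 rows x 5 columns of plots.
cells = fishnet((0.0, 0.0, 10.0, 4.0), n_rows=2, n_cols=5)
print(len(cells))        # 10 plot cells
```

Each cell extent can then be used to crop the orthomosaic and DSM into per-plot images.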

| WEIPS
An image segmentation process is needed to extract individual plants from the plot images. In most cases, images are segmented by color, owing to the large contrast between plants and bare soil (Fan et al., 2018; Guo et al., 2013, 2017). However, if the background includes objects with similar colors, such as weeds, further processing is needed. Researchers have therefore proposed several methods to distinguish weeds from the target plants: for example, the use of a specific camera that provides more spectral information, or the use of complex algorithms such as machine learning with manual selection […]
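As a rough illustration of how color-based segmentation can be combined with adaptive thresholding, consider the following sketch. The vegetation index (excess green, ExG), window size, and offset are assumptions made for illustration; this is not the exact WEIPS algorithm.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def vegetation_mask(rgb, window=31, offset=0.05):
    """Illustrative combination of color-based segmentation (excess-green
    index) with adaptive thresholding against a local mean. A sketch under
    assumed parameters, not the exact WEIPS algorithm.

    rgb: float array (H, W, 3) with values in [0, 1].
    Returns a boolean vegetation mask.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2.0 * g - r - b                        # color-based step
    local_mean = uniform_filter(exg, size=window)
    return exg > local_mean + offset             # adaptive-threshold step

# Toy scene: brown soil with one green 10 x 10 "plant" patch.
rgb = np.zeros((40, 40, 3))
rgb[...] = (0.4, 0.3, 0.3)                       # soil
rgb[10:20, 10:20] = (0.2, 0.8, 0.2)              # plant
mask = vegetation_mask(rgb)
print(mask[15, 15], mask[0, 0])                  # plant pixel True, soil pixel False
```

The local-mean comparison makes the threshold follow illumination changes across the image, which a single global color threshold cannot do.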

| Individual plant phenotyping
Several phenotypic traits are extracted from the segmented individual plants. The cover area, major- and minor-axis lengths, eccentricity, orientation, convex area, filled area, equivalent diameter, solidity, extent, perimeter, and roundness can be easily calculated with the MATLAB function "regionprops" (Figure 1, step 3). The height, volume, and outline are computed from the corresponding segmented DSM with the following algorithms.
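The same trait set can be reproduced in open-source form with scikit-image's `regionprops`, the Python counterpart of the MATLAB function, applied here to a toy binary mask. The roundness formula shown is the standard 4πA/P² definition, which we assume matches the paper's usage.

```python
import numpy as np
from skimage.measure import label, regionprops

# Toy binary mask standing in for one WEIPS-segmented plant.
mask = np.zeros((50, 50), dtype=np.uint8)
mask[10:40, 15:35] = 1                       # a 30 x 20 rectangular "plant"

props = regionprops(label(mask))[0]
area = props.area                            # cover area (pixels)
roundness = 4 * np.pi * area / props.perimeter ** 2   # assumed definition
print(area, props.solidity)                  # 600 1.0 (rectangle is convex)
```

`regionprops` also exposes `eccentricity`, `orientation`, `convex_area`, `filled_area`, `equivalent_diameter`, `extent`, and the axis lengths listed above.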

| Height
Plant height is calculated as the difference between the plant surface and the ground level, both derived from the DSM (Hu et al., 2018; Watanabe et al., 2017). We used the ground elevations generated from the first flight as the reference (E_r).
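A pixel-level sketch of this subtraction follows; taking the maximum of the difference within the segmented mask is an assumption made for this sketch, as the text does not state which statistic is used.

```python
import numpy as np

def plant_height(dsm, ground_dsm, mask):
    """Height as the current DSM minus the reference ground elevation
    (E_r, from the first flight), taking the maximum inside the segmented
    plant mask (the choice of the maximum is an assumption)."""
    return float((dsm - ground_dsm)[mask].max())

dsm = np.full((4, 4), 10.0)
dsm[1:3, 1:3] = 10.8                  # plant canopy 0.8 m above ground
ground = np.full((4, 4), 10.0)        # E_r from the first (bare-ground) flight
mask = dsm - ground > 0.2
print(plant_height(dsm, ground, mask))    # ~0.8 m
```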

| Volume
Plant volume is approximated as the sum of height × area over all pixels at the plant base:

V = \sum_{i=1}^{n} f_i \, \Delta\sigma_i

where f_i is the height of the approximated cylinder at pixel i, \Delta\sigma_i is the area of pixel i, and n is the number of plant pixels acquired by WEIPS.
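The sum above translates directly into a few lines of code; `pixel_area` (the ground-sample area Δσ of one pixel, assumed constant across pixels here) is a hypothetical value used for illustration.

```python
import numpy as np

def plant_volume(height_map, mask, pixel_area):
    """Volume as V = sum_i f_i * dsigma_i: each plant pixel contributes a
    thin column of height f_i over ground area dsigma_i (m^2 per pixel)."""
    return float(np.sum(height_map[mask]) * pixel_area)

heights = np.array([[0.0, 0.5],
                    [1.0, 0.0]])          # per-pixel heights f_i in meters
mask = heights > 0                        # plant pixels from WEIPS
print(plant_volume(heights, mask, pixel_area=0.01))   # ≈ 0.015 m^3
```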

| Outline
We defined the canopy outline as the upper boundary of the projected plant region […]
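One simple way to realize an upper-boundary outline from a binary mask (an assumed implementation, since the text is truncated here) is to take, for each image column, the topmost plant pixel:

```python
import numpy as np

def canopy_outline(mask):
    """For each column that contains plant pixels, return the topmost
    (upper-boundary) plant row, yielding a 1-D profile of the projected
    canopy. Columns without plant pixels return -1. Assumed sketch."""
    return np.where(mask.any(axis=0), mask.argmax(axis=0), -1)

mask = np.zeros((5, 4), dtype=bool)
mask[2:, 1] = True    # column 1: plant from row 2 downward
mask[1:, 2] = True    # column 2: plant from row 1 downward
print(canopy_outline(mask))   # [-1  2  1 -1]
```

The resulting profile can be compared between plants with curve-dissimilarity measures.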

| Statistical analyses
To validate the WEIPS method, we examined the correlations of the canopy coverage rate and the height of individual plants between WEIPS segmentation and manual segmentation by author YF, using Pearson's correlation analysis, for each of the five measurement dates.
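This validation is a plain Pearson correlation; a toy version for one measurement date (with made-up numbers, not the study's data) would be:

```python
import numpy as np
from scipy.stats import pearsonr

# WEIPS-derived vs. manually segmented canopy coverage (toy values).
weips_cover  = np.array([1.2, 2.3, 3.1, 4.0, 5.2])
manual_cover = np.array([1.0, 2.5, 3.0, 4.2, 5.0])
r, p = pearsonr(weips_cover, manual_cover)
print(round(r, 3))    # correlation coefficient for this date
```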
To illustrate the utility of our framework for in-field phenotyping, we examined the phenotypic variation among the three H. tuberosus source populations […] (Martinez, 2020) in the R language (R Core Team, 2020), because only dissimilarities could be calculated.

| Development of framework for UAV-based IPP
We developed an easy-to-use framework for UAV-based IPP ( Figure 1) that requires only a commercial-level UAV as hardware, and all processes are easy to implement.

| Comparison between WEIPS method and manual segmentation
We developed a novel technique, WEIPS, to eliminate weeds from UAV images by combining color-based segmentation and adaptive-thresholding-based segmentation. The performance of WEIPS was evaluated with Qseg, a well-known measure of vegetation segmentation quality (Guo et al., 2013; Meyer & Neto, 2008). Qseg is defined as

Q_{seg} = \frac{\sum_{i=1}^{m}\sum_{j=1}^{n}\left(A_{ij} \wedge B_{ij}\right)}{\sum_{i=1}^{m}\sum_{j=1}^{n}\left(A_{ij} \vee B_{ij}\right)}

where A is the set of vegetation pixels (v = 255) or background pixels (v = 0) identified by a classification model, B is a reference set of manually segmented vegetation pixels (v = 255) or background pixels (v = 0), m and n are the image row and column sizes, and i, j are the pixel coordinate indices of the images. The more consistent the pixels between A and B, the larger the value, which ranges from 0 to 1; that is, the higher the value, the more accurate the segmentation (Figure 3c).
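Qseg can be computed directly from two binary masks; the intersection-over-union form below is one common formulation of the measure (we assume it matches the Guo et al. 2013 definition):

```python
import numpy as np

def qseg(pred, ref):
    """Qseg agreement between a predicted and a reference binary mask,
    written as the ratio of pixelwise AND to pixelwise OR of the
    vegetation pixels (assumed formulation). Returns a value in [0, 1]."""
    inter = np.logical_and(pred, ref).sum()
    union = np.logical_or(pred, ref).sum()
    return inter / union if union else 1.0

pred = np.array([[1, 1, 0, 0]], dtype=bool)   # classifier output
ref  = np.array([[1, 0, 1, 0]], dtype=bool)   # manual reference
print(qseg(pred, ref))    # 1 overlapping pixel / 3 in the union
```

Identical masks give Qseg = 1; disjoint masks give Qseg = 0.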

| Comparison between estimated and measured heights
There were significant correlations between plant height estimated from the UAV images and that measured by hand on all measurement dates (p < .001 for all dates, Figure 4). The correlations were relatively high (R 2 ≈ 0.85) except on 12 June (R 2 = 0.47).

| Phenotypic variation among source populations
The results of all statistical tests are shown in Tables S1 and S2.
We detected significant variations among the source populations in several traits; other traits did not differ among populations. The outline differed between populations 1 and 2 and between populations 1 and 3 (p = .014 and < .001, respectively) but did not differ between populations 2 and 3 (p = .084).

| DISCUSSION
We propose a new methodological framework for UAV-based IPP.
Overall, the framework will shed new light on, and improve research efficiency in, both basic and applied plant biology. Its most remarkable feature is that it can estimate several shape-related traits that are difficult to measure manually (e.g., the outline of aboveground parts and roundness). The ecological and evolutionary relevance of individual plant shape has received relatively little attention, probably owing to difficulties in noninvasive measurement. By using this framework, we might be able to examine which ecological and evolutionary factors influence aboveground plant shape under field conditions. The framework can be easily applied to phenotyping in common garden experiments, a classical approach to quantifying genetically based phenotypic differentiation among populations (Colautti et al., 2009). Because UAV-based IPP saves labor, use of the framework will improve research efficiency significantly. Recent studies have developed methods for automatically detecting crop heads/flowering in time-series RGB images (Desai et al., 2019; Ghosal et al., 2019; Guo et al., 2018). By combining these methods with UAV-based IPP, […]

[…] "Crop Production under Climatic Change" of the Japan Science and Technology Agency.

CONFLICT OF INTEREST
The authors declare no conflict of interest.

OPEN RESEARCH BADGES
This article has earned an Open Data Badge for making publicly available the digitally shareable data necessary to reproduce the reported results. The data are available at https://doi.org/10.5061/dryad.0cfxpnw0b