Open Access
8 February 2019
Detecting and mapping Gonipterus scutellatus induced vegetation defoliation using WorldView-2 pan-sharpened image texture combinations and an artificial neural network
Romano Lottering, Onisimo Mutanga, Kabir Peerbhay, Riyad Ismail
Funded by: University of KwaZulu-Natal, National Research Foundation of South Africa
Abstract
Defoliation induced by the weevil Gonipterus scutellatus is causing significant damage to South Africa’s eucalyptus plantations. Therefore, the ability of remote sensing to detect and map G. scutellatus defoliation is essential for monitoring the spread of the weevil so that precautionary measures are set in place. In our study, an integrated approach using image texture in various processing combinations and an artificial neural network (ANN) was developed to detect and map G. scutellatus induced vegetation defoliation. A 0.5-m WorldView-2 pan-sharpened image was used to compute texture parameters from the gray-level occurrence matrix and gray-level co-occurrence matrix, using optimal moving windows for specific levels of G. scutellatus induced vegetation defoliation. In order to improve the accuracy of detecting and mapping G. scutellatus induced vegetation defoliation, a method involving a three-band texture processing combination was proposed and tested. Using a sequential forward selection algorithm allowed for the selection of optimal texture combinations, which were subsequently input into a backpropagation ANN. The results showed an improvement in detecting vegetation defoliation using single texture bands [R2 = 0.82, root mean square error (RMSE) = 0.95 (1.82% of the mean measured defoliation)] when compared to single spectral reflectance bands [R2 = 0.60, RMSE = 1.79 (3.43% of the mean measured defoliation)], the two-band spectral reflectance combination model [R2 = 0.74, RMSE = 1.48 (2.83% of the mean measured defoliation)], and the three-band spectral reflectance combination model [R2 = 0.80, RMSE = 1.35 (2.59% of the mean measured defoliation)]. Further improvements were obtained using the two-band texture combination model [R2 = 0.85, RMSE = 1.05 (2.01% of the mean measured defoliation)], and the most promising result was obtained using the proposed three-band texture combination model [R2 = 0.90, RMSE = 0.85 (1.63% of the mean measured defoliation)]. Overall, our study highlights the potential of image texture combinations in improving the detection and mapping of vegetation defoliation.

1.

Introduction

In South Africa, eucalyptus plantations cover 700,000 ha of the country’s land base, and eucalypts are considered among the most productive commercially planted exotic species in the country.1 However, the sustainability of these prolific species is threatened by outbreaks of the weevil Gonipterus scutellatus Gyllenhal (Coleoptera: Curculionidae).2,3 The weevil is native to south-east Australia and has since spread across the globe, with first reports of establishment in South Africa in 1916.3–5 The weevil is a specialist on the genus Eucalyptus and is considered an important limiting factor, which may cause a significant loss of productivity in established eucalyptus plantations.4,6 The effects of repeated aggressive feeding by the weevil may result in a reduction in eucalyptus growth rates and subsequent tree mortality.7 The problem is further compounded because many eucalypts demonstrate little to no resistance to the weevil, which is a major concern for South Africa’s commercial forestry sector.3 Primary management strategies to control weevil population densities may include either the planting of nonsusceptible species of eucalyptus3,8 or the use of biological control.4,9 The latter was initiated in 1926 when the egg parasitoid wasp Anaphes nitens Girault (Hymenoptera: Mymaridae) was first introduced into South Africa.4,10 However, South Africa’s environmental conditions and the absence of environmental resistance provide favorable conditions for the weevil to proliferate.11,12

Therefore, measures need to be set in place for the early detection and monitoring of weevil-induced vegetation defoliation, to ensure that appropriate interventions are implemented before a point of nonrecovery is reached. Although conventional methods of monitoring vegetation defoliation are widely used, they may be expensive, subjective, time consuming, and spatially restrictive.13–16 In contrast, remote sensing offers a synoptic view of phenomena on the ground and therefore has the potential to adequately detect and map vegetation defoliation induced by insects and other defoliating agents.14,15,17 Monitoring and assessing forest health using remote sensing can be achieved by detecting either subtle spectral changes of foliage13 or foliage reduction.18,19 Previous studies have used vegetation indices to provide a quantitative framework for detecting and monitoring vegetation defoliation.18,20,21 For example, Lottering et al.19 used spatially optimized vegetation indices and successfully detected and mapped vegetation defoliation and leaf area index. However, since defoliation is expressed in tree crowns and canopies, some studies have argued that when using high spatial resolution data, image texture would be a more significant source of information.22,23

Image texture is used to determine the spatial arrangement of features within high spatial resolution imagery.24–26 It is a function of local variance and is thus scale dependent.22,27 This method captures the spatial information contained within a remotely sensed scene, making it possible to identify aspects of forest structure, which may include vegetation defoliation.28,29 Yuan et al.,30 for instance, used image texture to detect sugar maple decline and found that changes in tree crowns due to shoot dieback resulted in changes in image texture that could be quantified using linear models. Similarly, Moskal and Franklin22 used image texture analysis computed from high spatial resolution CASI image data and successfully detected the severity of aspen defoliation. The current study extends this previous work22,30 by exploring the capability of image texture combinations in improving the detection of pest-induced vegetation defoliation. Therefore, a three-band image texture processing combination approach was developed and tested in this study. The premise of this approach is twofold: (1) image texture has the capability of improving the detection of vegetation defoliation by simplifying the structure of the canopy and (2) band combinations have previously been shown to improve the detection of vegetation defoliation when compared to individual bands. This is achieved by reducing background effects, sun angle effects, sensor angle effects, and atmospheric effects.31,32 In essence, the three-band image texture processing combination proposed in this study combines these two techniques, thus aiming to improve the detection of vegetation defoliation. In addition, previous studies have focused on the relationship between image texture and defoliation using linear regression models, which assume that a linear relationship exists between phenomena.

However, this is not always the case, as phenomena may be more complex and follow a nonlinear relationship, which can be effectively investigated using multivariate regression techniques. For example, an artificial neural network (ANN) is a multivariate regression technique that does not make any assumptions about the data.33 This algorithm emulates the functionality of the biological nervous system, demonstrating predictive capabilities that are not found in traditional statistical techniques.24,33,34 Many studies support the utility of an ANN in understanding complex relationships, as it has the capacity to deal with non-normality, nonlinearity, and collinearity within a system.24,33,34 Hence, integrating an ANN with image texture combinations computed from a 0.5-m WorldView-2 pan-sharpened image would be effective in detecting G. scutellatus induced vegetation defoliation. The WorldView-2 pan-sharpened dataset was selected because of its high spatial resolution, thus providing enhanced image texture information for detecting vegetation defoliation.

Although image texture has shown potential in detecting vegetation defoliation in the past, surprisingly its capabilities have not been fully explored. Therefore, the aim of this study was to explore the ability of image texture combinations computed from a 0.5-m WorldView-2 pan-sharpened image for detecting and mapping G. scutellatus induced vegetation defoliation using an ANN. More specifically, we investigated the performance of (1) image texture in various processing combinations, (2) spectral reflectance in various processing combinations, and finally, (3) image texture combinations versus spectral reflectance combinations in detecting G. scutellatus induced vegetation defoliation.

2.

Methods

2.1.

Study Area

The study was conducted in the Sappi Hodgsons Estate (30°59’85”E; 29°19’03”S), near Greytown in KwaZulu-Natal, South Africa (Fig. 1). This area covers 6391 ha of land and falls under the midlands mistbelt grassland bioregion. The landscape is undulating, with an elevation ranging from 1030 to 1590 m above sea level. The area is characterized by summer rainfall ranging from 730 to 1280 mm/annum, with summer temperatures ranging from 24°C to the mid-30s. Winter months are dominated by misty conditions with temperatures ranging from 5°C to 14°C. Apedal and plinthic soil forms are predominantly found in this region and are derived primarily from the Ecca group. The three principal genera dominating this area are pine (Pinus patula, P. echinata), eucalyptus (Eucalyptus grandis, E. dunnii), and black wattle (Acacia mearnsii). G. scutellatus outbreaks commonly occur in this area during the South African spring and summer months.

Fig. 1

Sappi Hodgsons Estate in KwaZulu-Natal, South Africa. Compartments are displayed using the WorldView-2 red band.

JARS_13_1_014513_f001.png

2.2.

Imagery

A WorldView-2 pan-sharpened image was obtained on September 20, 2012 under cloudless conditions. Figure 2 displays the characteristics35 of each WorldView-2 pan-sharpened band used in this study.

Fig. 2

Characteristics of the WorldView-2 pan-sharpened spectral bands.

JARS_13_1_014513_f002.png

The image was atmospherically calibrated to top-of-atmosphere reflectance using the fast line-of-sight atmospheric analysis of spectral hypercubes (FLAASH) radiative transfer algorithm in ENVI 4.736 and was georectified and orthorectified using 25 well-distributed ground control points. Following georectification, a root mean square error (RMSE) of less than one pixel (0.5 m) was obtained. The image was then projected to the Universal Transverse Mercator projection and the World Geodetic System 84 datum.

2.3.

Field Data Collection

Field data were collected from September 24, 2012 to October 5, 2012, commencing 4 days after the acquisition of the WorldView-2 pan-sharpened image and coinciding with the peak of G. scutellatus induced vegetation defoliation. The Hawths tool in ArcGIS 10.3 was used to randomly generate 30×30 m plots over the study area using an existing WorldView-2 image. Prior field surveys conducted within the plantation showed that the 30×30 m plots were adequate for detecting levels of vegetation defoliation induced by the weevil. These plots were at least 15 m away from other features, such as roads, and were located in the field using a handheld Trimble Geo-Explorer with submeter accuracy. Once the area of interest was located, a 30×30 m plot was created in the field and the percentage level of defoliation was established per plot. All trees within the 30×30 m plot were inspected for G. scutellatus induced vegetation defoliation, and the assessments were validated by an entomologist from Sappi Forests. Defoliation levels were calculated as the percentage of defoliated trees relative to the total number of trees within the 30×30 m plot. These estimates were then grouped into four major defoliation classes regularly used in forest inventories: 25% and less defoliation (low), 26% to 50% defoliation (medium), 51% to 75% defoliation (high), and greater than 75% defoliation (severe). Training areas were created using these 30×30 m plots overlaid onto the texture and spectral images. This resulted in a total of 320 sample plots showing different levels of percentage defoliation. A summary of the field data collected is presented in Table 1.

Table 1

Summary of collected field data.

Class | No. of plots | Mean defoliation (%) | Standard deviation
1% to 25% | 65 | 15.58 | 6.73
26% to 50% | 88 | 36.81 | 6.95
51% to 75% | 89 | 60.55 | 6.68
>75% | 78 | 81.32 | 6.33
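As a minimal illustration of this classing (not the authors' code), plot-level defoliation percentages can be binned into the four classes as follows; the plot values are hypothetical.

```python
import numpy as np

# Hypothetical plot-level defoliation percentages (percentage of defoliated
# trees per 30 x 30 m plot), as described in the field survey.
plot_defoliation = np.array([12.0, 38.5, 61.0, 83.2, 47.9])

# Class boundaries used in the study: low (<=25%), medium (26%-50%),
# high (51%-75%), and severe (>75%).
upper_edges = [25, 50, 75, 100]
labels = np.array(["low", "medium", "high", "severe"])

# np.digitize returns, for each plot value, the index of the class whose
# upper edge is the first one greater than or equal to that value.
class_index = np.digitize(plot_defoliation, upper_edges, right=True)
plot_classes = labels[class_index]
print(dict(zip(plot_defoliation, plot_classes)))
```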

2.4.

Image Texture

Image texture is a function of local variance within an image and is dependent on the properties of a neighborhood of pixels.24,37 It is a useful approach for detecting vegetation defoliation, because vegetation structural changes induced by defoliation result in image texture variation.22 In addition, several studies have illustrated the superiority of image texture as a source of information, particularly with high spatial resolution imagery.24,26 Image texture measures are divided into two categories, namely the gray-level occurrence matrix (GLOM) and the gray-level co-occurrence matrix (GLCM). GLOM ignores the spatial relationship between pixels and is computed from the histogram of pixel intensities within a window.38 Figure 3 provides a brief description of the GLOM image texture parameters.

Fig. 3

First-order (GLOM) image texture parameters.38,39

JARS_13_1_014513_f003.png

On the other hand, GLCM determines the possibility of all pairwise combinations of gray-levels within a window.24,25 When determining the GLCM, a set of gray-level co-occurrence probabilities are stored, and statistics are then applied to the matrix to generate image texture parameters.40 Figure 4 provides a brief description of GLCM image texture parameters.

Fig. 4

Second-order (GLCM) image texture parameters.30,39,4143

JARS_13_1_014513_f004.png

In this study, both categories of image texture were computed from the 0.5-m WorldView-2 pan-sharpened image, using a co-occurrence shift of x=1, y=1 and θ=45 deg, to determine their potential in detecting weevil-induced vegetation defoliation. The angle has minimal influence on the coefficient of determination,41 and therefore using this angle exclusively was deemed adequate for calculating the image texture parameters. The window sizes used for each image texture parameter were based on a study conducted by Lottering and Mutanga.2 They used minimal variance to determine optimal spatial resolutions for detecting levels of G. scutellatus induced vegetation defoliation. The study concluded that the optimal spatial resolution for low and medium levels of defoliation was 2.5 m (5×5 window), whereas for high and severe levels of defoliation the optimal spatial resolutions were 3.5 m (7×7) and 4.5 m (9×9), respectively. Therefore, in this study, we computed image texture parameters using moving window sizes in accordance with the optimal spatial resolutions at which levels of defoliation were best represented. The mean texture values for all 320 sample plots were then extracted in ArcGIS 10.3. The image texture parameters were computed using ENVI 4.7 software.
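To make the texture computation concrete, the following is a minimal sketch using scikit-image rather than the ENVI software used in the study; the rescaling of the band to 8-bit gray levels, the random example window, and the function name glcm_texture are illustrative assumptions.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_texture(window, angle=np.pi / 4):
    """GLCM texture measures for one moving-window patch (shift x=1, y=1 at 45 deg)."""
    glcm = graycomatrix(window, distances=[1], angles=[angle],
                        levels=256, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]  # normalized co-occurrence probabilities
    return {
        "homogeneity": graycoprops(glcm, "homogeneity")[0, 0],
        "correlation": graycoprops(glcm, "correlation")[0, 0],
        "second_moment": graycoprops(glcm, "ASM")[0, 0],
        # Entropy is not provided by graycoprops, so it is computed directly
        # from the normalized co-occurrence probabilities.
        "entropy": -np.sum(p[p > 0] * np.log(p[p > 0])),
    }

# Hypothetical 9 x 9 window (the severe-defoliation window size) from a
# WorldView-2 band rescaled to 8-bit gray levels.
window = (np.random.rand(9, 9) * 255).astype(np.uint8)
print(glcm_texture(window))
```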

2.5.

Processing the WorldView-2 Pan-Sharpened Image for Modeling

2.5.1.

The WorldView-2 pan-sharpened image was processed in two steps

  • Step 1: Single spectral bands, two-band spectral combination, and three-band spectral combination

    This step involves testing the spectral reflectance of single bands and band combinations of the WorldView-2 pan-sharpened image in detecting weevil induced vegetation defoliation. Using the optimal spatial resolutions established by Lottering and Mutanga,2 the 0.5-m WorldView-2 pan-sharpened image was resampled accordingly. These spectral processing combinations were then formulated in the following manner:

    • 1) The spectral reflectance of the 8 WorldView-2 pan-sharpened bands were used in an ANN.

    • 2) All possible combinations of any 2 spectral reflectance bands were used in an ANN. These combinations were derived using Eq. (1):

      Eq. (1)

      \frac{B_1 - B_2}{B_1 + B_2},
      where B1 and B2 are spectral reflectance bands or image texture parameters.

    • 3) All possible combinations of any three spectral reflectance bands were used in an ANN. These combinations were derived using Eq. (2):

      Eq. (2)

      \frac{B_1 - B_2}{B_1 + B_3},
      where B1, B2, and B3 are spectral reflectance bands or image texture parameters.

  • Step 2: Single image texture bands, two-band image texture combination, and three-band image texture combination

    Thirteen image texture parameters were computed from each of the eight WorldView-2 pan-sharpened bands. Window sizes were selected based on the optimal spatial resolution established by Lottering and Mutanga.2 These image texture parameters were then formulated in the following manner:

    • 1) Single image texture bands were computed from each of the eight WorldView-2 pan-sharpened bands and were used in an ANN.

    • 2) All possible combinations of any two image texture parameters were used in an ANN. These combinations were derived using Eq. (1).

    • 3) All possible combinations of any three image texture parameters were used in an ANN. These combinations were derived using Eq. (2); a sketch of how these two- and three-band combinations can be enumerated is given below.
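As a minimal sketch (not the authors' implementation) of how the combinations in Eqs. (1) and (2) can be enumerated, the per-plot band values below are hypothetical placeholders for the resampled spectral reflectance or texture layers.

```python
import itertools
import numpy as np

def two_band(b1, b2):
    # Eq. (1): normalized two-band combination.
    return (b1 - b2) / (b1 + b2)

def three_band(b1, b2, b3):
    # Eq. (2): three-band combination.
    return (b1 - b2) / (b1 + b3)

# Hypothetical per-plot mean values for the eight WorldView-2 bands
# (rows correspond to the 320 sample plots).
bands = {f"B{i}": np.random.rand(320) for i in range(1, 9)}

# Candidate two- and three-band combinations of distinct bands, to be
# screened by sequential forward selection and then fed to the ANN.
pairs = {f"({a}-{b})/({a}+{b})": two_band(bands[a], bands[b])
         for a, b in itertools.combinations(bands, 2)}
triples = {f"({a}-{b})/({a}+{c})": three_band(bands[a], bands[b], bands[c])
           for a, b, c in itertools.permutations(bands, 3)}
print(len(pairs), "two-band and", len(triples), "three-band combinations")
```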

2.6.

Artificial Neural Network Analysis

This study used an ANN to determine the relationship between spectral reflectance and image texture parameters with G. scutellatus induced vegetation defoliation. Figure 5 illustrates the structure of an ANN.

Fig. 5

Representation of a feedforward neural network, where Oi represents the input layer comprising the spectral reflectance or texture parameters, Oj represents the hidden layer to which Oi and Ok are connected, Ok represents the output layer, and Wji and Wkj represent the weight values.34

JARS_13_1_014513_f005.png

Each model was trained using a backpropagation algorithm with one hidden layer.24,34 This supervised approach makes use of the gradient descent technique, which aims to decrease the prediction error. The algorithm functions in a forward and a backward phase. The forward phase introduces the spectral or image texture parameters to the network. The model is subsequently run with initial random weights that generate an output for each of the inputs.44 At each hidden node, the products of the inputs and initial weights are summed to produce a value Zj for the jth hidden node.45 Equation (3) illustrates this as follows:

Eq. (3)

Z_j = \sum_i W_{ji} \times O_i.

For a neural network with three layers, namely i, j, and k, with k being the output layer, the output value Sk can be calculated as follows:

Eq. (4)

S_k = \sum_j W_{kj} Z_j,
where Zj is the value from hidden node j and Wkj is the weight connecting hidden node j to output node k.

Nonlinearity is added to the network when Zj passes through a sigmoidal activation function.45 The output is defined as follows:

Eq. (5)

O_j = \frac{1}{1 + e^{-(Z_j + \theta)/\theta_0}},
where θ is a threshold and θ0 is a constant.

Once the output values for each node are calculated, the forward phase stops. This is followed by the backward phase, in which the RMSE between predicted and observed values is propagated back through the network to decrease the overall error. This phase recurs until the error reaches a minimum.
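As a concrete illustration, the following is a minimal sketch of the forward phase in Eqs. (3)-(5) with hypothetical weights; passing the sigmoid outputs of the hidden layer into Eq. (4) follows the usual feedforward formulation and is an assumption about how the equations connect.

```python
import numpy as np

def forward_pass(inputs, w_hidden, w_output, theta=0.0, theta0=1.0):
    """Forward phase of the feedforward network described by Eqs. (3)-(5).

    inputs   : O_i, the spectral reflectance or texture input values
    w_hidden : W_ji, weights from input node i to hidden node j
    w_output : W_kj, weights from hidden node j to output node k
    """
    z_hidden = w_hidden @ inputs                                   # Eq. (3)
    o_hidden = 1.0 / (1.0 + np.exp(-(z_hidden + theta) / theta0))  # Eq. (5)
    return w_output @ o_hidden                                     # Eq. (4)

# Hypothetical example: 4 inputs, 2 hidden nodes, 1 output (predicted defoliation).
rng = np.random.default_rng(0)
inputs = rng.random(4)
w_hidden = rng.normal(size=(2, 4))
w_output = rng.normal(size=(1, 2))
print(forward_pass(inputs, w_hidden, w_output))
```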

In the current study, the dataset (320 plots) was randomly divided into test (96 plots) and training (224 plots) data, using a repeated hold-out sample with 100 iterations. The optimal number of nodes was established by manually changing the number of nodes in the hidden layer. Training epochs were set at 5000 to avoid overtraining the model.24 The learning rate and momentum of the model were set at 0.01 and 0.30, respectively.
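A minimal sketch of this training setup, using scikit-learn's MLPRegressor as a stand-in for the ANN software used by the authors; the feature matrix and defoliation values below are hypothetical placeholders for the selected texture combinations and field measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Hypothetical inputs: 320 plots x 4 selected texture combinations, and the
# measured percentage defoliation per plot.
X = np.random.rand(320, 4)
y = np.random.rand(320) * 100

scores = []
for i in range(100):  # repeated hold-out sample with 100 iterations
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, train_size=224, test_size=96, random_state=i)
    ann = MLPRegressor(hidden_layer_sizes=(2,),   # two hidden nodes (three-band texture model)
                       activation="logistic",     # sigmoidal activation, as in Eq. (5)
                       solver="sgd",              # gradient-descent backpropagation
                       learning_rate_init=0.01,   # learning rate reported in the study
                       momentum=0.3,              # momentum reported in the study
                       max_iter=5000,             # 5000 training epochs
                       random_state=i)
    ann.fit(X_train, y_train)
    scores.append(ann.score(X_test, y_test))      # R^2 on the independent test plots

print("Mean test R^2 over 100 hold-outs:", np.mean(scores))
```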

2.7.

Accuracy Assessment

The performance of the spectral and image texture models was assessed on an independent test dataset (96 plots). Each model’s performance was evaluated using the coefficient of determination (R2), RMSE, and %RMSE between observed and predicted levels of G. scutellatus induced vegetation defoliation. Models with the highest R2 and the lowest RMSE and %RMSE were retained for detecting vegetation defoliation. Figure 6 shows a flowchart of the methodology undertaken to achieve the objectives of this study.
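A minimal sketch of these three accuracy measures, assuming arrays of observed and predicted plot-level defoliation; the example values are hypothetical.

```python
import numpy as np

def accuracy_measures(observed, predicted):
    """R^2, RMSE, and RMSE as a percentage of the mean measured defoliation."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    residuals = observed - predicted
    r2 = 1.0 - np.sum(residuals ** 2) / np.sum((observed - observed.mean()) ** 2)
    rmse = np.sqrt(np.mean(residuals ** 2))
    pct_rmse = 100.0 * rmse / observed.mean()
    return r2, rmse, pct_rmse

# Hypothetical observed and predicted defoliation (%) for the 96 test plots.
obs = np.random.rand(96) * 100
pred = obs + np.random.normal(scale=5.0, size=96)
print(accuracy_measures(obs, pred))
```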

Fig. 6

Flowchart showing the research methodology.

JARS_13_1_014513_f006.png

3.

Results

Figure 7 illustrates the distribution of percentage defoliation over the study area. Of the total of 320 plots, 20.31% were scored low (1% to 25%), 27.50% were scored medium (26% to 50%), 27.81% were scored high (51% to 75%), and 24.38% were scored severe (>75%) for visual estimates of percentage defoliation.

Fig. 7

Histogram illustrating defoliation levels within the study area.

JARS_13_1_014513_f007.png

3.1.

Relationship Between Spectral Reflectance and Image Texture Parameters with Weevil Induced Vegetation Defoliation

The relationship between each parameter and defoliation was determined using a Pearson’s correlation test. Image texture and spectral parameters used in model development subsequently underwent a sequential forward selection. This allowed for the selection of the best spectral reflectance and image texture parameters for the development of models to detect weevil-induced vegetation defoliation.
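A minimal greedy sketch of sequential forward selection; scoring candidate subsets with the cross-validated R2 of a small MLP is a reasonable stand-in for the selection criterion and is not necessarily the authors' exact implementation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

def sequential_forward_selection(X, y, n_select=4):
    """Greedily add the variable that most improves cross-validated R^2."""
    remaining = list(range(X.shape[1]))
    selected = []
    while remaining and len(selected) < n_select:
        best_score, best_var = -np.inf, None
        for var in remaining:
            cols = selected + [var]
            model = MLPRegressor(hidden_layer_sizes=(3,), solver="sgd",
                                 learning_rate_init=0.01, momentum=0.3,
                                 max_iter=5000, random_state=0)
            score = cross_val_score(model, X[:, cols], y, cv=5).mean()
            if score > best_score:
                best_score, best_var = score, var
        selected.append(best_var)
        remaining.remove(best_var)
    return selected
```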

3.2.

Selected Spectral Reflectance and Image Texture Parameters for the Artificial Neural Network

Since the three-band image texture combination and three-band spectral reflectance combination models showed the highest correlation with vegetation defoliation in their respective groups, we chose to illustrate these two models in Table 2. Table 2 also shows the input variables selected by the sequential forward selection algorithm, which were subsequently used in the ANN for model development. In addition, the Pearson’s correlation test revealed that all selected parameters from each model were highly correlated with weevil-induced vegetation defoliation; however, higher correlations were established using the three-band image texture combination model. It can also be noted that the image texture parameters selected for the development of the three-band image texture combination model were predominantly gray-level co-occurrence texture parameters computed primarily from the red, red edge, near-infrared 1, and near-infrared 2 bands.

Table 2

Significant (p<0.01) combinations of spectral and image texture variables for detecting defoliation.

Variable: Defoliation
Three-band spectral reflectance combination | r | Three-band texture combination | r
[(RED) - (NIR1)] / [(RED) + (REDEDGE)] | 0.67 | [(B8,E,9,C) - (B5,H,5,C)] / [(B8,E,9,C) + (B6,Cr,5,C)] | 0.70
[(NIR2) - (RED)] / [(NIR2) + (REDEDGE)] | 0.69 | [(B6,H,7,C) - (B7,S,5,C)] / [(B6,H,7,C) + (B8,S,5,C)] | 0.74
[(NIR1) - (REDEDGE)] / [(NIR1) + (RED)] | 0.71 | [(B7,H,5,C) - (B8,Cr,9,C)] / [(B7,H,5,C) + (B5,H,7,C)] | 0.73
[(REDEDGE) - (RED)] / [(REDEDGE) + (NIR2)] | 0.72 | [(B5,H,5,C) - (B8,E,9,C)] / [(B5,H,5,C) + (B7,Cr,9,C)] | 0.74
Note: texture parameters are coded as (band, texture measure, window size, co-occurrence); see the note to Table 8.

3.3.

Artificial Neural Networks

Using an ANN, we detected G. scutellatus induced vegetation defoliation using spectral and image texture parameters computed from the 0.5-m WorldView-2 pan-sharpened image. The optimal network architecture for each model was established by changing the number of nodes in the hidden layer. The parameters used to train each model are shown in Tables 3 and 4 for the spectral and image texture models, respectively.

Table 3

Best trained ANN parameters for detecting levels of defoliation using spectral reflectance data.

Model | Input nodes | Hidden nodes | Profile
Single spectral bands | 5 | 4 | MLP 5:5-4-1:1
Two-band spectral combination | 5 | 4 | MLP 5:5-4-1:1
Three-band spectral combination | 4 | 3 | MLP 4:4-3-1:1

Table 4

Best trained ANN parameters for detecting levels of defoliation using image texture.

Model | Input nodes | Hidden nodes | Profile
Single texture bands | 5 | 3 | MLP 5:5-3-1:1
Two-band texture combination | 4 | 3 | MLP 4:4-3-1:1
Three-band texture combination | 4 | 2 | MLP 4:4-2-1:1

3.4.

Training, Testing, and Applying the Artificial Neural Network to Detect Levels of Vegetation Defoliation

Each of the models was trained using the parameters displayed in Tables 3 and 4. Using random initial weights, each model was run a maximum of five times.24,44 The weight configurations that yielded the highest R2 and the lowest RMSE and %RMSE between predicted and measured defoliation, based on an independent test dataset, were retained for detecting vegetation defoliation.

3.5.

Performance of Spectral and Image Texture ANN Models

Table 5 shows that the three-band spectral reflectance combination model [R2=0.80 and RMSE = 1.35 (2.59% of the mean measured defoliation)] outperformed both the two-band spectral reflectance combination [R2=0.74 and RMSE = 1.48 (2.83% of the mean measured defoliation)] and single spectral reflectance band [R2=0.60 and RMSE = 1.79 (3.43% of the mean measured defoliation)] models, yielding a higher R2 and lower RMSE and %RMSE values based on an independent test dataset.

Table 5

Comparison between spectral reflectance predictive models (n=320).

Variable: Defoliation
Data | Single spectral bands (R2; RMSE) | Two-band spectral model (R2; RMSE) | Three-band spectral model (R2; RMSE)
Test | 0.60; 1.79 (3.43%) | 0.74; 1.48 (2.83%) | 0.80; 1.35 (2.59%)
Train | 0.58; 1.32 (2.52%) | 0.72; 0.93 (1.78%) | 0.79; 0.89 (1.71%)
Note: R2: All significant at p<0.01

While the ANN models using the single image texture bands [R2=0.82 and RMSE = 0.95 (1.82% of the mean measured defoliation)] and two-band image texture combinations [R2=0.85 and RMSE = 1.05 (2.01% of the mean measured defoliation)] yielded high R2 values in detecting vegetation defoliation, Table 6 shows that the ANN model using the three-band image texture combination [R2=0.90 and RMSE = 0.85 (1.63% of the mean measured defoliation)] yielded the highest R2 and the lowest RMSE and %RMSE values based on an independent test dataset.

Table 6

Comparison between image texture predictive models (n=320).

Variable: Defoliation
Data | Single texture parameters (R2; RMSE) | Two-band texture model (R2; RMSE) | Three-band texture model (R2; RMSE)
Test | 0.82; 0.95 (1.82%) | 0.85; 1.05 (2.01%) | 0.90; 0.85 (1.63%)
Train | 0.81; 0.83 (1.59%) | 0.86; 0.70 (1.34%) | 0.88; 0.61 (1.17%)
Note: R2: All significant at p<0.01

3.6.

Testing and Applying the ANN to Detect Weevil Defoliation Using Local Binary Patterns

Local binary patterns (LBPs) are an alternative texture analysis technique proposed by Ojala et al.46 LBPs were computed from the WorldView-2 pan-sharpened image using the same moving window sizes used to compute the image texture and combination models. The resulting LBPs were likewise used to develop models with a backpropagation ANN to detect levels of vegetation defoliation. The model that yielded the highest R2 and the lowest RMSE and %RMSE was retained and used for detecting vegetation defoliation induced by the weevil. Table 7 shows the results obtained from the LBP model.
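A minimal sketch of computing LBPs from a single band with scikit-image; the neighborhood settings (8 neighbors at radius 1, uniform patterns) and the histogram summary are illustrative assumptions rather than the exact configuration used here.

```python
import numpy as np
from skimage.feature import local_binary_pattern

# Hypothetical 8-bit WorldView-2 band (e.g., near-infrared 1) as a 2-D array.
band = (np.random.rand(512, 512) * 255).astype(np.uint8)

# LBP with 8 neighbors at radius 1; "uniform" patterns are a common choice.
lbp = local_binary_pattern(band, P=8, R=1, method="uniform")

# Summarize LBP codes within a 9 x 9 plot window (severe-defoliation scale)
# as a normalized histogram, which can then be fed to the ANN.
window = lbp[:9, :9]
hist, _ = np.histogram(window, bins=np.arange(0, 11), density=True)
print(hist)
```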

Table 7

Results obtained using the LBP model (n=320).

Variable: Defoliation
Data | LBP model (R2; RMSE)
Test | 0.78; 1.11 (2.13%)
Train | 0.77; 1.08 (2.07%)
Note: R2: All significant at p<0.01

3.7.

Comparing the Performance of Image Texture Models and the LBP Model

Tables 6 and 7 show the results of the image texture models and LBP model, respectively. Based on independent test datasets, all image texture models yielded higher R2 values for detecting defoliation induced by the weevil when compared to the LBP model. In addition, all image texture models also yielded lower RMSE and %RMSE results when compared to the LBP model, based on independent test datasets.

3.8.

Mapping the Distribution of G. scutellatus Vegetation Defoliation Using an ANN

Although all three image texture models yielded high coefficients of determination, a predictive map showing the distribution of defoliation was developed using the best trained three-band image texture combination model, owing to its superior performance in detecting G. scutellatus induced vegetation defoliation over all tested models. The predictive map was developed using R statistical software. Figure 8 shows the distribution of levels of G. scutellatus induced vegetation defoliation over the study area.
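The map itself was produced in R; the following is a minimal Python analogue of applying a trained model pixel by pixel, assuming the selected texture combinations have been written to a hypothetical multiband GeoTIFF named texture_combinations.tif and using a stand-in model fitted on random data.

```python
import numpy as np
import rasterio
from sklearn.neural_network import MLPRegressor

# Stand-in for the trained three-band texture combination model; in practice
# this would be the ANN trained on the 224 training plots (see the earlier sketch).
ann = MLPRegressor(hidden_layer_sizes=(2,), max_iter=2000, random_state=0)
ann.fit(np.random.rand(224, 4), np.random.rand(224) * 100)

# Hypothetical multiband raster holding the four selected texture combinations,
# one combination per band, aligned to the study area.
with rasterio.open("texture_combinations.tif") as src:
    stack = src.read()                       # shape: (n_combinations, rows, cols)
    profile = src.profile

n_layers, rows, cols = stack.shape
features = stack.reshape(n_layers, -1).T     # one row of predictors per pixel
predicted = ann.predict(features).reshape(rows, cols).astype("float32")

profile.update(count=1, dtype="float32")
with rasterio.open("predicted_defoliation.tif", "w", **profile) as dst:
    dst.write(predicted, 1)
```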

Fig. 8

Predictive map showing the distribution of G. scutellatus induced vegetation defoliation over the entire study area.

JARS_13_1_014513_f008.png

3.9.

Sensitivity Analysis

Since the three-band image texture combination model outperformed all the other models in detecting vegetation defoliation, we tested the importance of the variables selected for this model. A sensitivity analysis was run to determine which variables played the most prominent role in developing the three-band image texture combination model. Input variables with higher ratios are more important to model development than variables with lower ratios. Table 8 shows the sensitivity analysis for the three-band image texture combination model.

Table 8

Sensitivity analysis of the three-band image texture combination model used in the ANN.

Variable: Defoliation
Rank | Image texture combination | Ratio
1 | [(B6,H,7,C) - (B7,S,5,C)] / [(B6,H,7,C) + (B8,S,5,C)] | 1.94
2 | [(B7,H,5,C) - (B8,Cr,9,C)] / [(B7,H,5,C) + (B5,H,7,C)] | 1.25
3 | [(B5,H,5,C) - (B8,E,9,C)] / [(B5,H,5,C) + (B7,Cr,9,C)] | 1.09
4 | [(B8,E,9,C) - (B5,H,5,C)] / [(B8,E,9,C) + (B6,Cr,5,C)] | 1.05
Note: B5, B6, B7, B8: red, red edge, near-IR1, and near-IR2 bands; H: homogeneity; E: entropy; Cr: correlation; S: second moment; C: co-occurrence; 5, 7, 9: 5×5, 7×7, and 9×9 windows.

Values in the ratio column are calculated for each image texture combination as if the variable were absent during network execution: the error obtained in the absence of the variable is divided by the error obtained when the variable is present. A variable with a higher ratio would therefore cause a greater reduction in network performance if removed from the network.
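A minimal sketch of this ratio; removing a variable is approximated here by replacing it with its mean, which is one common stand-in for withholding its information during network execution.

```python
import numpy as np

def sensitivity_ratios(model, X, y):
    """Ratio of prediction error without each variable to the error with it."""
    baseline_rmse = np.sqrt(np.mean((y - model.predict(X)) ** 2))
    ratios = []
    for j in range(X.shape[1]):
        X_removed = X.copy()
        # Replace variable j with its mean to approximate its absence.
        X_removed[:, j] = X[:, j].mean()
        rmse_without = np.sqrt(np.mean((y - model.predict(X_removed)) ** 2))
        ratios.append(rmse_without / baseline_rmse)
    return ratios
```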

4.

Discussion

4.1.

Relationship Between Image Texture Combinations and Vegetation Defoliation

Although several studies have reported on the ability of image texture to detect vegetation defoliation,22,30 the current study was the first to report on the ability of image texture combinations to do so. This study introduced a three-band image texture combination approach in an attempt to improve the detection of vegetation defoliation. Overall, the results of this study showed that image texture combinations have the capacity to adequately detect and map vegetation defoliation, outperforming traditional spectral reflectance data. Among the seven models tested, the three-band image texture combination showed the highest overall accuracy [R2=0.90 and RMSE = 0.85 (1.63% of the mean measured defoliation)] in detecting vegetation defoliation. Conversely, using single spectral reflectance bands produced the lowest overall accuracy [R2=0.60 and RMSE = 1.79 (3.43% of the mean measured defoliation)]. This was followed by the two-band spectral reflectance combination model, which produced a somewhat better result [R2=0.74 and RMSE = 1.48 (2.83% of the mean measured defoliation)]. However, the three-band spectral reflectance combination produced the highest overall accuracy of the three spectral reflectance models [R2=0.80 and RMSE = 1.35 (2.59% of the mean measured defoliation)].

The detection of G. scutellatus induced vegetation defoliation improved considerably using image texture. This result is consistent with the findings of many studies that reported improved performance of image texture over spectral reflectance data.22,24,37 For example, Nichol and Sarker37 found that image texture outperformed spectral reflectance data in estimating vegetation biomass. In the current study, this improvement was noted in all three image texture models. The results obtained from each image texture model were very promising, with the single image texture parameter model having an R2 of 0.82 and an RMSE of 0.95 (1.82% of the mean measured defoliation), which improved further using the two-band image texture combination model [R2=0.85 and RMSE = 1.05 (2.01% of the mean measured defoliation)]. However, the highest agreement was achieved using the three-band image texture combination model [R2=0.90 and RMSE = 0.85 (1.63% of the mean measured defoliation)], a result that has not been previously reported in the literature for detecting levels of pest-induced vegetation defoliation. This could be due to the high spatial resolution of the WorldView-2 pan-sharpened image. For instance, several studies have emphasized the dependence of image texture on the spatial resolution of remotely sensed data.22,24,26,27 In addition, the improved accuracy of the three-band image texture combination model could also be attributed to the fact that the moving windows used to compute each image texture parameter matched the optimal spatial resolution at which each level of weevil-induced vegetation defoliation was best represented. Lottering and Mutanga,2 for example, found that optimizing the spatial resolution of remotely sensed data improved the detection of vegetation defoliation. This result was subsequently compared to the result obtained from the LBP model, which did not perform as well as the image texture and texture combination models in detecting vegetation defoliation induced by the weevil. Similarly, Kaya et al.47 also found that gray-level co-occurrence image texture outperformed LBPs in evaluating image texture features.

Moreover, it was also noted that the image texture parameters selected for model development were predominantly co-occurrence texture parameters rather than occurrence texture parameters. According to Rao et al.,48 co-occurrence texture parameters are the most applicable for remote sensing of vegetation. This was echoed by previous studies that focused on forest structure and image texture.24,30,49 For example, Yuan et al.30 found that co-occurrence texture parameters were better at detecting sugar maple decline than occurrence texture parameters. In another study, Moskal and Franklin22 successfully detected aspen defoliation using homogeneity, which is a co-occurrence texture parameter. The sensitivity analysis in our study showed that homogeneity appeared in most processing combinations, thus reaffirming the significance of co-occurrence texture parameters in detecting vegetation defoliation. Furthermore, image texture parameters computed from the red, red edge, near-infrared 1, and near-infrared 2 bands of the WorldView-2 pan-sharpened image were predominantly selected for developing the three-band image texture combination model. The structural relevance of the red, red edge, and NIR spectral ranges also confirms the results of other studies focusing on forest structure, for example, Xu et al.,50 Ingram et al.,33 Cho et al.,51 and Gebreslasie et al.52 All three moving windows seemed to play an equal role in the development of the three-band image texture combination model, which was probably because the moving window sizes corresponded with the optimal spatial resolutions best representing specific levels of weevil-induced vegetation defoliation.

4.2.

Spatial Distribution of Defoliation Levels Over the Study Area

The results have shown that the three-band image texture combination model can effectively detect and map levels of vegetation defoliation induced by the weevil G. scutellatus. The predictive map shows that all compartments exhibited some degree of defoliation. However, high and severe levels of defoliation were more prominent toward the southern parts of the map. A likely explanation for this is the host preference shown by the weevil. Several studies have highlighted the selective nature of G. scutellatus, whereby certain eucalyptus species are more preferred than others.53–56 For example, Fuentes et al.56 found that Eucalyptus camaldulensis was more susceptible to G. scutellatus defoliation than Eucalyptus robusta or Eucalyptus globulus. In our case, the eucalyptus species found toward the southern parts of the predictive map was Eucalyptus grandis, whereas toward the northern parts it was Eucalyptus dunnii. Therefore, the more preferred Eucalyptus grandis may have been consumed first, followed by the less preferred Eucalyptus dunnii.

To summarize, all image texture models produced very promising results; however, the three-band image texture combination model was superior. This was primarily due to the high spatial resolution of the WorldView-2 pan-sharpened image and the three-band image texture processing combination technique, which combines the advantages of both image texture and band combinations. The methodology used in this study could be applied to other areas, provided that the remotely sensed image is optimized to the specific spatial resolutions at which levels of defoliation are best represented. Furthermore, it is also essential to use high spatial resolution data for improved detection outcomes, as image texture is scale dependent.22,24,27

5.

Conclusion

Although the potential of image texture has been previously demonstrated, no study has investigated the full potential of image texture in detecting and mapping G. scutellatus induced vegetation defoliation. This study has therefore shown an integrated approach in detecting weevil-induced vegetation defoliation using image texture combinations and an ANN. This study has revealed that:

  • the three-band image texture combination model showed the highest overall accuracy in detecting vegetation defoliation, when compared to the two-band image texture combination model and the single-band image texture model,

  • the three-band spectral reflectance combination model showed the highest overall accuracy in detecting vegetation defoliation, when compared to the two-band spectral reflectance combination model and the single-band spectral reflectance model,

  • and finally, all image texture models outperformed all spectral reflectance models in detecting G. scutellatus induced vegetation defoliation.

Overall, this study was the first attempt to detect vegetation defoliation using image texture combinations and an ANN. Detecting vegetation defoliation with high spatial resolution imagery and image texture combinations can be a reliable and practical alternative to conventional field surveys. The three-band image texture combination proposed in this study is an objective, quantitative, and cost-effective means of detecting vegetation defoliation. The result is important in improving the overall accuracy of detecting and mapping pest-induced vegetation defoliation.

Acknowledgments

The authors of this paper would like to acknowledge the support of the University of KwaZulu-Natal, Applied Centre for Climate and Earth System Science (ACCESS), Sappi Forests, and the National Research Foundation of South Africa (Grant No. 114898), which allowed for the successful completion of this research.

References

1. 

S. Gebeyehu, B. P. Hurley and M. J. Wingfield, “A new lepidopteran insect pest discovered on commercially grown Eucalyptus nitens in South Africa,” S. Afr. J. Sci., 101 (1/2), 26 –28 (2005). Google Scholar

2. 

R. Lottering and O. Mutanga, “Optimising the spatial resolution of WorldView-2 pan-sharpened imagery for predicting levels of Gonipterus scutellatus defoliation in KwaZulu-Natal, South Africa,” ISPRS J. Photogramm. Remote Sens., 112 13 –22 (2016). https://doi.org/10.1016/j.isprsjprs.2015.11.010 IRSEE9 0924-2716 Google Scholar

3. 

S. W. Newete, R. G. Oberprieler and M. J. Byrne, “The host range of the Eucalyptus weevil, Gonipterus “scutellatus” Gyllenhal (Coleoptera: Curculionidae), in South Africa,” Ann. For. Sci., 68 (5), 1005 –1013 (2011). https://doi.org/10.1007/s13595-011-0108-9 NSFAAO Google Scholar

4. 

F. G. C. Tooke, “The eucalyptus snout-beetle, Gonipterus scutellatus Gyll: a study of its ecology and control by biological means,” Ent. Mem. Dept. Agric. U. S. Afr., 3 1 –282 (1955). Google Scholar

5. 

C. W. Mally, “The eucalyptus snout beetle (Gonipterus scutellatus Gyll.),” J. Dept. Agric. S. Afr., 9 415 –442 (1924). Google Scholar

6. 

A. D. Loch and M. Matsuki, “Effects of defoliation by Eucalyptus weevil, Gonipterus scutellatus, and chrysomelid beetles on growth of Eucalyptus globulus in southwestern Australia,” For. Ecol. Manage., 260 (8), 1324 –1332 (2010). https://doi.org/10.1016/j.foreco.2010.07.025 FECMDW 0378-1127 Google Scholar

7. 

A. Cheraghian, “Eucalyptus weevil Gonipterus scutellatus Gyllenhal 1833 Coleoptera: Curculionidae,” 1 –234 (2013). Google Scholar

8. 

K. F. Richardson and R. H. Meakins, “Inter- and intra- specific variation in the susceptibility of eucalypts to the snout beetle Gonipterus scutellatus Gyll. (Coleoptera: Curculionidae),” S. Afr. For. J., 139 (1), 21 –31 (1986). https://doi.org/10.1080/00382167.1986.9630053 Google Scholar

9. 

A. R. Reis et al., “Efficiency of biological control of Gonipterus platensis (Coleoptera: Curculionidae) by Anaphes nitens (Hymenoptera: Mymaridae) in cold areas of the Iberian Peninsula: implications for defoliation and wood production in Eucalyptus globules,” For. Ecol. Manage., 270 (0), 216 –222 (2012). https://doi.org/10.1016/j.foreco.2012.01.038 FECMDW 0378-1127 Google Scholar

10. 

G. D. Tribe, “The present status of Anaphes nitens (Hymenoptera: Mymaridae), an egg parasitoid of the Eucalyptus snout beetle Gonipterus scutellatus, in the Western Cape Province of South Africa,” Southern Afr. For. J., 203 (1), 49 –54 (2005). https://doi.org/10.2989/10295920509505218 Google Scholar

11. 

A. D. Loch and R. B. Floyd, “Insect pests of Tasmanian blue gum, Eucalyptus globulus globulus, in south-western Australia: history, current perspectives and future prospects,” Austral Ecol., 26 (5), 458 –466 (2001). https://doi.org/10.1046/j.1442-9993.2001.01145.x Google Scholar

12. 

J. T. Huber and G. L. Prinsloo, “Redescription of Anaphes nitens (Girault) and description of two new species of Anaphes Haliday (Hymenoptera: Mymaridae), parasites of Gonipterus scutellatus Gyllenhal (Coleoptera: Curculionidae) in Tasmania,” Aust. J. Entomol., 29 (4), 333 –341 (1990). https://doi.org/10.1111/aen.1990.29.issue-4 Google Scholar

13. 

R. Ismail, O. Mutanga, A. Ahmed, “Discriminating Sirex noctilio attack in pine forest plantations in South Africa using high spectral resolution data,” Hyperspectral Remote Sensing of Tropical and Sub-Tropical Forests, 161 –175 CRC Press, Florida (2008). Google Scholar

14. 

Z. Oumar, O. Mutanga and R. Ismail, “Predicting Thaumastocoris peregrinus damage using narrow band normalized indices and hyperspectral indices using field spectra resampled to the hyperion sensor,” Int. J. Appl. Earth Obs. Geoinf., 21 (0), 113 –121 (2013). https://doi.org/10.1016/j.jag.2012.08.006 Google Scholar

15. 

K. M. de Beurs and P. A. Townsend, “Estimating the effect of gypsy moth defoliation using MODIS,” Remote Sens. Environ., 112 (10), 3983 –3990 (2008). https://doi.org/10.1016/j.rse.2008.07.008 Google Scholar

16. 

L. Eklundh, T. Johansson and S. Solberg, “Mapping insect defoliation in scots pine with MODIS time-series data,” Remote Sens. Environ., 113 (7), 1566 –1573 (2009). https://doi.org/10.1016/j.rse.2009.03.008 Google Scholar

17. 

R. Ismail, O. Mutanga and U. Bob, “Forest health and vitality: the detection and monitoring of Pinus patula trees infected by Sirex noctilio using digital multispectral imagery,” S. Hemisphere For. J., 69 (1), 39 –47 (2007). https://doi.org/10.2989/SHFJ.2007.69.1.5.167 Google Scholar

18. 

Z. Oumar and O. Mutanga, “Using WorldView-2 bands and indices to predict bronze bug (Thaumastocoris peregrinus) damage in plantation forests,” Int. J. Remote Sens., 34 (6), 2236 –2249 (2012). https://doi.org/10.1080/01431161.2012.743694 IJSEDK 0143-1161 Google Scholar

19. 

R. Lottering, O. Mutanga and K. Peerbhay, “Detecting and mapping levels of Gonipterus scutellatus-induced vegetation defoliation and leaf area index using spatially optimized vegetation indices,” Geocarto Int., 33 (3), 277 –292 (2018). https://doi.org/10.1080/10106049.2016.1250823 Google Scholar

20. 

S. E. Franklin, H. Fan and X. Guo, “Relationship between Landsat TM and SPOT vegetation indices and cumulative spruce budworm defoliation,” Int. J. Remote Sens., 29 (4), 1215 –1220 (2007). https://doi.org/10.1080/01431160701730136 IJSEDK 0143-1161 Google Scholar

21. 

J. P. Spruce et al., “Assessment of MODIS NDVI time series data products for detecting forest defoliation by gypsy moth outbreaks,” Remote Sens. Environ., 115 (2), 427 –437 (2011). https://doi.org/10.1016/j.rse.2010.09.013 Google Scholar

22. 

L. M. Moskal and S. E. Franklin, “Relationship between airborne multispectral image texture and aspen defoliation,” Int. J. Remote Sens., 25 (14), 2701 –2711 (2004). https://doi.org/10.1080/01431160310001642304 IJSEDK 0143-1161 Google Scholar

23. 

S. E. Franklin et al., “Aerial and satellite sensor detection and classification of western spruce budworm defoliation in a subalpine forest,” Can. J. Remote Sens., 21 (3), 299 –308 (1995). https://doi.org/10.1080/07038992.1995.10874624 CJRSDP 0703-8992 Google Scholar

24. 

R. Lottering and O. Mutanga, “Estimating the road edge effect on adjacent Eucalyptus grandis forests in KwaZulu-Natal, South Africa, using texture measures and an artificial neural network,” J. Spatial Sci., 57 (2), 153 –173 (2012). https://doi.org/10.1080/14498596.2012.733617 Google Scholar

25. 

R. M. Haralick, K. Shanmugam and I. H. Dinstein, “Textural features for image classification,” IEEE Trans. Syst. Man Cybern., SMC-3 (6), 610 –621 (1973). https://doi.org/10.1109/TSMC.1973.4309314 Google Scholar

26. 

H. S. Dungey and B. M. Potts, “Eucalypt hybrid susceptibility to Gonipterus scutellatus (Coleoptera: Curculionidae),” Austral Ecol., 28 (1), 70 –74 (2003). https://doi.org/10.1046/j.1442-9993.2003.01250.x Google Scholar

27. 

M. Wulder, “Optical remote-sensing techniques for the assessment of forest inventory and biophysical parameters,” Prog. Phys. Geogr., 22 (4), 449 –476 (1998). https://doi.org/10.1177/030913339802200402 Google Scholar

28. 

S. E. Franklin, W. W. Bowers and G. Ghitter, “Discrimination of adelgid-damage on single balsam fir trees with aerial remote sensing data,” Int. J. Remote Sens., 16 (15), 2779 –2794 (1995). https://doi.org/10.1080/01431169508954591 IJSEDK 0143-1161 Google Scholar

29. 

S. E. Franklin, M. A. Wulder and G. R. Gerylo, “Texture analysis of IKONOS panchromatic data for Douglas-fir forest age class separability in British Columbia,” Int. J. Remote Sens., 22 (13), 2627 –2632 (2001). https://doi.org/10.1080/01431160120769 IJSEDK 0143-1161 Google Scholar

30. 

X. Yuan, D. King and J. Vlcek, “Sugar maple decline assessment based on spectral and textural analysis of multispectral aerial videography,” Remote Sens. Environ., 37 (1), 47 –54 (1991). https://doi.org/10.1016/0034-4257(91)90049-C Google Scholar

31. 

R. B. Myneni and F.G. Hall, “The interpretation of spectral vegetation indexes,” IEEE Trans. Geosci. Remote Sens., 33 (2), 481 –486 (1995). https://doi.org/10.1109/36.377948 IGRSD2 0196-2892 Google Scholar

32. 

W. Wang et al., “Estimating leaf nitrogen concentration with three-band vegetation indices in rice and wheat,” Field Crops Res., 129 90 –98 (2012). https://doi.org/10.1016/j.fcr.2012.01.014 FCREDZ 0378-4290 Google Scholar

33. 

J. C. Ingram, T. P. Dawson and R. J. Whittaker, “Mapping tropical forest structure in southeastern Madagascar using remote sensing and artificial neural networks,” Remote Sens. Environ., 94 (4), 491 –507 (2005). https://doi.org/10.1016/j.rse.2004.12.001 Google Scholar

34. 

O. Mutanga and A. K. Skidmore, “Integrating imaging spectroscopy and neural networks to map grass quality in the Kruger National Park, South Africa,” Remote Sens. Environ., 90 (1), 104 –115 (2004). https://doi.org/10.1016/j.rse.2003.12.004 Google Scholar

35. 

DigitalGlobe, The Benefits of the 8 Spectral Bands of WorldView-2, United States(2010). Google Scholar

36. 

ENVI, Version 4.7, Exelis Visual Information Solutions, ITT Industries, Boulder, Colorado (2009). Google Scholar

37. 

J. E. Nichol and M. L. R. Sarker, “Improved biomass estimation using the texture parameters of two high-resolution optical sensors,” IEEE Trans. Geosci. Remote Sens., 49 (3), 930 –948 (2011). https://doi.org/10.1109/TGRS.2010.2068574 IGRSD2 0196-2892 Google Scholar

38. 

V. St-Louis et al., “High-resolution image texture as a predictor of bird species richness,” Remote Sens. Environ., 105 (4), 299 –312 (2006). https://doi.org/10.1016/j.rse.2006.07.003 Google Scholar

39. 

A. Materka and M. Strzelecki, “Texture analysis methods-a review,” Technical University of Lodz, Institute of Electronics, COST B11 Report, Brussels 9 –11 (1998). Google Scholar

40. 

R. Jobanputra and D. A. Clausi, “Preserving boundaries for image texture segmentation using grey level co-occurring probabilities,” Pattern Recognit., 39 (2), 234 –245 (2006). https://doi.org/10.1016/j.patcog.2005.07.010 Google Scholar

41. 

F. Kayitakire, C. Hamel and P. Defourny, “Retrieving forest structure variables based on image texture analysis and IKONOS-2 imagery,” Remote Sens. Environ., 102 (3), 390 –401 (2006). https://doi.org/10.1016/j.rse.2006.02.022 Google Scholar

42. 

Y. Rubner et al., “Empirical evaluation of dissimilarity measures for color and texture,” Comput. Vision Image Understanding, 84 (1), 25 –43 (2001). https://doi.org/10.1006/cviu.2001.0934 Google Scholar

43. 

E. M. Tuttle et al., “Using remote sensing image texture to study habitat use patterns: a case study using the polymorphic white-throated sparrow (Zonotrichia albicollis),” Global Ecol. Biogeogr., 15 (4), 349 –357 (2006). https://doi.org/10.1111/geb.2006.15.issue-4 GEBIFS 1466-8238 Google Scholar

44. 

Y. Zhang et al., “Application of an empirical neural network to surface water quality estimation in the Gulf of Finland using combined optical data and microwave data,” Remote Sens. Environ., 81 (2–3), 327 –336 (2002). https://doi.org/10.1016/S0034-4257(02)00009-3 Google Scholar

45. 

A. Skidmore et al., “Performance of a neural network: mapping forests using GIS and remotely sensed data,” Photogramm. Eng. Remote Sens., 63 (5), 501 –514 (1997). Google Scholar

46. 

T. Ojala, M. Pietikainen and D. Harwood, “Performance evaluation of texture measures with classification based on Kullback discrimination of distributions,” in Proc. 12th Int. Conf. Pattern Recognit., (1994). https://doi.org/10.1109/ICPR.1994.576366 Google Scholar

47. 

Y. Kaya et al., “Evaluation of texture features for automatic detecting butterfly species using extreme learning machine,” J. Exp. Theor. Artif. Intell., 26 (2), 267 –281 (2014). https://doi.org/10.1080/0952813X.2013.861875 JEAIEL 1362-3079 Google Scholar

48. 

P. V. N. Rao et al., “Textural analysis of IRSID panchromatic data for land cover classification,” Int. J. Remote Sens., 23 (17), 3327 –3345 (2002). https://doi.org/10.1080/01431160110104665 IJSEDK 0143-1161 Google Scholar

49. 

S. E. Franklin et al., “Incorporating texture into classification of forest species composition from airborne multispectral images,” Int. J. Remote Sens., 21 (1), 61 –79 (2000). https://doi.org/10.1080/014311600210993 IJSEDK 0143-1161 Google Scholar

50. 

B. Xu, P. Gong and R. Pu, “Crown closure estimation of oak savannah in a dry season with Landsat TM imagery: comparison of various indices through correlation analysis,” Int. J. Remote Sens., 24 (9), 1811 –1822 (2003). https://doi.org/10.1080/01431160210144598 IJSEDK 0143-1161 Google Scholar

51. 

M. A. Cho, A. K. Skidmore and I. Sobhan, “Mapping beech (Fagus sylvatica L.) forest structure with airborne hyperspectral imagery,” Int. J. Appl. Earth Obs. Geoinf., 11 (3), 201 –211 (2009). https://doi.org/10.1016/j.jag.2009.01.006 Google Scholar

52. 

M. T. Gebreslasie, F. B. Ahmed and J. A. N. van Aardt, “Extracting structural attributes from IKONOS imagery for Eucalyptus plantation forests in KwaZulu-Natal, South Africa, using image texture analysis and artificial neural networks,” Int. J. Remote Sens., 32 (22), 7677 –7701 (2011). https://doi.org/10.1080/01431161.2010.527392 IJSEDK 0143-1161 Google Scholar

53. 

A. R. Clarke, S. Paterson and P. Pennington, “Gonipterus scutellatus Gyllenhal (Coleoptera: Curculionidae) oviposition on seven naturally co-occurring Eucalyptus species,” For. Ecol. Manage., 110 (1–3), 89 –99 (1998). https://doi.org/10.1016/S0378-1127(98)00277-1 FECMDW 0378-1127 Google Scholar

54. 

F. G. C. Tooke, “The eucalyptus snout beetle Gonipterus scutellatus Gyll. A study of its ecology and control by biological means,” Entomol. Memoir, 3 1 –26 (1953). Google Scholar

55. 

A. D. Loch, “Parasitism of the Eucalyptus weevil, Gonipterus scutellatus Gyllenhal, by the egg parasitoid, Anaphes nitens Girault, in Eucalyptus globulus plantations in southwestern Australia,” Biol. Control, 47 (1), 1 –7 (2008). https://doi.org/10.1016/j.biocontrol.2008.07.015 BCIOEB 1049-9644 Google Scholar

56. 

H. A. Fuentes et al., “Susceptibility of eucalyptus species to Gonipterus scutellatus and electrophoretic profiles of adult marker proteins,” Agrociencia, 42 (3), 327 –334 (2008). AGCCBR Google Scholar

Biography

Romano Lottering received his master’s degree in applied environmental science (cum laude) and a PhD in environmental science from the University of KwaZulu-Natal, Pietermaritzburg, South Africa. He has been a lecturer and research scientist at the University of KwaZulu-Natal since 2011. His research focus is on the utility of remotely sensed data to detect and map pest-induced vegetation defoliation in commercial forest plantations.

Onisimo Mutanga is a professor at the University of KwaZulu-Natal, Pietermaritzburg, South Africa. His research focus is on ecological assessment and monitoring of vegetation patterns using GIS and remote sensing. His focus in recent years has been on the development of remote sensing techniques for mapping tropical vegetation quality and quantity in an attempt to understand wildlife feeding patterns and distribution. His emerging niche areas include mapping vegetation species, disease infestation on plantation forests and agricultural crops as well as the quantification of forest degradation and its impact on biodiversity.

Kabir Peerbhay received his master’s degree in applied environmental science (cum laude) and a PhD in environmental science from the University of KwaZulu-Natal, Pietermaritzburg, South Africa. He has been a senior research scientist at the Institute for Commercial Forestry Research since 2015 and was recently appointed as a research fellow at the University of KwaZulu-Natal. His research focus is on using remote sensing to detect and map alien invasive plant species in commercial forest plantations.

Riyad Ismail received his master’s degree in GIS (cum laude) and a PhD in remote sensing from the University of KwaZulu-Natal, Pietermaritzburg, South Africa. He is a principal research officer at Sappi forests and a research fellow at the University of KwaZulu-Natal. His research focus is on remote sensing of forest health and the utility of machine learning algorithms to maximize the benefits of spatial technology.

© 2019 Society of Photo-Optical Instrumentation Engineers (SPIE)
Romano Lottering, Onisimo Mutanga, Kabir Peerbhay, and Riyad Ismail "Detecting and mapping Gonipterus scutellatus induced vegetation defoliation using WorldView-2 pan-sharpened image texture combinations and an artificial neural network," Journal of Applied Remote Sensing 13(1), 014513 (8 February 2019). https://doi.org/10.1117/1.JRS.13.014513
Received: 7 August 2018; Accepted: 15 January 2019; Published: 8 February 2019