Han Ding, Wu Pei, Zhang Qiang, Han Guodong, Tong Fei. Feature extraction and image recognition of typical grassland forage based on color moment[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2016, 32(23): 168-175. DOI: 10.11975/j.issn.1002-6819.2016.23.023


    Feature extraction and image recognition of typical grassland forage based on color moment

    • Abstract: To address the low level of forage monitoring and digitization in the desert steppe of Ulanqab, Inner Mongolia, this paper implemented feature extraction and image recognition for 2 typical forage species, providing a basis for multi-species forage identification and grassland management. Original grassland images were collected with an intelligent navigation vehicle; RGB and HSV color moment features were extracted from images of the 2 forage species Leymus chinensis and Chenopodium glaucum, and a corresponding rule base was established. The data show that the color moment features of the two species differ markedly. After background segmentation of the images with a fuzzy C-means clustering algorithm based on the 2G-B-R color difference feature, a 3-layer BP neural network model was constructed; principal component analysis (PCA) was used to reduce the 15-dimensional input feature parameters to 10 dimensions to improve recognition speed. The final overall recognition rate reached 89.5%, achieving effective classification and recognition of Leymus chinensis and Chenopodium glaucum images, and the vegetation coverages of Chenopodium glaucum and Leymus chinensis in the test images were approximately 9.78% and 34.21%, respectively. The experimental results show that, with color moment features as the basis and the fuzzy C-means clustering algorithm and back propagation (BP) neural network model as the segmentation and recognition methods, image classification of typical forages can be realized effectively. Automatic forage identification is an important component of grassland digitization; it can provide a scientific basis for monitoring vegetation species diversity, grass degradation, and the control of diseases, pests and weeds, and is an important way to protect the modern grassland ecological environment and develop the grassland economy.
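
As an illustrative aid (not part of the original paper), the following Python sketch shows how the 15-dimensional color moment feature described above could be computed, assuming OpenCV and NumPy are available; the function name color_moments and the use of the standard deviation as the second moment are assumptions, not the authors' code.

    import cv2
    import numpy as np

    def color_moments(bgr_image):
        """15-D color moment feature: mean, standard deviation and skewness of
        the R, G, B, H and S channels (V is dropped: it only carries brightness)."""
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
        b, g, r = cv2.split(bgr_image)
        h, s, _ = cv2.split(hsv)

        feature = []
        for channel in (r, g, b, h, s):
            c = channel.astype(np.float64).ravel()
            mean = c.mean()                              # first moment: mean color
            std = c.std()                                # second moment: sqrt of variance
            skew = np.cbrt(np.mean((c - mean) ** 3))     # third moment: skewness
            feature.extend([mean, std, skew])
        return np.array(feature)                         # shape (15,)

The second moment is taken here as the standard deviation (the square root of the color variance mentioned in the abstract), which is the common Stricker-Orengo formulation of color moments.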

       

      Abstract: To address the low level of forage monitoring and digitization in the desert steppe of Ulanqab, Inner Mongolia, feature extraction and image recognition were conducted for two typical forage species in this paper, so as to provide a basis for grass species identification and grassland management. The grass images were collected in the desert steppe of Siziwang Banner, Inner Mongolia, at 3 pm on August 15, 2014; the area belongs to the middle temperate semi-arid continental monsoon climate zone. Under natural sunlight, 100 original images of varying blade sizes and shapes were taken from the prairie as training and testing samples by using an intelligent navigation data acquisition vehicle. The color moment method was adopted because it reflects the color distribution comprehensively, requires no quantization of the color space, and yields a low-dimensional feature vector. In this method, the first moment describes the mean color, the second moment represents the color variance, and the third moment describes the color skewness. The three components Red, Green and Blue (R, G, B) yield nine color moment features, and the Hue (H) and Saturation (S) components contribute another six; the Value (V) component was excluded because it carries brightness information unrelated to color. In all, a 15-dimensional feature vector was extracted. The RGB and HSV color moments were then extracted and a color rule set was established for the two typical grasses, Leymus chinensis and Chenopodium glaucum. The results showed that the color moments of these two forages differed obviously, especially in the first moments of RGB, and the interval range of each parameter was an important factor for image classification and recognition. For the 60 images collected in the steppe, the 2G-B-R vegetation index method was first employed to separate the forage from the soil background, and 33 images were segmented successfully; the fuzzy C-means clustering algorithm based on the 2G-B-R color difference feature was then applied to the remaining 27 images, and all images achieved satisfactory background segmentation. After background segmentation, a three-layer BP neural network model was built. To improve recognition speed, principal component analysis was used to reduce the input feature parameters from 15 to 10. The BP network had 2 output nodes, encoding the grass classes in binary form. The number of hidden layer nodes was calculated by the classical empirical formula, giving a range of 4 to 13, and experiments indicated that the network performed best with 10 hidden nodes. With 40 images as validation samples, a final overall recognition rate of 89.5% was achieved and the two forage species were effectively classified. At the same time, vegetation coverages of 9.78% and 34.21% were obtained for Chenopodium glaucum and Leymus chinensis, respectively. The experimental results indicated that image segmentation and classification were effectively realized by using the fuzzy C-means clustering algorithm and the BP neural network model based on color moment features.
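
Purely as an illustration of the segmentation step (assumptions: a two-cluster NumPy implementation of fuzzy C-means on the per-pixel 2G-B-R value, extreme-value initialization, and the function name segment_forage; none of this is the authors' code):

    import numpy as np

    def segment_forage(bgr_image, n_iter=50, m=2.0):
        """Forage/soil mask via two-cluster fuzzy C-means on the per-pixel
        2G-B-R color difference (pixels treated as 1-D samples)."""
        b = bgr_image[..., 0].astype(np.float64)
        g = bgr_image[..., 1].astype(np.float64)
        r = bgr_image[..., 2].astype(np.float64)
        x = (2 * g - b - r).ravel()

        # Initialize the two cluster centres at the data extremes.
        centres = np.array([x.min(), x.max()])
        for _ in range(n_iter):
            # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)).
            d = np.abs(x[None, :] - centres[:, None]) + 1e-9      # shape (2, N)
            u = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1)), axis=1)
            # Centre update: u^m-weighted mean of the samples.
            w = u ** m
            centres = (w @ x) / w.sum(axis=1)

        # The cluster with the larger (greener) centre is taken as forage.
        mask = np.argmax(u, axis=0) == np.argmax(centres)
        return mask.reshape(bgr_image.shape[:2])

Calling mask.mean() on the resulting binary mask gives the overall vegetation fraction; per-species coverages such as the 9.78% and 34.21% reported above would additionally require the species classification stage.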
Automatic identification of grass plays an important role in grassland digitization; the segmentation and recognition of forage images provide a scientific basis for monitoring vegetation species diversity, grass degradation, and pest control. The results of this research could provide a theoretical foundation and data support for forage species identification, and offer an important way to realize modern grassland ecological environment protection and to develop the grassland economy.
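
Finally, a minimal sketch of the recognition stage, assuming scikit-learn as a stand-in for the authors' implementation: PCA reduces the 15 color moment features to 10 dimensions, which feed a three-layer BP-style network with 10 hidden nodes and a two-class output. Names such as build_classifier, X_train and y_train are illustrative only.

    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline

    def build_classifier():
        """PCA (15 -> 10 dimensions) followed by a single-hidden-layer network
        with 10 hidden nodes, trained by backpropagation."""
        return make_pipeline(
            PCA(n_components=10),
            MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
        )

    # Hypothetical usage: X is an (n_samples, 15) array of color moment features
    # and y the species labels (e.g. 0 = Leymus chinensis, 1 = Chenopodium glaucum).
    #   clf = build_classifier()
    #   clf.fit(X_train, y_train)
    #   print(clf.score(X_test, y_test))   # fraction of images correctly recognized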

       
