Image classification for detection of winter grapevine buds in natural conditions using scale-invariant features transform, bag of features and support vector machines

Publication Type: Journal Article
Year of Publication: 2017
Authors: Pérez, D. Sebastián; Bromberg, F.; Diaz, C. Ariel
Journal: Computers and Electronics in Agriculture
Volume: 135
Start Page: 81
Pagination: 81-95
Date Published: 04/2017
ISSN: 0168-1699
Keywords: computer vision, Grapevine bud, Image classification, Precision viticulture, Scanning-window detection
Abstract

In viticulture, there are several applications where bud detection in vineyard images is a necessary task that can be automated through computer vision methods. A common and effective family of visual detection algorithms is the scanning-window type, which slides a (usually) fixed-size window across the original image and classifies each resulting windowed patch as containing or not containing the target object. Despite the simplicity of these algorithms, their most challenging aspect is the classification stage. Motivated by grapevine bud detection in natural field conditions, this paper presents a classification method for images of grapevine buds ranging from 100 to 1600 pixels in diameter, captured outdoors under natural field conditions in winter (i.e., no grape bunches, very few leaves, and dormant buds), without artificial background, and with minimal equipment requirements. The proposed method uses well-known computer vision technologies: Scale-Invariant Feature Transform for calculating low-level features, Bag of Features for building an image descriptor, and Support Vector Machines for training a classifier. When evaluated over images containing buds of at least 100 pixels in diameter, the approach achieves a recall higher than 0.9 and a precision of 0.86 over all windowed patches covering the whole bud and down to 60% of it, and scaled up to window patches containing a proportion of 20%-80% of bud versus background pixels. This robustness to the position and size of the window demonstrates its viability for use as the classification stage in scanning-window detection algorithms.
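The classification pipeline named in the abstract (SIFT low-level features, a Bag of Features image descriptor, and an SVM classifier) can be sketched as follows. This is a minimal illustration assuming OpenCV and scikit-learn; the vocabulary size, kernel choice, and all function names are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the SIFT + Bag of Features + SVM classification pipeline
# described in the abstract. Library choices (OpenCV, scikit-learn) and
# all names/parameters are illustrative assumptions.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def sift_descriptors(image_bgr):
    """Extract SIFT descriptors (128-D each) from one image patch."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    sift = cv2.SIFT_create()
    _, desc = sift.detectAndCompute(gray, None)
    return desc if desc is not None else np.empty((0, 128), np.float32)

def build_vocabulary(training_patches, n_words=25):
    """Cluster all training descriptors into a visual vocabulary (Bag of Features)."""
    all_desc = np.vstack([sift_descriptors(p) for p in training_patches])
    return KMeans(n_clusters=n_words, random_state=0).fit(all_desc)

def bof_histogram(patch, vocabulary):
    """Describe a patch as a normalized histogram of visual-word occurrences."""
    desc = sift_descriptors(patch)
    n_words = vocabulary.n_clusters
    if len(desc) == 0:
        return np.zeros(n_words)
    words = vocabulary.predict(desc)
    hist = np.bincount(words, minlength=n_words).astype(float)
    return hist / hist.sum()

def train_classifier(patches, labels, vocabulary):
    """Train an SVM on BoF histograms (labels: 1 = bud, 0 = background)."""
    X = np.array([bof_histogram(p, vocabulary) for p in patches])
    return SVC(kernel="rbf").fit(X, labels)
```

In a scanning-window detector, each windowed patch extracted from the vineyard image would be described with bof_histogram and passed to the trained SVM to decide whether it contains a bud.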

URL: http://www.sciencedirect.com/science/article/pii/S0168169916301818
DOI: 10.1016/j.compag.2017.01.020
DHARMa members who are authors: 
Peer reviewed?: Yes
International?: Yes