Document Type

Article

Language

eng

Publication Date

10-2018

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Source Publication

IEEE Robotics and Automation Letters

Source ISSN

2377-3766

Original Item ID

DOI: 10.1109/LRA.2018.2849498

Abstract

In fruit production, critical crop management decisions are guided by bloom intensity, i.e., the number of flowers present in an orchard. Despite its importance, bloom intensity is still typically estimated by means of human visual inspection. Existing automated computer vision systems for flower identification are based on hand-engineered techniques that work only under specific conditions and with limited performance. This letter proposes an automated technique for flower identification that is robust to uncontrolled environments and applicable to different flower species. Our method relies on an end-to-end residual convolutional neural network (CNN) that represents the state-of-the-art in semantic segmentation. To enhance its sensitivity to flowers, we fine-tune this network using a single dataset of apple flower images. Since CNNs tend to produce coarse segmentations, we employ a refinement method to better distinguish between individual flower instances. Without any preprocessing or dataset-specific training, experimental results on images of apple, peach, and pear flowers, acquired under different conditions, demonstrate the robustness and broad applicability of our method.
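
As an illustration of the fine-tuning step described in the abstract, the sketch below adapts a pre-trained residual segmentation network to binary flower/background segmentation in PyTorch. The FCN-ResNet50 backbone, the dummy mini-batch, and the hyperparameters are stand-ins for illustration only; the letter's actual network, pre-training, and training setup may differ, and the instance-refinement step is not shown.

import torch
import torch.nn as nn
from torchvision.models.segmentation import fcn_resnet50

# Stand-in residual segmentation CNN (FCN-ResNet50 with pre-trained weights);
# the letter's actual backbone may differ.
num_classes = 2  # flower vs. background
model = fcn_resnet50(weights="DEFAULT")

# Replace the classification heads for the two-class flower segmentation task.
model.classifier[4] = nn.Conv2d(512, num_classes, kernel_size=1)
model.aux_classifier[4] = nn.Conv2d(256, num_classes, kernel_size=1)

# Hypothetical mini-batch standing in for the apple flower training set.
images = torch.randn(2, 3, 320, 320)                   # RGB orchard images
masks = torch.randint(0, num_classes, (2, 320, 320))   # per-pixel labels

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

# One fine-tuning step: per-pixel cross-entropy on the coarse segmentation output.
model.train()
optimizer.zero_grad()
logits = model(images)["out"]     # [B, num_classes, H, W] class scores
loss = criterion(logits, masks)
loss.backward()
optimizer.step()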

Comments

Accepted version. IEEE Robotics and Automation Letters, Vol. 3, No. 4 (October 2018): 3003-3010. DOI. © 2018 Institute of Electrical and Electronics Engineers (IEEE). Used with permission.

medeiros_13048acc.docx (1087 kB)
ADA Accessible Version
