Evaluation of stereovision for extracting plant features

Mohammed Amean, Z. and Low, T. and McCarthy, C. and Hancock, N. (2013) Evaluation of stereovision for extracting plant features. In: SEAg 2013: Innovative Agricultural Technologies for a Sustainable Future, 22-25 Sept 2013, Barton, Western Australia.



Visual sensors produce images that computational algorithms can analyse to extract useful
information about image features. Plants have complex structures, and while images can help
extract them, not all plant features can be recognised from a single image. Stereovision
extends single-image features with a third dimension, depth, enabling more accurate
localisation of a plant's structures. Stereovision is a technique that produces a disparity map of a
scene from two or more images taken from different points of view. Depth information
can be used to enhance the detection of fruit and plant parts; however, research on using stereovision
to extract plant structures is sparse.
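To illustrate how a disparity map arises from two views, here is a minimal, self-contained sketch of naive sum-of-absolute-differences (SAD) block matching in NumPy. This is a textbook illustration, not the matcher used in the paper; the synthetic image pair and all function names are invented for the example.

```python
import numpy as np

def block_match_disparity(left, right, max_disp=8, win=3):
    """Naive SAD block matching: for each pixel in the left image,
    search leftward in the right image (up to max_disp pixels) for
    the best-matching window; the shift with the lowest cost is
    that pixel's disparity."""
    h, w = left.shape
    half = win // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            costs = [
                np.abs(patch - right[y - half:y + half + 1,
                                     x - d - half:x - d + half + 1]).sum()
                for d in range(max_disp + 1)
            ]
            disp[y, x] = int(np.argmin(costs))
    return disp

# Synthetic stereo pair: the right view is the left view shifted by
# 4 pixels, so the true disparity is 4 everywhere (a fronto-parallel
# surface at constant depth).
rng = np.random.default_rng(0)
left = rng.integers(0, 255, size=(20, 40)).astype(np.float64)
true_d = 4
right = np.roll(left, -true_d, axis=1)

disp = block_match_disparity(left, right, max_disp=8, win=3)
print(disp[5:15, 15:35])  # interior recovers the true disparity of 4
```

Disparity is inversely proportional to depth, so a map like this is the raw material from which the depth images discussed below are derived.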

In this paper, stereovision is evaluated on its ability to extract important features from two types of
nursery plants imaged under indoor and outdoor lighting conditions. From the colour images, colour and
shape segmentation are evaluated on their ability to extract plant features such as stems,
branches and leaves. Depth images are also evaluated on their accuracy, their coverage, and their ability
to improve the segmentation of the colour images. The depth images contain gaps and missing data;
a new algorithm improves them by interpolating across the gaps and smoothing the result.
Preliminary results show that good plant features can be extracted from depth images in an indoor
environment, while depth data from an outdoor environment contains more noise due to variation
in lighting conditions.
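The gap-interpolation and smoothing step can be sketched in a few lines of NumPy. The paper's exact algorithm is not reproduced here; this is a minimal illustration, assuming missing depth values are marked as NaN, using row-wise linear interpolation followed by a mean filter. The toy depth map and function names are invented for the example.

```python
import numpy as np

def fill_gaps_rowwise(depth):
    """Fill NaN gaps in each row by linear interpolation between the
    nearest valid neighbours (np.interp)."""
    out = depth.copy()
    cols = np.arange(depth.shape[1])
    for row in out:                      # each row is a view; edits stick
        bad = np.isnan(row)
        if bad.any() and (~bad).any():
            row[bad] = np.interp(cols[bad], cols[~bad], row[~bad])
    return out

def box_smooth(depth, k=3):
    """k x k mean filter via padded sliding windows (simple smoothing)."""
    pad = k // 2
    padded = np.pad(depth, pad, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, (k, k))
    return windows.mean(axis=(-1, -2))

# Toy depth map (metres) with NaN standing in for stereo-matching
# dropouts, e.g. textureless or occluded regions.
depth = np.full((5, 8), 2.0)
depth[2, 3:5] = np.nan

filled = fill_gaps_rowwise(depth)
smoothed = box_smooth(filled)
print(np.isnan(filled).any())  # False: gaps have been interpolated
```

A real pipeline would typically interpolate in 2D and use an edge-preserving filter rather than a plain box filter, but the structure (fill gaps, then smooth) matches the processing described above.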

Item Type: Conference or Workshop Item (Commonwealth Reporting Category E) (Paper)
Refereed: Yes
Item Status: Live Archive
Additional Information: Published Version deposited with the permission of the Conference Chair (Scientific Committee).
Faculty/School / Institute/Centre: Current - Faculty of Health, Engineering and Sciences - School of Mechanical and Electrical Engineering (1 Jul 2013 -)
Date Deposited: 15 Jan 2014 05:45
Last Modified: 14 May 2020 06:58
Uncontrolled Keywords: 3D perception, depth perception, disparity map, feature extraction, image segmentation, mobile robot, plant part detection, precision agriculture.
Fields of Research (2008): 09 Engineering > 0906 Electrical and Electronic Engineering > 090602 Control Systems, Robotics and Automation
URI: http://eprints.usq.edu.au/id/eprint/24533
