A CNN-based identification of honeybees' infection using augmentation
Supplementary material
Other Title
Authors
Kaur, Manjit
Author ORCID Profiles
Degree
Master of Computing
Grantor
Unitec Institute of Technology
Date
2021
Supervisors
Ardekani, Iman
Varastehpour, Soheil
Type
Masters Thesis
Ngā Upoko Tukutuku (Māori subject headings)
Keyword
honey bee image processing
image data augmentation
honeybees (Apis mellifera)
Generative Adversarial Network (GAN)
Convolutional Neural Network (CNN)
beehive monitoring
bees
pest control
beehives
AI in agriculture
ANZSRC Field of Research Code (2020)
Citation
Kaur, M. (2021). A CNN-based identification of honeybees’ infection using augmentation. (Unpublished document submitted in partial fulfilment of the requirements for the degree of Master of Computing). Unitec Institute of Technology, New Zealand. Retrieved from https://hdl.handle.net/10652/5410
Abstract
RESEARCH QUESTIONS:
Can standard images be classified accurately using a Convolutional Neural Network (CNN) or a transfer learning method?
Can the Varroa destructor mite be identified correctly from a small number of low-quality honeybee images?
ABSTRACT:
Artificial Intelligence (AI) solutions have been extensively explored and implemented in the apiculture industry. Designing an efficient system for continuous observation of bees' health and activities has been a challenge and of great interest among researchers. Several studies demonstrate that AI-based solutions can serve the purpose of beehive monitoring. However, systems built on computer vision techniques perform poorly with noisy and insufficient data. This research proposes a computer vision-based approach to classify bee images as infected or healthy while dealing with indistinct and inadequate image data. As a solution, image contrast enhancement and data augmentation methods are used to improve the quality and size of the data, respectively. Foggy bee images are handled using Contrast Limited Adaptive Histogram Equalization (CLAHE), which enhances the image contrast to make it more interpretable. The small dataset is augmented using an advanced method, Deep Convolutional Generative Adversarial Networks (DCGAN), which is capable of generating new images resembling the real ones. We investigate the capability of DCGAN as an alternative to the conventional augmentation technique that applies varied geometric transformations to the images. An optimized Convolutional Neural Network (CNN) model is used to classify the augmented bee image data. Further, a transfer learning technique, VGG16, is applied to compare the classifiers' performance on the same data sets. The effectiveness of the CLAHE preprocessing method and the DCGAN augmentation method is assessed against original images and conventional augmentation methods. The CNN and VGG16 classifiers are compared on both non-augmented and augmented data sets. The investigations conducted on the given bee data sets delivered promising results.
The CLAHE pre-processing method not only improved image sharpness by 10% but also enhanced the CNN classifier's performance compared with the original images. The optimized CNN's performance improved with conventionally (geometrically) transformed augmented data. However, the classifier performed best when DCGAN-generated synthetic images were added to the original data set, providing up to 99.9% accuracy. Further, the fine-tuned transfer learning classification model provided accuracy comparable to the CNN, with added efficiency in building and executing the model. The success of this preliminary investigation paves the way to explore different variations of Generative Adversarial Network (GAN) to augment the image data while also enhancing its quality.
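The conventional augmentation baseline the abstract compares DCGAN against (varied geometric transformations of each image) can be illustrated as follows. This is a minimal NumPy sketch for illustration only, not the thesis's actual pipeline; the function name and the particular set of transforms are assumptions.

```python
import numpy as np

def geometric_augmentations(image: np.ndarray) -> list:
    """Return simple geometric variants of one bee image (H x W x C array).

    Hypothetical sketch of 'conventional' augmentation: flips and
    90-degree rotations, each yielding a new training sample.
    """
    return [
        np.flip(image, axis=1),   # horizontal flip
        np.flip(image, axis=0),   # vertical flip
        np.rot90(image, k=1),     # rotate 90 degrees counter-clockwise
        np.rot90(image, k=2),     # rotate 180 degrees
        np.rot90(image, k=3),     # rotate 270 degrees
    ]

# Example: a tiny dummy "image" yields 5 extra samples per original.
dummy = np.arange(2 * 3 * 1).reshape(2, 3, 1)
augmented = geometric_augmentations(dummy)
print(len(augmented))  # 5
```

Unlike DCGAN, which synthesizes entirely new images, these transforms only re-orient existing ones, which is why the abstract treats them as the weaker baseline.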
Publisher
Permanent link
Link to ePress publication
DOI
Copyright holder
Author
Copyright notice
All rights reserved
