|dc.description.abstract||Whiteflies are a major pest and disease vector that have greatly affected cassava yields. Whiteflies are sucking insects that feed on plant sap and, in the process, transmit or pick up disease-causing organisms. Their high populations make it easy for them to spread these organisms at a fast rate. When uncontrolled, whiteflies can reduce crop yields, affect cassava quality, and in adverse cases cause total crop loss. Knowing the numbers of whiteflies is therefore key to detecting and tracking their spread and to preventing it. However, the current approach to detecting and counting whiteflies relies on visual inspection and manual counting; these traditional methods are inefficient and time-consuming.
In the past decade, there have been great advances in machine learning, especially in computer vision: facial recognition rapidly approaching human-level performance, self-driving cars, and Merged Reality (Virtual Reality and Augmented Reality). We intend to leverage these advances to build a model to detect and count whiteflies.
Various studies have proposed machine learning methods to aid the detection and counting of whiteflies. M. N. Jige et al. proposed a detection and counting method using MATLAB, but this approach raised concerns because very little data was used to train the model. J. F. Tusubira et al. proposed another approach using a Haar cascade classifier and Faster R-CNN, which was limited by the complexity and small size of whiteflies. K. R. B. Legaspi et al. used YOLOv3 to classify and detect objects, specifically whiteflies and fruit flies; this approach was also limited by insufficient data.
In this research project, we implemented YOLOv5, a state-of-the-art deep learning object detection model, for the detection and counting of whiteflies. We achieved an mAP of 0.842 at an IoU threshold of 0.5 and deployed the model onto a smartphone. The results obtained from this project showed a 20% improvement over current state-of-the-art approaches.||en_US