Flower Image Classifier
- Aunsh Arekar
- Mar 25, 2022
- 3 min read
Updated: Mar 27, 2022
Overview
This project uses a dataset of over 4,000 flower images belonging to 5 classes. The task was to build a classifier that can efficiently identify which class a flower belongs to when fed the flower's image. Various tests were run to arrive at a model that gives good accuracy.
Model Used
For this classifier, the chosen model is a CNN (Convolutional Neural Network), a deep learning architecture built from convolution layers containing neurons. The first layer takes the input image and passes its output to the next layer, and so on.
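As a rough illustration of that layer-to-layer flow, here is a minimal sketch in Keras (the framework and the input size are assumptions, since the post does not show the model code):

```python
from tensorflow.keras import Input, layers

# Illustrative only: each convolution layer takes the previous layer's
# output as its input, so features flow from layer to layer.
inputs = Input(shape=(128, 128, 3))                        # assumed image size
x = layers.Conv2D(32, (3, 3), activation="relu")(inputs)   # first layer sees the raw image
x = layers.Conv2D(64, (3, 3), activation="relu")(x)        # next layer sees the previous output
```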
Approach
In this classifier, the accuracy at the beginning was about 70%. As the first step, after loading the dataset, every flower image was assigned its respective class label. This was done to prepare the training data for each class needed in the modelling phase later. Preparing the training data also involved some tuning and normalization, such as resizing the images to a common size, since the images came in different sizes.
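A rough sketch of that preparation step, assuming OpenCV for reading and resizing and a one-folder-per-class layout (both assumptions; the post does not show the loading code):

```python
import os
import cv2
import numpy as np

DATA_DIR = "flowers"   # hypothetical layout: flowers/<class_name>/<image>.jpg
IMG_SIZE = 128         # assumed common size; the post only says images were resized

images, labels = [], []
class_names = sorted(os.listdir(DATA_DIR))
for label, class_name in enumerate(class_names):
    class_dir = os.path.join(DATA_DIR, class_name)
    for fname in os.listdir(class_dir):
        img = cv2.imread(os.path.join(class_dir, fname))
        if img is None:                               # skip unreadable files
            continue
        img = cv2.resize(img, (IMG_SIZE, IMG_SIZE))   # resize to a common size
        images.append(img)
        labels.append(label)                          # class label assigned per folder

X = np.array(images, dtype="float32") / 255.0         # normalize pixel values to [0, 1]
y = np.array(labels)
```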
After this came the encoding step for the categorical data, including one-hot encoding of the class labels. This was followed by the usual train/test split required for the modelling phase.
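A minimal sketch of that step, assuming Keras' to_categorical for the one-hot encoding and scikit-learn's train_test_split for the split (the exact tools are not named in the post):

```python
from sklearn.model_selection import train_test_split
from tensorflow.keras.utils import to_categorical

# One-hot encode the integer class labels (5 flower classes).
y_encoded = to_categorical(y, num_classes=5)

# Hold out a test set for the modelling phase; the 80/20 split is an assumption.
X_train, X_test, y_train, y_test = train_test_split(
    X, y_encoded, test_size=0.2, random_state=42
)
```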
Finally we come to the modelling phase. As mentioned above, a CNN model was chosen, consisting of multiple convolution layers, each of which summarizes the features of the input image. Each convolution layer had a set of filters and was followed by a max pooling layer. An epoch count and batch size were also defined for the training iterations, with the loss and the resulting accuracy calculated at each iteration.
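Putting those pieces together, a hedged sketch of the modelling phase in Keras could look like the following (the dense head, optimizer, and exact layer sizes are assumptions; the convolution/pooling stack and the epoch and batch settings mirror what the post describes for Test 1 below):

```python
from tensorflow.keras import Input, layers, models

# A sketch of the CNN: convolution + max pooling blocks, then dense layers.
model = models.Sequential([
    Input(shape=(128, 128, 3)),
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(96, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(5, activation="softmax"),   # one output per flower class
])

# Loss and accuracy are reported at each training iteration.
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])

history = model.fit(X_train, y_train,
                    epochs=25, batch_size=60,   # Test 1's values, as an example
                    validation_data=(X_test, y_test))
```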
Testing
After the initial test, 3 test cases were run, changing various hyperparameters each time to try to achieve higher accuracy. This process is also termed hyperparameter tuning.
The first test, Test 1, was run with 25 epochs, a batch size of 60, and 3 convolution layers with 32, 64, and 96 filters respectively. This is shown below:

Convolution Layers for Test 1

Defining Epoch and Batch size for Test 1
With this test, an accuracy of 74.91% was obtained, as shown below:

Test 1 accuracy
The second test, Test 2, was run with 40 epochs, a batch size of 95, and 4 convolution layers with 32, 64, 96, and 128 filters respectively. This is shown below:

Convolution Layers for Test 2

Defining Epoch and Batch size for Test 2
With this test, an accuracy of 75.93% was obtained, as shown below:

Test 2 accuracy
The third and final test, Test 3, was run with 55 epochs, a batch size of 125, and 5 convolution layers with 32, 64, 96, 128, and 256 filters respectively. This is shown below:

Convolution Layers for Test 3

Defining Epoch and Batch size for Test 3
With this test, an accuracy of 79.91% was obtained, as shown below:

Test 3 accuracy
Thus, across the 3 tests, the accuracy increased as the hyperparameters were altered. This is also shown in the bar graph of the accuracy comparison below.

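For reference, the three configurations above could be swept programmatically with a helper like the hypothetical build_model below (a sketch under the same Keras assumptions as earlier, not the post's actual code); the collected accuracies are what the bar graph compares:

```python
from tensorflow.keras import Input, layers, models

# The three test configurations described above.
test_configs = [
    {"epochs": 25, "batch_size": 60,  "filters": [32, 64, 96]},
    {"epochs": 40, "batch_size": 95,  "filters": [32, 64, 96, 128]},
    {"epochs": 55, "batch_size": 125, "filters": [32, 64, 96, 128, 256]},
]

def build_model(filters, input_shape=(128, 128, 3), num_classes=5):
    """Build a CNN with one Conv2D + MaxPooling2D block per filter count."""
    model = models.Sequential([Input(shape=input_shape)])
    for n in filters:
        model.add(layers.Conv2D(n, (3, 3), activation="relu", padding="same"))
        model.add(layers.MaxPooling2D((2, 2)))
    model.add(layers.Flatten())
    model.add(layers.Dense(128, activation="relu"))
    model.add(layers.Dense(num_classes, activation="softmax"))
    model.compile(optimizer="adam", loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

results = {}
for i, cfg in enumerate(test_configs, start=1):
    model = build_model(cfg["filters"])
    model.fit(X_train, y_train, epochs=cfg["epochs"],
              batch_size=cfg["batch_size"], verbose=0)
    _, acc = model.evaluate(X_test, y_test, verbose=0)
    results[f"Test {i}"] = acc   # accuracy per configuration, for comparison
```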
Challenges
The challenge was to bring the accuracy up from the initial step. Finding the right hyperparameters and tuning them by the right amount was difficult, since there were instances where the accuracy was stuck at around 70%. Eventually, after numerous trials and errors, the right set of hyperparameters was found and tuned.