Now the next step is to create the architecture for our model. We will be using almost the same architecture: a Sequential model in Keras, and we will be adding four convolutional layers. We will start with 32 filters, then 64 filters, then 128 filters, and then again 128 filters. After each of these convolutional layers we are also applying a max pooling layer, and as always the activation function is ReLU for all these layers.

After that, this time we are also applying a dropout layer. What this layer will do is deactivate 50 percent of the neurons during each epoch. It will randomly pick 50 percent of our neurons and deactivate them, and we will be training the model with the remaining 50 percent of neurons during each epoch. So for each epoch we are randomly deactivating 50 percent of our total neurons. We are using dropout here because dropout is a very effective layer to avoid overfitting in our model.

So this is our model architecture. The next step is to compile the model. For the loss function we will be using binary cross-entropy, since we have two different classes. If you remember, earlier in MNIST we had 10 different classes, and there we were using sparse categorical cross-entropy.
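As a sketch, the architecture described above could look like the following in Keras. The input size of 150×150 RGB and the Dense(512) head are my assumptions for illustration; the lecture does not state them:

```python
# Sketch of the described CNN: four Conv2D blocks (32, 64, 128, 128 filters),
# each followed by max pooling, then 50% dropout before the classifier head.
# The input shape (150, 150, 3) and Dense(512) width are assumed, not from the lecture.
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(150, 150, 3)),
    layers.Conv2D(32, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(128, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(128, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dropout(0.5),                    # randomly deactivates 50% of neurons each update
    layers.Dense(512, activation='relu'),
    layers.Dense(1, activation='sigmoid'),  # single sigmoid unit for the two-class problem
])
model.summary()
```

A single sigmoid output is enough here because with two classes the model only needs to predict the probability of one of them.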
But here, since we have only two classes, we are using binary cross-entropy. For the optimizer we are using RMSprop with a learning rate of 0.001, and since this is a classification problem we are calculating the accuracy metric as well.

The next step is to train our model. Since we are taking our data from the train generator, we have to use the fit generator to fit our model. So we will be using model.fit_generator and pass the train generator to this method, which will continuously generate data in batches of 32 images, and here we are using steps per epoch as 100. Earlier, in our last model, we were using a batch size of 20 and steps per epoch as 100, because we only had 2,000 images for training purposes. But this time, since we are randomly generating images from these transformations, we can use more than 2,000 images as well. This time we are using a batch size of 32 and steps per epoch as 100, so overall in each epoch we are feeding around 3,200 images. The number of epochs this time is 100. Similarly, we will use the validation generator to get the validation data.

Now, since we are running this for a hundred epochs, if you are using a system with less than 16 GB of RAM and without any graphics card, it may take up to one and a half to two hours to train this model.
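A scaled-down sketch of this compile-and-fit step is shown below. It uses the same loss, optimizer, and metric as described, but with a tiny stand-in model and a synthetic Python generator in place of the real train generator, so it runs without the image folders. Note that in current Keras the plain fit() method accepts generators directly, replacing the fit_generator call used in the lecture:

```python
import numpy as np
from tensorflow.keras import layers, models, optimizers

# Tiny stand-in model; the real one is the four-block CNN from the lecture.
model = models.Sequential([
    layers.Input(shape=(32, 32, 3)),
    layers.Conv2D(8, (3, 3), activation='relu'),
    layers.Flatten(),
    layers.Dense(1, activation='sigmoid'),
])

# Same compile settings as described: binary cross-entropy loss,
# RMSprop with learning rate 0.001, and accuracy as the metric.
model.compile(loss='binary_crossentropy',
              optimizer=optimizers.RMSprop(learning_rate=0.001),
              metrics=['accuracy'])

def fake_generator(batch_size=32):
    """Stands in for train_generator: yields endless batches of 32 images."""
    while True:
        x = np.random.rand(batch_size, 32, 32, 3).astype('float32')
        y = np.random.randint(0, 2, size=(batch_size, 1)).astype('float32')
        yield x, y

# The lecture's call is roughly: model.fit_generator(train_generator,
# steps_per_epoch=100, epochs=100, validation_data=validation_generator).
# Here we train for a trivially small number of steps just to demonstrate.
history = model.fit(fake_generator(), steps_per_epoch=2, epochs=1, verbose=0)
```

With steps_per_epoch=100 and batches of 32, each epoch draws 100 × 32 = 3,200 (augmented) images, which is how the model sees more than the original 2,000 training images.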
That's why I have already trained this model, and I have the data of around a hundred epochs here. You can see that our validation accuracy is increasing with each epoch, and at around the 90th to 100th epoch we are getting a validation accuracy between 80 to 84 percent and a training accuracy of 84 to 85 percent. So if you compare, in our last model we were getting a training accuracy of around 95 to 98 percent and a significantly lower validation accuracy of around 79 percent. In this model we are getting almost the same validation accuracy and a lower training accuracy, which is a better fit overall. So you can see that with our image processing and by creating augmented images we have reduced overfitting in our model.

After running this, you can save your model with the model.save method. Now let us create this graph to see how our validation accuracy and training accuracy are changing with each epoch. These orange and red lines are for accuracy: the orange line is for training accuracy and the red line is for validation accuracy. You can see here that the validation accuracy is more than 80 percent, the training accuracy is also more than 80 percent, and both are moving together. So there is no evidence of overfitting in our model, and it is still increasing. So if you run it for, say, 40 or 50 more epochs, the validation accuracy may reach around 85 to 86 percent as well.
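A sketch of how such an accuracy plot can be drawn from the training history. The accuracy values here are made-up placeholders; in practice they come from the history object returned by the fit call:

```python
import matplotlib
matplotlib.use('Agg')  # render without a display
import matplotlib.pyplot as plt

# Placeholder values; in practice use history.history['accuracy'] and
# history.history['val_accuracy'] returned by model.fit.
epochs = range(1, 6)
train_acc = [0.62, 0.71, 0.76, 0.80, 0.84]
val_acc = [0.60, 0.70, 0.75, 0.79, 0.82]

plt.plot(epochs, train_acc, color='orange', label='Training accuracy')
plt.plot(epochs, val_acc, color='red', label='Validation accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')
plt.legend()
plt.savefig('accuracy_plot.png')
```

The trained model itself can be persisted alongside the plot with model.save, e.g. `model.save('model.h5')`.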
So that's all for this video. We saw that by augmenting our initial dataset, applying shear, rotation, width shift, height shift, and flips, we can reduce overfitting in our model and we can get a higher validation accuracy. Thank you.
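The augmentations summarized above correspond to a Keras ImageDataGenerator configured along these lines. The specific range values are illustrative assumptions, not taken from the lecture, and a random array stands in for a real image so the snippet is self-contained:

```python
import numpy as np
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Illustrative augmentation settings; the exact range values are assumptions.
train_datagen = ImageDataGenerator(
    rescale=1.0 / 255,       # scale pixel values to [0, 1]
    rotation_range=40,       # random rotations up to 40 degrees
    width_shift_range=0.2,   # random horizontal shifts
    height_shift_range=0.2,  # random vertical shifts
    shear_range=0.2,         # shear transformations
    horizontal_flip=True,    # random left-right flips
)

# Demonstrate on a single random "image" instead of flow_from_directory:
x = np.random.rand(1, 150, 150, 3).astype('float32')
batch = next(train_datagen.flow(x, batch_size=1))
print(batch.shape)  # augmented batch keeps the original image dimensions
```

In the actual training pipeline this generator would be pointed at the image folders with flow_from_directory, which is what produces the endless stream of augmented batches fed to the model.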