Hello. In this lecture, we will learn how to run ridge and the lasso in Python.

If you remember from the theory lecture, we first need to standardize our data before running ridge or the lasso. So let's start by standardizing our data. For standardizing, we will first import preprocessing from sklearn, so we will write: from sklearn import preprocessing.

Then we create the scaler object, which will store the scaling information for our X variables. We write: scaler = preprocessing.StandardScaler(). Here StandardScaler is a function of the preprocessing module. And then we will fit our X_train into this scaler.

With this, our scaler is ready. Now let's first transform our X_train into X_train_s. So we will write X_train_s, equal to scaler.transform. Remember, we created this scaler object above; now we are transforming our X_train. With this, our X_train is scaled and stored in the X_train_s variable.

Now let's transform our test set also. So we write X_test_s, equal to scaler.transform, then X_test. Just to give you a quick summary:
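The standardization steps above can be sketched as follows. This is a minimal sketch, not the course's actual dataset: the toy X and y here are hypothetical stand-ins, and StandardScaler is imported directly rather than through the preprocessing module, which is equivalent.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# Hypothetical toy data standing in for the course dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) * np.array([1.0, 10.0, 100.0])  # columns on different scales
y = X @ np.array([1.5, -0.2, 0.03]) + rng.normal(size=100)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

scaler = StandardScaler()        # stores the mean and std of each column
scaler.fit(X_train)              # learn the scaling from the training set only

X_train_s = scaler.transform(X_train)  # standardized training features
X_test_s = scaler.transform(X_test)    # the SAME scaler is reused on the test set
```

Fitting the scaler on the training set only, then reusing it on the test set, keeps test information from leaking into the preprocessing step.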
Here we are first creating the scaler, which is storing the information on how to centre and scale each column. Then we are using this scaler object to transform our X_train variable, so that we can create our X_train_s variable. Similarly, we are using the same scaler for our test set as well.

Now we have standardized our independent variables. Let's move on to ridge regression. For ridge regression, we first need to import Ridge from sklearn. So we write: from sklearn.linear_model import Ridge.

Now, let's run a simple ridge regression. We first create the object lm_r and equate it to Ridge(). And in brackets, we will mention our lambda, which Python calls alpha. So we will write alpha = 0.5. So here our lambda is 0.5, and we are creating a Ridge object.

Let's fit this model. We write lm_r.fit. Now we will use our scaled variables: we write X_train_s, comma, y_train.

We have fitted the data to the first model. So let's find out the R-squared value on our test data. We will write r2_score.
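A minimal sketch of the Ridge fit and the R-squared check described above. The data is again a hypothetical stand-in for the course dataset; only the Ridge(alpha=0.5) call and the r2_score pattern mirror the lecture.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# Hypothetical stand-in data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ rng.normal(size=5) + rng.normal(scale=0.5, size=100)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

scaler = StandardScaler().fit(X_train)
X_train_s, X_test_s = scaler.transform(X_train), scaler.transform(X_test)

lm_r = Ridge(alpha=0.5)          # alpha is the lambda penalty from the lectures
lm_r.fit(X_train_s, y_train)     # fit on the standardized training data

# R-squared on the test set: true values first, predictions second.
r2 = r2_score(y_test, lm_r.predict(X_test_s))
```

The argument order of r2_score matters: the actual y values come first, the predicted values second.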
Again, r2_score is a function of the sklearn library, so remember to import it before running this function. We will write y_test. Then we will write lm_r.predict, and in brackets we will write our test independent variables, which is X_test_s. Here, our y_test variable holds the original values, and the second argument is the predicted values of y on the test variables. You can see the R-squared value is 0.54.

Now, if you remember, we can also change the value of this lambda, or alpha, while running ridge regression, and we want to find out the value of alpha, or lambda, for which the R-squared value is maximum. For trying out such multiple values of hyperparameters, there is another function in sklearn, which is validation_curve. So we will first import validation_curve from the sklearn library, and then I will show you how to select your optimal model from multiple values of hyperparameters.

We first import validation_curve. We write: from sklearn.model_selection import validation_curve.

Now, let's first see the parameters we need for this validation_curve function. So we write validation_curve, and then we will put a question mark to open the help for this function.
If you see, our first parameter should be an estimator. So since we want to run ridge regression, we should write Ridge(). Our second parameter is the X variable, and our third parameter is the y variable; these are the independent and dependent variables. Our fourth parameter is param_name. This is the hyperparameter we want to change. In this case, we want to change the alpha value, so we write alpha.

Our fifth parameter is param_range. These are the values of the param_name parameter you want to give. So here we will give multiple values of alpha for our model.

Now, we are not going to discuss groups, cv and the other parameters, but one more parameter we want to give is scoring. We want our model to be scored using R-squared, so we will give scoring = "r2".

Now, to summarize, what we want to do is train our model for multiple values of lambda. So we will give multiple values of lambda to our model, and we want the best model out of those lambda values. So we will run our model multiple times and select the best model out of all of those models.

So first, let's create an array of our lambda values.
We will write param_range; this is our variable. We will equate it to np.logspace, and then in brackets we will write minus two, comma, eight, comma, one hundred. Let me explain this logspace function: it will create a hundred values between ten to the power minus two and ten to the power eight.

Now, if I just view the values of param_range, you can see we have created a hundred different values between ten to the power minus two and ten to the power eight. We will use these values as the values of our lambda for our model, and then we will select the best lambda value out of these different lambda values.

Now let's run the validation curve for our model. We saw validation_curve earlier; the output of validation_curve is in the form of two arrays. First is the training score, and second is the test score. I told you earlier that we will be scoring on R-squared, so train_scores and test_scores will contain the R-squared values for all these different models.

Let's write train_scores, comma, test_scores; these are our two variables. We equate them to validation_curve. Remember, the first parameter of validation_curve was the estimator.
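The logspace call above can be sketched directly; the endpoints and count match the lecture.

```python
import numpy as np

# 100 candidate lambda (alpha) values, evenly spaced on a log scale
# between 10**-2 and 10**8.
param_range = np.logspace(-2, 8, 100)
```

Spacing the candidates logarithmically is the usual choice for a penalty strength, because useful alpha values can differ by many orders of magnitude.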
So here our estimator is ridge; we will write Ridge(). The second parameter was the X value, so we will write X_train_s. The next parameter is our y value, y_train. The next parameter was our param_name; we will write alpha in double quotation marks, since we want to change the value of this alpha. And then the next parameter was param_range; we created a param_range variable using the logspace function, so we will use this variable. Next, we want our scoring to be R-squared, so we write scoring = "r2". And run this.

We have created this model for all these lambda values. Now we will just print our training scores and test scores.

Notice we have not given our test set to this validation_curve. So how is this validation_curve calculating the test scores? It is actually running a cross-validation, or k-fold validation, on our training data. So for each value of lambda, we are getting three values of R-squared. That is because validation_curve is running k-fold validation behind the scenes.

Let's just take the mean score of these three values. So we write train_mean, equal to np.mean.
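The validation_curve call described above can be sketched like this. The data is a hypothetical stand-in; cv=3 is an assumption made here so that each lambda gets three cross-validation scores, as in the lecture (older sklearn versions used 3 folds by default).

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import validation_curve

# Hypothetical stand-in training data.
rng = np.random.default_rng(0)
X_train_s = rng.normal(size=(60, 4))
y_train = X_train_s @ rng.normal(size=4) + rng.normal(scale=0.5, size=60)

param_range = np.logspace(-2, 8, 100)

# One ridge fit per (lambda, fold) pair; scores come from 3-fold CV
# on the training data only -- the test set is never touched here.
train_scores, test_scores = validation_curve(
    Ridge(), X_train_s, y_train,
    param_name="alpha", param_range=param_range,
    scoring="r2", cv=3,
)

# Average the fold scores: one mean R-squared per lambda value.
train_mean = np.mean(train_scores, axis=1)
test_mean = np.mean(test_scores, axis=1)
```

Each row of the returned arrays corresponds to one lambda value and each column to one cross-validation fold, which is why the mean is taken along axis 1.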
We will write train_scores, comma, axis equal to one, since we want the averages to be along the rows. Similarly, we take the mean values of our test scores.

If we view this test mean and train mean, these are the hundred R-squared values for our hundred lambdas. Similarly, we have a hundred values in our test score also. You can also compare them with our lambda values: for each of our lambda values, these are the R-squared values we are getting.

Now, we are only interested in the model which has the highest R-squared value. So we will write max of test_mean. This will give me the highest R-squared value for my model, which is 0.7246. If you remember, earlier our R-squared value was around 0.54 only; now we have improved it to 0.72.

Now, another way to look at these R-squared values is to plot a graph of this data. Let's plot a scatterplot of this data along with our lambda values. We will write sns.jointplot. And since our lambda values are in exponential form, we will take a log of those values. Our y is test_mean.
So this is the graph of the R-squared values. You can see that as we increase our lambda, the R-squared value is slightly increasing, and after this point the R-squared starts to decrease. And then our R-squared value stays around zero for very high lambda values. You can see we are getting the maximum R-squared value around this point.

So our next objective is to find this point and find the lambda value for this point. We saw that the R-squared value at this point is 0.72466.

Now, to find the location of this value, 0.72466, we will write np.where. Then our test_mean: we are just finding where test_mean is equal to this value. We use a double equals sign and write max of test_mean. We are just finding out the index where the test_mean value is equal to the maximum of the test_mean values. If we run this, you can see the location of this maximum test_mean is at the 33rd index.

So let's find out the lambda value at this 33rd index. We will write param_range, and then in square brackets we will write 33. If we run this...
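The index lookup above can be sketched like this. The test_mean curve is again a hypothetical stand-in for the validation_curve output; the np.where pattern mirrors the lecture, and np.argmax is shown as the equivalent one-step alternative.

```python
import numpy as np

param_range = np.logspace(-2, 8, 100)
# Hypothetical mean CV scores standing in for test_mean from validation_curve.
test_mean = 0.72 - 0.01 * (np.log10(param_range) - 1.5) ** 2

# The lecture's approach: find the index where the score equals its maximum.
idx_where = int(np.where(test_mean == np.max(test_mean))[0][0])

# np.argmax does the same lookup in one step.
best_idx = int(np.argmax(test_mean))

# The lambda value that achieved the best cross-validated R-squared.
best_alpha = param_range[best_idx]
```

np.where returns a tuple of arrays (one per dimension), which is why the [0][0] indexing is needed to get a plain integer; np.argmax avoids that.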
...this is the lambda value at which we are getting the maximum R-squared value.

Now that we have the lambda value for which we have the maximum R-squared value, we will fit our ridge model again for this lambda. We will write lm_r_best, equated to Ridge(), and our alpha value is this lambda value.

With this, we have created the lm_r_best variable. Now, let's fit this on our training dataset: then our independent variables, which is X_train_s, and then our y variable.

With this, we have trained our best ridge model. Now let's find out the R-squared value on our test data. We will use r2_score, then y_test, and then lm_r_best.predict, then X_test_s.

Now, let's find out the R-squared value for our training data. With this, our R-squared value for our training data is around 0.754, and for our test data it is 0.540.
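The whole refitting step can be sketched end to end. Everything here runs on hypothetical stand-in data; the pipeline (standardize, validation_curve over a logspace of alphas, refit Ridge at the best alpha, score train and test) mirrors the lecture, and the exact R-squared values will differ from the ones quoted above.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split, validation_curve
from sklearn.preprocessing import StandardScaler

# Hypothetical stand-in data.
rng = np.random.default_rng(1)
X = rng.normal(size=(120, 6))
y = X @ rng.normal(size=6) + rng.normal(scale=0.5, size=120)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

scaler = StandardScaler().fit(X_train)
X_train_s, X_test_s = scaler.transform(X_train), scaler.transform(X_test)

# Cross-validate 100 lambda values on the training data only.
param_range = np.logspace(-2, 8, 100)
_, test_scores = validation_curve(
    Ridge(), X_train_s, y_train,
    param_name="alpha", param_range=param_range, scoring="r2", cv=3,
)
best_alpha = param_range[int(np.argmax(test_scores.mean(axis=1)))]

# Refit at the best lambda, then score on both sets.
lm_r_best = Ridge(alpha=best_alpha).fit(X_train_s, y_train)
r2_train = r2_score(y_train, lm_r_best.predict(X_train_s))
r2_test = r2_score(y_test, lm_r_best.predict(X_test_s))
```

The held-out test set is used only once, at the very end, to report the final R-squared; all model selection happens inside the cross-validation on the training data.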
You will not see any major difference in the R-squared value whether we run ridge or simple linear regression in our case, because our dataset is very small, and it will hardly make any difference which algorithm or which technique we use to train our model. In case we have hundreds or thousands of variables and our dataset is very large, these methods would really enhance our model.

Just to recap all the things we did: first, we standardized our X variables. We created the standard scaler, and we used this scaler to transform our X_train and X_test data. After that, we executed the ridge model, and there we used a constant alpha of 0.5. After that, we created the range of alphas which would go into our model: a range of a hundred lambdas from ten to the power minus two to ten to the power eight. Then we used validation_curve to find the optimal model. After that, we calculated the lambda value corresponding to that model, and then we executed the ridge model again for this value of lambda. And then we calculated the R-squared values on our training dataset and test dataset.

Now, running the lasso is almost similar; we just have to import Lasso instead of Ridge.
We will import Lasso from the sklearn.linear_model library, and then we will similarly create the lasso object, as we created the lm_r ridge object earlier. Here, we will create an lm_l object. Similarly, we will give our alpha values.

As you can see, executing the lasso is very similar to ridge. We are not executing the lasso here; I recommend you execute it on your own and find the optimal value of alpha using validation_curve.

Thanks.
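As a starting point for that exercise, here is a minimal Lasso sketch on hypothetical stand-in data. The object name lm_l follows the lecture's naming; the data and alpha value are assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Hypothetical data where only two of five features actually matter.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5))
y = X @ np.array([2.0, 0.0, 0.0, 1.0, 0.0]) + rng.normal(scale=0.3, size=80)

lm_l = Lasso(alpha=0.5)   # same interface as Ridge, but an L1 penalty instead of L2
lm_l.fit(X, y)

# Unlike ridge, the L1 penalty can drive coefficients exactly to zero.
n_zero = int(np.sum(lm_l.coef_ == 0.0))
```

This zeroing of coefficients is the practical difference from ridge: the lasso performs variable selection, while ridge only shrinks coefficients toward zero.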