Alright, modelling part 3: tuning. Once you're happy with your model's initial performance on your training dataset, the next step is to try and improve it. Like we said in the last lesson, just as a car can be tuned for different styles of driving, a model can be tuned for different types of data; specifically, your data. Usually this tuning will take place on a validation data split. However, if you don't have access to a validation set, due to how you've done your training, validation and test data split, it can also happen on the training data. Many models have different hyperparameters which can be adjusted.

You can think of these as being like the dials on an oven. For example, when you're learning to cook your favourite chicken dish, that delicious sweet honey mustard dish (I'm getting hungry making these lectures), you notice that cooking it at 180 degrees for an hour meant it came out a little raw. Not ideal, right? I want this chicken to be nice and crispy. But at 200 degrees, that's where we get the glory. At 200 degrees for an hour, my chicken came out just right. So there's a little dial here; this is how you tune your oven. You can change it from 180 degrees up to 200 degrees. If you'd left it at 180 degrees, the chicken doesn't turn out well; take it up to 200 degrees and the chicken does turn out well.

Depending on what kind of model you're using, the hyperparameters you can tune will be different. For example, a random forest will allow you to adjust the number of trees: in this one we've got three trees, and in this one we've got five trees. We'll have a look at the random forest, a type of machine learning algorithm, in a future lesson. And a neural network will allow you to adjust the number of layers. Again, we'll look at a neural network in a future lesson; I'm just giving an example of different hyperparameters, like the temperature of an oven, that you can adjust on different kinds of algorithms.

The main things for you to remember are: machine learning models have hyperparameters you can adjust; however, depending on what model you're using, the hyperparameters will be different. The goal of tuning hyperparameters is to improve your model's performance, and you should do model tuning on your training and/or validation sets. That means you train a model using an initial set of hyperparameters.
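To make that concrete in code, here is a rough sketch (not from the lecture itself; it assumes scikit-learn and uses a made-up toy dataset) of trying a few different values of one random forest hyperparameter, the number of trees, training on the training split and scoring each setting on a validation split:

```python
# Minimal sketch, assuming scikit-learn is available; the dataset and split sizes are illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Toy dataset standing in for "your data".
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Train / validation / test split (roughly 70/15/15).
X_train, X_temp, y_train, y_temp = train_test_split(X, y, test_size=0.3, random_state=42)
X_val, X_test, y_val, y_test = train_test_split(X_temp, y_temp, test_size=0.5, random_state=42)

# Try a few settings of one hyperparameter: the number of trees (n_estimators).
for n_trees in [3, 5, 100]:
    model = RandomForestClassifier(n_estimators=n_trees, random_state=42)
    model.fit(X_train, y_train)              # fit on the training split
    val_score = model.score(X_val, y_val)    # judge this hyperparameter setting on the validation split
    print(f"{n_trees} trees -> validation accuracy: {val_score:.3f}")
```

Whichever setting does best on the validation data is the one you'd carry forward; the test set stays untouched until the final comparison step.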
This initial set of hyperparameters might just be the defaults. Say, for example, you put your chicken ingredients in the oven, you press "roast chicken" on the oven, and it goes for an hour on the oven's default settings. If you then find it doesn't turn out how you want it to, you might adjust the settings as you go. This is exactly what you do with a machine learning model. You might try a machine learning model with an initial set of hyperparameters, the defaults, and it turns out it does okay, but not exactly how you want it. So you might try and improve it by adjusting some of the hyperparameters.

Up next, we've got model comparison. Now, this happens during your experimentation. So you've trained multiple different models on the same dataset, and we're going to look at how you might compare those; this is done on the test data. So let's do it.
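As with the tuning example above, here is a small sketch of what that comparison step might look like (again assuming scikit-learn; the toy dataset and the two model choices are purely illustrative, not from the lecture): multiple models trained on the same training data and compared on the same held-out test data.

```python
# Minimal sketch of model comparison, assuming scikit-learn; dataset and model choices are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Multiple different models, all trained on the same dataset...
models = {
    "random forest": RandomForestClassifier(n_estimators=100, random_state=42),
    "logistic regression": LogisticRegression(max_iter=1000),
}

# ...and compared on the same test data, which none of them saw during training or tuning.
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```

The key point is that every model is fit on the same training data and judged on the same test data, so the comparison is like for like.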