Hello. Before going into this session, let's have a quick recap of what we have done in this chapter so far. We covered the basics of KNN: what it is, what to do and what not to do with it, when to use it, and how to compute the value of K using cross-validation. Then we solved a use case using Euclidean distance, and we saw how to predict using probability. We also covered Manhattan distance, and how tools like Google Maps relate to these distance metrics. After that we built the mathematical intuition, and we learned Hamming distance. We have also learned why KNN is called a lazy learning algorithm, and how to compute the distance between categorical data points. So in this session, we will learn what types of real-life use cases you can go ahead and solve with KNN. It is highly used, highly used in recommender systems. Whenever you have to recommend something, let's say, let's talk about Amazon. Let's see how Amazon recommends a product.
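The three distance metrics recapped above can be sketched in a few lines of Python. This is a minimal sketch with made-up example vectors, not code from the lecture:

```python
def euclidean(a, b):
    # straight-line distance: square root of the sum of squared differences
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def manhattan(a, b):
    # "city block" distance: sum of absolute differences along each axis
    return sum(abs(x - y) for x, y in zip(a, b))

def hamming(a, b):
    # for categorical data: count the positions where the values differ
    return sum(x != y for x, y in zip(a, b))

print(euclidean([0, 0], [3, 4]))            # 5.0
print(manhattan([0, 0], [3, 4]))            # 7
print(hamming(["red", "S"], ["red", "M"]))  # 1
```

Hamming distance is the one we use when the features are categories rather than numbers, which is the case we discussed for categorical data.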
Well, whenever you browse on Amazon, how exactly does Amazon recommend a product? They basically use KNN in the background. Yeah, definitely. Approximately 35 percent of Amazon's revenue, approximately 35 percent, is due to this recommender system. So that is just Amazon, and Amazon uses these nearest-neighbor techniques extensively. Let's talk about our second real-life example. Let's say I have some documents, a number of documents, multiple documents, and each of them contains some news. Let's say these are all the documents on the internet. If, from these documents, I have to find the documents that contain similar topics, at that time I can use KNN, and I can compare documents extensively. Let's say I search on Google, let's say I search: "What is data science?" You will see you have some resources, some top results for your query.
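The "find documents with similar topics" idea can be sketched as a toy nearest-neighbor search: turn each document into a word-count vector, then rank documents by their distance to the query. The documents and query here are hypothetical examples, not data from the lecture:

```python
# Toy nearest-neighbour document search over word-count vectors.
from collections import Counter

docs = {
    "doc1": "data science uses statistics and code",
    "doc2": "machine learning is part of data science",
    "doc3": "football match results and scores",
}
query = "what is data science"

# shared vocabulary so every text maps to a vector of the same length
vocab = sorted({w for text in docs.values() for w in text.split()} | set(query.split()))

def vec(text):
    counts = Counter(text.split())
    return [counts[w] for w in vocab]

def dist(a, b):
    # Euclidean distance between two count vectors
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

q = vec(query)
ranked = sorted(docs, key=lambda name: dist(vec(docs[name]), q))
print(ranked[0])  # → doc2, the document sharing the most words with the query
```

Real search engines use far more sophisticated representations, but the nearest-neighbor ranking step is the same idea.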
Here, they use this nearest-neighbor idea to showcase those results. Similarly, let's say, in image recognition, and similarly in video recognition, KNN is a highly used algorithm in these types of use cases. So let's talk about some advantages and disadvantages of using KNN. Yeah. So let's first talk about the pros of using KNN. Yeah, definitely, it is used for both classification, where it is most used, and it is used for regression as well. We have all learned how it works in the case of classification, so let's talk about how exactly it works in the case of regression. Suppose, suppose I have this use case, and here, let's say, my neighbors are scattered over here: this is my class one, this is my class two. Let's say this is my new data point, and I have to find a value for it; let's say this is a use case of predicting a price.
So what will it do? Let's say my selected value of K is three. It will consider the three nearest neighbors, and it will basically take the mean of all these data points, the mean of all these data points. So whenever you have regression, you just consider the mean, and whenever you have classification, you go with the mode, or you can also consider the majority, or the probability. That's all about it. So basically, one of the pros is that it is used in both classification and regression. And another pro that I really feel matters is the mathematics behind this algorithm: it is definitely easy in comparison to all the other algorithms of machine learning. And definitely, one of the pros is that it is very easy, very easy to implement as well, because you have a class in the neighbors module, in the neighbors module of scikit-learn. You just have to import it, give it your data, tell it what exactly the value of K is, and you will get your output. That's it. That's just like a piece of cake.
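The mean-versus-mode rule above can be written as one small function. This is a minimal sketch, with made-up points, prices, and labels purely for illustration:

```python
# Minimal KNN: mean of the k nearest targets for regression, mode for classification.
from collections import Counter

def knn_predict(points, targets, query, k=3, classify=False):
    # rank training points by squared Euclidean distance to the query
    order = sorted(range(len(points)),
                   key=lambda i: sum((p - q) ** 2
                                     for p, q in zip(points[i], query)))
    nearest = [targets[i] for i in order[:k]]
    if classify:
        return Counter(nearest).most_common(1)[0][0]  # mode of the k labels
    return sum(nearest) / k                           # mean of the k values

pts = [(1, 1), (2, 2), (3, 3), (10, 10)]
prices = [100.0, 110.0, 120.0, 500.0]
print(knn_predict(pts, prices, (2, 1)))                 # mean of 100, 110, 120 → 110.0
labels = ["cheap", "cheap", "cheap", "expensive"]
print(knn_predict(pts, labels, (2, 1), classify=True))  # → "cheap"
```

In scikit-learn, the same two behaviors come ready-made as `KNeighborsRegressor` and `KNeighborsClassifier` in the `sklearn.neighbors` module: you import the class, fit it on your data with your chosen K, and call `predict`.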
So it is definitely very easy, very easy to implement. What are the other pros? Well, definitely, it will not take that much time, not that much time, in the training phase. So I can say definitely, as it is a lazy learning algorithm, which we have all learned, that training-wise it will not take as much time as most other algorithms will. So definitely that is one of the advantages of KNN. Now let's talk about some of the cons. From the cons, definitely, one of them is that it is not recommended whenever you have some huge data. And yeah, it is not good for high-dimensional data. Whenever you have some high-dimensional data, let's say ten-dimensional data, you definitely have to compute distances. Just imagine, just imagine you have to compute the distance from the new point to each and every data point. If the data is ten-dimensional, just imagine how computationally expensive that will be. That's why, that's why it is not good for high-dimensional data. And yeah, it is definitely expensive in the testing phase, or I can say it is definitely expensive for prediction purposes.
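That prediction cost can be made concrete: brute-force KNN computes one distance per stored training point for every query, i.e. O(n·d) work for n points in d dimensions. A small sketch with made-up sizes that simply counts the distance computations one prediction triggers:

```python
# Why prediction is expensive: one distance per stored point, per query.
n_points, n_dims = 1_000, 50

distance_computations = 0

def dist(a, b):
    global distance_computations
    distance_computations += 1  # tally every distance we compute
    return sum((x - y) ** 2 for x, y in zip(a, b))

data = [[float(i + j) for j in range(n_dims)] for i in range(n_points)]
query = [0.0] * n_dims

# a single prediction scans the entire training set
nearest = min(range(n_points), key=lambda i: dist(data[i], query))
print(distance_computations)  # 1000 — and each one cost 50 subtractions
```

Training, by contrast, is just storing the data, which is exactly the "lazy learning" point above: the work is deferred to prediction time.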
It is definitely very, very expensive, because it takes a lot of resources and a lot of time at prediction to compute the distances and find the K nearest neighbors. Yeah, so these are all the pros and cons of KNN, and how exactly it is used in the case of regression and classification. So we have covered KNN to a great extent. That's all about this algorithm. Hope you loved this session and this algorithm as well. Thank you. Keep learning, keep growing, keep practicing.