In the last lecture, we discussed how the support vector classifier overcomes the limitations of the maximal margin classifier. It does so by introducing a new parameter called cost, which allows some amount of misclassification. But the support vector classifier still has one limitation: it only works when the observations are linearly separable.

We have discussed only linear hyperplanes, but there may be scenarios where the classes are not linearly separable, such as a scenario where the points are distributed like this. By visual inspection, we can clearly see that these classes are well separated. If I draw a circle here and say that anything within this circle is going to be purple and anything outside will be blue, that would be a much better classifier than the one on the right.

So to handle this limitation of linearity, we generalize further and use something known as the kernel method to arrive at support vector machines. Using support vector machines, we will be able to draw nonlinear boundaries as well. We'll be discussing support vector machines in the next video.
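The cost parameter recapped above can be sketched in code. The lecture names no library, so as an assumption this uses scikit-learn, where the cost idea corresponds to the regularization parameter `C` of `SVC`; the data is synthetic, purely for illustration:

```python
# Minimal sketch of a soft-margin support vector classifier.
# Assumption: the lecture's "cost" parameter corresponds to C in
# scikit-learn's SVC — a large C penalizes misclassification heavily,
# while a small C tolerates more misclassified training points.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two roughly linearly separable clusters with slight overlap.
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)),
               rng.normal(2.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="linear", C=1.0)  # linear decision boundary
clf.fit(X, y)
print(clf.score(X, y))  # training accuracy
```

Raising or lowering `C` here trades margin width against training errors, which is exactly the flexibility the support vector classifier adds over the maximal margin classifier.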
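The circle-shaped boundary described above is exactly the kind of decision surface a kernel SVM can learn. A hedged illustration on synthetic data (not the lecture's figure), comparing a linear support vector classifier against an RBF-kernel SVM when one class sits inside a circle:

```python
# Sketch: a kernel SVM learning a circular (nonlinear) boundary.
# Synthetic data for illustration: points inside a circle of radius
# 1.5 are one class, points outside are the other — no straight
# hyperplane can separate them, but an RBF kernel handles it.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.uniform(-3.0, 3.0, (200, 2))
y = (np.linalg.norm(X, axis=1) < 1.5).astype(int)  # 1 = inside circle

linear_clf = SVC(kernel="linear").fit(X, y)  # limited to a straight line
rbf_clf = SVC(kernel="rbf").fit(X, y)        # kernel method
print(linear_clf.score(X, y), rbf_clf.score(X, y))
```

The RBF model's training accuracy is far higher than the linear one's on this data, mirroring the lecture's point that the kernel method lifts the linearity limitation.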