So Wikipedia has a pretty neat API that allows you to grab information and images from the Wikipedia site. If you head over to mediawiki.org, you can see a description of how the Wikipedia API works. They give you a simple example of how you can make a GET request looking for this exact page, the API main page, and it also shows you the endpoint and all of the various parameters.

What we want to do is pull back data about the Barberton Daisy, and in order to do that, we have to construct our API call. So let's have a look at this URL, so that we can break down all of its various parameters. This is the base URL, and it's provided over here as well, as you can see. It's the same endpoint, and it goes to the English version of Wikipedia. Everything after that php is where our parameters come in: the parameters are preceded by a question mark, and then we start listing the keys and the values.

The first key is the format of the data you want to receive, and we want it in the form of JSON. The next parameter is the type of action you're trying to perform; we're trying to perform a query, because we're looking for something specific. Then there's the part that specifies that we want the extract, and just the intro as plain text, not the whole article, so that we can cut down the amount of data that we're transferring. Some Wikipedia articles can go on for reams and reams and reams, and we don't want to import all of that data. All we want is that top explainer text that you always get at the beginning of every single Wikipedia article.

The last parameter is this thing called titles, and this is the thing that we're going to search for. In my case, it's the Barberton Daisy. You might wonder what that %20 is. Well, if you delete it, replace it with a space, and hit enter, you'll see that %20 is just the URL-encoded ASCII code for a space.

So when we hit enter with all of that, we get JSON back. Let's visualize this inside a JSON editor, which will make it a lot easier on our human eyes. If I expand all of these keys, you can see that it's a tree structure, and we have to go quite a few levels deep before we actually get to the information that we want, which is this extract.
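Just to make that concrete, here's a small sketch of how those pieces fit together in Swift. The endpoint and parameter names are the ones we've just walked through; "Barberton Daisy" is simply the example title from this lesson, and building the URL this way is just one way of seeing where that %20 comes from.

import Foundation

// A sketch of the URL we've been breaking down, built up from its pieces.
var components = URLComponents(string: "https://en.wikipedia.org/w/api.php")!
components.queryItems = [
    URLQueryItem(name: "format", value: "json"),            // give us JSON back
    URLQueryItem(name: "action", value: "query"),            // we're performing a query
    URLQueryItem(name: "prop", value: "extracts"),           // we want the page extract...
    URLQueryItem(name: "exintro", value: ""),                // ...just the intro section...
    URLQueryItem(name: "explaintext", value: ""),            // ...as plain text, not HTML
    URLQueryItem(name: "titles", value: "Barberton Daisy")   // the page we're searching for
]

// Printing the finished URL shows the space encoded as %20:
// https://en.wikipedia.org/w/api.php?format=json&action=query&prop=extracts&exintro=&explaintext=&titles=Barberton%20Daisy
print(components.url!)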
Now, if you remember our lesson on JSONs, then you'll realize that if our result was called result, then in order to get access to this extract, we have to specify all of the keys that led us there. Everything inside the outer object is the JSON. So the first level is the query, right? So it's result, then ["query"]. Then the next level after query will be pages, so we'll write ["pages"], right?

But then we're kind of stuck. What is this number? In order to get any further within pages, we have to specify this number, and this number is the page ID, as you can see here. But we wouldn't know the page ID when we're searching for this particular flower, so how can we go any further? It seems like we're a bit stuck, right?

Well, what we can actually do is add another parameter, and this is called indexpageids; I found this in the documentation. Now the JSON that we get back actually contains a reference to that page ID. You can see that it's still query, but now you've got pageids as well as pages. So we can first work out the page ID by saying result["query"]["pageids"][0], the first item, and that is going to give us this 102158. Then we can store it inside a constant, say pageid, and use that pageid here. That allows us to go further down, into the extract, and the result of this JSON parsing will be this extract here, which we'll be able to display inside a label. There's a short sketch of that parsing path a little further down.

So that's just a little workaround for this particular structure of Wikipedia's API response. Now, there's just one more problem with our HTTP request. You can see that here it says that this is a redirect from a vernacular name to the scientific name, and the extract doesn't actually give us what we wanted, which is the information about this particular flower. Instead, it's telling us that this term is a vernacular term and the article actually lives under the scientific name. So we can modify our URL to make that redirect happen for us. To do that, we're going to add yet another parameter, and this is called redirects, and it's going to equal 1, which stands for true.
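Before we try that out, here's the little parsing sketch I mentioned a moment ago. It's a rough Swift version of the path we just traced, from the result down through query, pageids and pages to the extract. I'm assuming SwiftyJSON here, since its subscript syntax matches what we wrote above, and result just stands for whatever you've called the JSON that came back.

import SwiftyJSON

// Digs out the extract, following the path described above.
func flowerDescription(from result: JSON) -> String {
    // indexpageids gives us this little array, so we can look up the page ID first...
    let pageid = result["query"]["pageids"][0].stringValue
    // ...and then use that page ID as the key that takes us down to the extract itself.
    return result["query"]["pages"][pageid]["extract"].stringValue
}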
Now, if you hit enter with that redirects parameter added and have a look at that extract of ours, you can see it's actually taking us to the information that we want.

Inside this lesson, we've provided a text file for you to download, and it contains everything that you need in order to make that API call. We've got the base URL for you; this is the English version of Wikipedia, and it's the URL that you're going to build on top of in order to access the data that we've got here. We've also created a dictionary of items that will serve as the parameters for your HTTP request, and it includes everything that's in this URL: the format, the action, the properties, the titles, the indexpageids, as well as that redirects equals true.

So now it's time for your challenge. Can you use the information that you learned in Clima and in FlashChat to form an HTTP request using Alamofire, passing it the parameters and the URL? You should be able to print this entire JSON into your debug area. And if you can see in your console exactly what I see here on screen, then you'll have succeeded. So go ahead and give it a try, and if you really get stuck, I'll be there in the next lesson to give you the solution.
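And just to make the shape of that challenge a bit more concrete, here's a bare-bones sketch of the kind of call you'll end up writing. This is only my sketch, not the official solution, and it assumes the Alamofire 4 style syntax used in Clima (newer versions of Alamofire spell the entry point AF.request instead); the parameter values are the ones from the URL we built above.

import Alamofire
import SwiftyJSON

func requestInfo(flowerName: String) {
    let wikipediaURL = "https://en.wikipedia.org/w/api.php"
    let parameters: [String: String] = [
        "format": "json",
        "action": "query",
        "prop": "extracts",
        "exintro": "",
        "explaintext": "",
        "titles": flowerName,
        "indexpageids": "",
        "redirects": "1"
    ]

    // Make the GET request and print the whole JSON to the debug area.
    Alamofire.request(wikipediaURL, method: .get, parameters: parameters).responseJSON { response in
        if response.result.isSuccess {
            // Wrapping the raw value in SwiftyJSON just makes it print nicely.
            let flowerJSON = JSON(response.result.value!)
            print(flowerJSON)
        }
    }
}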