Welcome to part seven of this module. Now it's time to go over Skipfish.

It is located under Applications > Web Application Analysis. We can also open Skipfish in a terminal window by typing skipfish -h, and that will give us a readout of all of the help options. As you can see, there are many ways to customize how this tool functions, but strictly speaking, this program runs in a very direct and simple way: you supply it a target and start the lengthy process of scanning it.

So, to save time, I'm going to launch Skipfish right away and explain while it does its thing. Once you get a feel for Skipfish and what it does, you can add some of these additional options to the command line, and we'll be talking about those towards the end of the video. To run Skipfish in the simplest, most default way possible, we type skipfish, then -o (lowercase o) for the output directory, and we then specify that directory, which in this case is going to be the desktop. This is where the results of our scan are going to be stored. Then we supply it with the target, which would be the website that you're scanning, of course in this case the Metasploitable virtual machine, and we press Enter.

This screen appears every time we initially run Skipfish; we can press any key to continue, or just wait 60 seconds. Right away, when we press Enter, Skipfish is going to create a results folder on the desktop. You'll notice that once again our target is the Metasploitable 2 virtual machine running on the network, but you could replace the 10.0.0.8 with any website URL that you wish. Just remember to never run this tool against any target that you do not either personally own or have written permission to test. Otherwise, you could be breaking the law.
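For reference, those two commands would look something like this; the exact output path is my assumption about where a desktop results folder would live on Kali, and 10.0.0.8 stands in for the Metasploitable target:

    skipfish -h
    skipfish -o /root/Desktop/results http://10.0.0.8/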
With the process under way, I'll elaborate a bit on what is going on here. Skipfish is an active web application security reconnaissance tool. It was designed and built by, get ready for it, Google itself. It comes prepackaged with Kali 2.0, and it's been around since the early days of BackTrack 5. This is one of those ancient tools that has been kept alive by user enthusiasm. It prepares an interactive site map for the target site by carrying out a recursive crawl and dictionary-based probes. The resulting map is then annotated with the output from a number of active, but hopefully non-disruptive, security checks. The final report generated by the tool is meant to serve as a foundation for professional web application security assessments, although it is just as often of use to hackers wishing to find vulnerabilities in a target site that can be leveraged and exploited.

As I'm speaking, Skipfish is performing a wide range of these tests against our simulated target. These are noisy and noticeable. The scans can take anywhere from a few hours to a few days to complete, depending on the complexity of the target. Unfortunately, one drawback of this tool is that you won't know going in just how long any such scan might actually take, and this can be a real problem for penetration testers who have to adhere to a tight schedule. The tool boasts high performance figures, but it tends to be a bit of a hog when it comes to resources, and crawling over an entire site can slow things down measurably.

I mentioned that this tool uses word lists. These come prepackaged with the tool, and I'll speak more about them when the process is complete. For now, just keep in mind that you can use the ones that come with the tool or you can supply your own. However, Skipfish is a little weird in that it likes to make changes to these lists as it goes, so make sure you have a backup of whatever list you decide to use, if this is a consideration for you.
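If you do point it at a list of your own, a safe pattern is to work on a copy. A minimal sketch, assuming the stock dictionaries live under /usr/share/skipfish/dictionaries/ on Kali and that -W is the flag naming the read-write list Skipfish is allowed to modify:

    cp /usr/share/skipfish/dictionaries/minimal.wl /root/my-wordlist.wl
    skipfish -o /root/Desktop/results -W /root/my-wordlist.wl http://10.0.0.8/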
Alrighty, so by now you're probably wondering why we should even bother with this particular tool. A number of commercial and open-source tools with the exact same functionality are readily available, such as Nikto, Nessus, and so on. Compared to them, Skipfish seems like a bit of a blunt instrument, and for the most part, it is. I don't really recommend Skipfish over the other major tools. However, it's been kept around for a long time because it does have a few selling points.

Skipfish addresses some common problems that tend to be associated with web security scanners. For one thing, it does offer high performance: 500-plus requests per second against responsive Internet targets, 2,000-plus requests per second on local area networks, and 7,000-plus requests per second against local instances have actually been observed, all while leaving a modest network and memory footprint. This can be attributed to a multiplexed, single-threaded, fully asynchronous network I/O and data processing model that eliminates the memory management, scheduling, and IPC inefficiencies present in multi-threaded clients. Remember, this was developed by Google engineers. It also offers advanced HTTP/1.1 features such as range requests, content compression, and keep-alive connections, as well as forced response size limiting, to keep network-level overhead in check. Small response caching and advanced server behavior heuristics are used to minimize unnecessary traffic. It is written in pure C and includes a custom HTTP stack. If all of that sounded a bit like I'm trying to sell you a used car, I do apologize, but that is kind of what this tool is: it's old, but the engine still runs well, and that's why it's kept around.

In terms of ease of use, Skipfish is highly adaptive. The scanner features heuristic recognition of obscure path- and query-based parameter handling schemes, and graceful handling of multi-framework sites where certain requests obey completely different semantics or are subject to different filtering rules. It also engages in automatic word list construction based on site content analysis; that is what I meant when I said it likes to edit your word lists. Probabilistic scanning features also allow periodic, time-bound assessments of arbitrarily complex sites. Finally, it has well-designed security checks, as the tool is meant to provide accurate and meaningful results. Google developed it to improve the safety of websites and made it as simple as possible to use, though of course its black-hat usage is considerable.

With all that being said, Skipfish is not a silver bullet, and it may be unsuitable for certain purposes. For instance, it does not satisfy most of the requirements outlined in the Web Application Security Scanner Evaluation Criteria, and unlike most other programs of this type, it does not come with an extensive database of known vulnerabilities for banner-type checks. Also be aware that while this tool is used by security professionals, it is somewhat experimental in nature: it may produce false positives or miss obvious security problems even when it is operating perfectly, so do not take its output at face value. And last but not least, the scanner is simply not designed for dealing with rogue or misbehaving HTTP servers, so if you try it against them, you've been warned. To end the scan early, just press Ctrl+C.
One way or another, once the process is complete or interrupted, Skipfish will generate a report by creating files in the path we specified when we ran the program, so in this case it's going to be located on the desktop.

If we open up the results, we can see that the output is actually pretty complex, but to view it in the simplest terms we just double-click on the index.html file. This will open our default web browser and present us with a report. Of course, because our target was a Metasploitable machine, there are going to be a lot of flaws and vulnerabilities. For more information on a particular problem, just click on it to expand it. We could open this in another tab, which I will do, and we can see all of the potential problems, of which there are many. Going back to the scan results: interesting files, incorrect or missing character sets, XSS vectors in the document body. Most of these have to do with Mutillidae, which is built to be exploitable, but you get the idea. Now, of course, against a real machine, hopefully you're not going to see anywhere near this many results, but the ones you do see should be very enlightening. You can then look online and read about a particular vulnerability if the information that the report presents is inadequate by itself.

All right, so at this point you have enough information to run Skipfish against a target, and hopefully you have a good grasp of what this tool actually does. If you don't feel like this would be a useful addition to your toolkit, or you prefer other applications, you could end the video right here. For those of you who want more information, I shall now be elaborating on some of Skipfish's more detailed run options. This section of the module will be somewhat short on demonstration, as I'll be going over these one by one. As I said, you do have enough information that you could now start playing with this on your own, so continue if you wish, and if not, thank you for your attention.

To begin: if you don't want to store the learned keywords anywhere, which is to say you don't want Skipfish to mess around with editing your word lists, you can run Skipfish with the -W- option. It would look like this. Pardon my typing. Be aware also that you can run Skipfish with multiple starting URLs; all of them will be crawled.
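A rough sketch of both of those ideas together; the output path and the second port are my own illustrative choices:

    skipfish -W- -o /root/Desktop/results http://10.0.0.8/ http://10.0.0.8:8080/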
It is even possible to read your URLs from a file, using the following syntax: skipfish, then your other options, then an @ sign followed by the path, whatever the path happens to be, to your list, which might be url-list.txt, for example. The tool will display some helpful stats while the scan is in progress, and you can also switch to a list of the in-flight HTTP requests in real time by pressing the Return key.

So, in the examples you've seen, Skipfish scans the entire website or websites, including services on other ports if they're linked to from the main page. It then writes a report to the output directory that you specified. The index.html file that we launched to view the report is static, but keep in mind that the actual results are stored as a hierarchy of JSON files, suitable for machine processing or for different presentation front ends if need be. In addition, a list of all the discovered URLs will be saved in a single file called pivots.txt. A simple companion script, sfscandiff, can be used to compute a delta for two scans executed against the same target with the same flags. The newer report will be non-destructively annotated by adding a red background to all new or changed nodes and a blue background to all new or changed issues found.

Some sites may require authentication. In most cases you'll want to use the form authentication method, which is capable of detecting broken sessions in order to re-authenticate. Once re-authenticated, certain URLs on the site may log out your session. You can combat that in two ways. The first is by using the -N option, so that would look like: skipfish, -o with the desktop results directory, then -N, and then of course you'd supply your target. That causes the scanner to reject attempts to set or delete cookies. You can also use the -X parameter, which would look like -X /logout, and which prevents matching URLs from being fetched. The -X option is also useful for speeding up your scans by excluding the /icons, /doc, /manuals, and other standard, mundane locations like them. In general, you can use -X and -I to limit the scope of a scan any way you'd like, even restricting it to only a specific protocol or port.
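Pieced together, a session-friendly run followed by a comparison of two such runs might look something like this; the /logout path, the output directories, and my assumption that sfscandiff takes the old and new report directories are all illustrative:

    skipfish -o /root/Desktop/results-1 -N -X /logout http://10.0.0.8/
    skipfish -o /root/Desktop/results-2 -N -X /logout http://10.0.0.8/
    sfscandiff /root/Desktop/results-1 /root/Desktop/results-2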
For example: skipfish -I, followed by the URL substring to match, and then your target. The -I flag restricts the spider to only URLs matching that substring, and including something like :1234 in it limits the tool to scanning only port 1234, or whatever port you specify. Another related function is -K, which allows you to specify parameter names not to fuzz; this would be useful for applications that put session IDs in the URLs themselves.

Still another useful scoping option is -D, which allows you to specify additional hosts or domains to consider in scope for the test. By default, all hosts appearing in the command-line URLs are added to the list, but you can use -D to broaden these rules. For example: skipfish -D, and we'll say test2.example.com, then the output directory on the desktop, and then we would supply our target; there we go, it would look like that. Or you could add a domain wildcard match, which would look like this: -D .example.com, and then just whatever your output directory is (I don't want to retype it) and your target.

In some cases you do not want to actually crawl a third-party domain, but you trust the owner of that domain enough not to worry about cross-domain content inclusion from that particular location. Google Analytics would be a very popular example. To suppress those warnings, you can use the -B option, which would look like this: skipfish -B .google-analytics.com, then -B again with .googleapis.com, plus any other parameters that you like.

By default, Skipfish sends minimalistic HTTP headers to reduce the amount of data exchanged over the wire. Some sites examine user-agent strings or header ordering to reject unsupported clients, though. In such a case you can use the lowercase -b option, such as -b ffox or -b phone, to mimic one of the popular browsers or an iPhone; ffox, in other words, would be Firefox. When it comes to customizing your HTTP requests, you can also add the -H option to insert any additional non-standard headers, or the -F option to define a custom mapping between a host and an IP. The latter feature is particularly useful for any not-yet-launched or legacy services.
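As a sketch of those last two options together; the header name, the staging hostname, and the host-to-IP mapping are made-up values for illustration only:

    skipfish -o /root/Desktop/results -H X-Pentest=authorized -F staging.example.com=10.0.0.8 http://staging.example.com/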
Some sites may be too big to scan in a reasonable timeframe, particularly if you are under strict time constraints for your penetration test. If the site features well-defined tar pits, for example a hundred thousand nearly identical user profiles as part of a social network, those specific locations may be excluded with the -X and -S switches. In other cases you may need to resort to other settings: lowercase -d limits the crawl depth to a specified number of subdirectories, lowercase -c limits the number of children per directory, lowercase -x limits the total number of descendants per crawl-tree branch, and finally lowercase -r limits the total number of requests to send in a scan.

An interesting option is available for repeating assessments: -p. By specifying a percentage between 1 and 100, it is possible to tell the crawler to follow fewer than 100 percent of all the links and try fewer than 100 percent of all the directory entries. This naturally limits the completeness of a scan, but unlike most other settings it does so in a balanced, non-deterministic manner. It is extremely useful when you are setting up time-bound but periodic assessments of your infrastructure. Another related option is -q, which sets the initial random seed for the crawler to a specific value. This can be used to exactly reproduce a previous scan to compare results. Randomness is relied upon most heavily in -p mode, but it also drives a couple of other scan management decisions elsewhere.

Some particularly complex or broken services may involve a very high number of identical or nearly identical pages. Although these occurrences are by default greyed out in the report, they still use some screen real estate and take a while to process at the JavaScript level. In such extreme cases, you may use the -Q option to suppress reporting of duplicate nodes altogether before the report is written. This may give you a less comprehensive understanding of how the site is organized, but it has no impact on the test coverage itself.

Some sites that handle sensitive user data care about SSL, and about getting it right. Skipfish may optionally assist you in figuring out problematic mixed content or password submission scenarios; use the -M option to enable this. The scanner will complain about situations such as HTTP scripts being loaded on HTTPS pages, but will disregard non-risk scenarios such as images.
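A sketch of what a bounded, repeatable assessment could look like; the specific limits, the percentage, and the seed are arbitrary example values:

    skipfish -o /root/Desktop/results -d 5 -c 40 -x 200 -r 100000 http://10.0.0.8/
    skipfish -o /root/Desktop/results -p 25% -q 0x1a2b3c4d http://10.0.0.8/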
Likewise, certain pedantic sites may care about cases where caching is restricted at the HTTP/1.1 level but no explicit HTTP/1.0 caching directive is given. Specifying -E on the command line causes Skipfish to log all such cases carefully.

On some occasions you may want to limit the requests per second, either to limit the load on the target's server or possibly to bypass denial-of-service protection. The lowercase -l flag, pardon me, lowercase L, can be used to set this limit, and the value given is the maximum number of requests per second that you want Skipfish to perform. Scans typically should not take weeks; in many cases you probably want to limit the scan duration so that it fits within a certain time window. This can be done with the -k flag, which allows the number of hours, minutes, and seconds to be specified in hours:minutes:seconds format, like that. Use of this flag can affect the scan coverage if the scan timeout occurs before all pages have been tested. Lastly, in some assessments that involve self-contained sites without extensive user content, the auditor may care about any external e-mails or HTTP links seen, even if they have no immediate security impact; use the uppercase -U option to have these logged.

Dictionary management is a special topic. For more information on it, check out the dictionaries.txt file located in the tool's doc directory. Some of the relevant options for dictionaries include -S and -W, which we covered earlier, and -L to suppress auto-learning; that would be if you don't want the dictionaries to be edited as you go with keywords that Skipfish finds on the website. There is also -G to limit the keyword guess jar size, -R to drop old dictionary entries, and -Y to inhibit expensive keyword-and-extension fuzzing.

Skipfish also features a form auto-completion mechanism in order to maximize scan coverage. The values it submits should be non-malicious, as they are not meant to implement security checks but rather to get past input validation logic. You can also define additional rules, or override existing ones, with the -T option, for example -T login=test123 -T password=test321, although note that -C and -A are a much better method of logging in. And this is getting really long, so I'm going to delete some of these.
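As a sketch of the throttling and login-related flags just mentioned; the rate, the duration, the cookie, and the credentials are placeholder values:

    skipfish -o /root/Desktop/results -l 50 -k 2:00:00 http://10.0.0.8/
    skipfish -o /root/Desktop/results -A admin:password -C "PHPSESSID=d41d8cd98f00b204" http://10.0.0.8/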
There is also a handful of performance-related options. Use lowercase -g to set the maximum number of connections to maintain, globally, to all targets; it is generally sensible to keep this under 50 or so to avoid overwhelming the TCP/IP stack on your system or on nearby NAT and firewall devices. There is also -m to set the per-IP limit. Experiment a bit: 2 to 4 is usually good for localhost, 4 to 8 for local networks, maybe 10 to 20 for external targets, and 30-plus for really lagged or non-keep-alive hosts. You can also use lowercase -w to set the I/O timeout, that is, how long Skipfish will wait for an individual read or write, and lowercase -t to set the total request timeout, to account for really slow or really fast sites. Finally, lowercase -f controls the maximum number of consecutive HTTP errors you are willing to see before aborting the scan, and lowercase -s sets the maximum length of a response to fetch and parse; longer responses will be truncated.

When scanning large, multimedia-heavy sites, you may also wish to specify lowercase -e. This prevents binary documents from being kept in memory for reporting purposes and frees up a lot of RAM. Further rate limiting is available through third-party user-mode tools such as trickle, or through kernel-level traffic shaping. And last but not least, real-time scanning statistics can be suppressed with lowercase -u.

All right, that pretty much does it for Skipfish. I hope you found this tutorial helpful, and that you find Skipfish to be a useful tool in your toolbox. Thank you.