You've probably seen some pretty convincing CGI and special effects in recent science fiction shows and movies. Movie magic has been used to bring back actors who are no longer with us, or to make older actors look much younger. Welcome to the world of deepfakes. Deepfakes are created using AI-based software to mimic people's voices and even their faces.

Outside of Hollywood, criminals are starting to use deepfakes to make social engineering cyber attacks more convincing. Deepfake voice attacks have been reported where the cloned voice of a company's CEO or other executive was used to commit payment fraud. The cloned voice tricks employees into making large payments, or into changing the payment process to send funds to a scammer's bank account.

These attacks can be used in conjunction with business email compromise, or BEC, attacks, which I describe in another video. For instance, an email spoofed to look like it came from your CEO is sent to someone in your finance department. The email requests that an urgent payment be processed for an important business transaction. That's followed up with a phone call using deepfake audio of your CEO's voice, asking the finance person to quickly make the payment referred to in the email. As you can imagine, this combination attack can be very persuasive.

Deepfake audio applications that can do this type of voice cloning are easy to find, and the results can be hard to detect as fake, even for people who know the deepfaked person well. All the attackers need is a good amount of audio of the target person speaking, and in the age of podcasting, it's not hard to find interviews of high-profile people like your company's CEO.

As deepfake technology rapidly develops, we can expect it to become an increasingly dangerous part of the cybersecurity threat landscape.