WEBVTT

00:00:00.970 --> 00:00:03.510
Let's continue looking at security operations for

00:00:03.510 --> 00:00:05.830
the Certified in Cybersecurity.

00:00:05.830 --> 00:00:09.590
Security operations and administration is where we

00:00:09.590 --> 00:00:12.690
actually put into practice all of the good security

00:00:12.690 --> 00:00:15.610
principles we've talked about until now.

00:00:15.610 --> 00:00:18.910
We started out by looking at protecting data,

00:00:18.980 --> 00:00:21.580
the most valuable asset of the organization.

00:00:22.050 --> 00:00:25.680
Now we'll look at the way we introduce and then,

00:00:25.680 --> 00:00:26.440
of course,

00:00:26.440 --> 00:00:31.400
enforce good security practices before we go on to security awareness

00:00:31.400 --> 00:00:34.940
training and the exam review tips and techniques.

00:00:36.260 --> 00:00:39.720
It is important that a security program is managed.

00:00:40.110 --> 00:00:43.280
Managed means that we have correct oversight,

00:00:43.280 --> 00:00:43.960
ownership,

00:00:43.960 --> 00:00:49.570
and certainly that we're monitoring the health of our security program.

00:00:50.230 --> 00:00:53.760
This is something we looked at earlier when we called it governance,

00:00:53.760 --> 00:00:59.990
governance being that ownership by management of the security program.

00:01:00.780 --> 00:01:05.349
The management should help us to develop and certainly be

00:01:05.349 --> 00:01:08.040
committed to the security strategy.

00:01:08.910 --> 00:01:12.600
They should be actively interested in whether or not we are

00:01:12.600 --> 00:01:17.210
compliant with good practices through monitoring and certainly

00:01:17.210 --> 00:01:19.410
complying with the various laws, privacy,

00:01:19.410 --> 00:01:22.380
for example, types of laws,

00:01:22.380 --> 00:01:26.820
and just good practices and protecting the assets of the organization.

00:01:27.990 --> 00:01:30.490
One of the hard parts of this is enforcement.

00:01:30.880 --> 00:01:34.850
There are times when we are going to have to take some type of disciplinary

00:01:34.850 --> 00:01:39.800
action if a person is not following the policies and procedures.

00:01:39.800 --> 00:01:44.020
But security is not something that happens just by nature.

00:01:44.420 --> 00:01:47.320
It's something that has to be put in place through

00:01:47.320 --> 00:01:49.330
the use of things like policies.

00:01:49.540 --> 00:01:55.030
So people know what they must and must not do, of course.

00:01:55.030 --> 00:02:00.040
Policies are actually signed and issued on behalf of management.

00:02:00.460 --> 00:02:01.950
Management should sign them,

00:02:01.950 --> 00:02:08.560
which attests to the fact that they are committed to the enforcement and,

00:02:08.560 --> 00:02:11.570
of course, the use of these policies.

00:02:11.570 --> 00:02:15.560
But a policy is no good if nobody knows about it.

00:02:15.560 --> 00:02:18.700
It has to be communicated to the staff,

00:02:18.700 --> 00:02:24.700
and it should be communicated in such a way that it mandates requirements.

00:02:25.270 --> 00:02:26.720
I had an example of this.

00:02:26.720 --> 00:02:32.280
I was teaching some security awareness programs for a government,

00:02:32.280 --> 00:02:35.770
and the problem was that I had put together all of what

00:02:35.770 --> 00:02:39.410
their policies were and was showing this,

00:02:39.410 --> 00:02:42.570
first of all, to the managers, and the manager said yes,

00:02:42.570 --> 00:02:43.450
but you're missing something.

00:02:43.450 --> 00:02:45.760
Well, what am I missing?

00:02:46.200 --> 00:02:48.730
You didn't put the policy number on every slide.

00:02:49.870 --> 00:02:51.950
I said well, this is the policy.

00:02:51.950 --> 00:02:56.430
But they said it won't be seen as authoritative if it

00:02:56.430 --> 00:02:58.320
doesn't have the policy number with it.

00:02:58.970 --> 00:03:00.060
And they were right.

00:03:00.060 --> 00:03:01.570
They knew their staff,

00:03:01.570 --> 00:03:05.690
and they knew that the moment the students that attended

00:03:05.690 --> 00:03:09.960
those programs saw that not only was this good practice and

00:03:09.960 --> 00:03:14.690
something we are encouraging, but it was something that was mandated by policy, and it was like,

00:03:14.700 --> 00:03:17.800
oh, okay, we have to do this.

00:03:17.800 --> 00:03:19.450
And that is the thing.

00:03:19.450 --> 00:03:24.250
People have to realize that policy is not just a suggestion or a good idea.

00:03:24.830 --> 00:03:30.370
The idea is that through policy we can influence people's behaviors.

00:03:30.810 --> 00:03:33.510
A lot of what we're trying to do when we look at security

00:03:33.510 --> 00:03:37.370
awareness is actually change people's actions.

00:03:37.370 --> 00:03:42.120
Well, changing people's actions can be done either through pressure.

00:03:42.730 --> 00:03:45.880
If you don't do this, you can lose your job.

00:03:45.880 --> 00:03:50.600
Or we can change their actions by influencing their beliefs.

00:03:51.090 --> 00:03:53.740
Do they believe this is a good thing to do?

00:03:53.830 --> 00:03:54.880
Well then they'll do it.

00:03:55.500 --> 00:04:01.210
So policy in many ways tries to communicate what is a good thing to do.

00:04:01.410 --> 00:04:04.110
So hopefully they believe in it and then follow it.

00:04:04.780 --> 00:04:09.910
We know that a policy should have some type of disciplinary action in it.

00:04:10.670 --> 00:04:15.770
Failure to abide by this policy may result in disciplinary action

00:04:15.780 --> 00:04:18.130
up to and including termination of employment,

00:04:18.130 --> 00:04:19.180
for example.

00:04:19.990 --> 00:04:22.540
It is important that our policies are up to date.

00:04:22.540 --> 00:04:28.330
For many organizations, the policies that they have are years out of date.

00:04:28.550 --> 00:04:33.310
They don't even address today's real world of technologies, for example.

00:04:33.790 --> 00:04:37.850
And so people don't get the direction they need on what they should and,

00:04:37.850 --> 00:04:39.400
of course, shouldn't do.

00:04:40.530 --> 00:04:44.830
We said that a high‑level policy should not be technical because a

00:04:44.830 --> 00:04:49.180
high‑level policy provides that strategic overview,

00:04:49.180 --> 00:04:55.220
and you don't want to have to change that policy every time technology changes.

00:04:55.620 --> 00:05:01.350
With many organizations, coming out with a new policy might take 2 years.

00:05:02.390 --> 00:05:03.000
So therefore,

00:05:03.000 --> 00:05:07.290
we have a high‑level policy that is then reinforced or

00:05:07.290 --> 00:05:10.350
supported by functional policies.

00:05:10.650 --> 00:05:14.500
Functional policies are the ones that address specific

00:05:14.500 --> 00:05:17.560
technologies or an area of concern,

00:05:17.560 --> 00:05:21.820
for example remote access or bring your own device,

00:05:21.820 --> 00:05:22.890
for example.

00:05:23.920 --> 00:05:26.220
But a policy is nothing but words.

00:05:26.520 --> 00:05:31.560
It sounds good on paper, but it has to be backed up with the how to.

00:05:31.560 --> 00:05:33.910
And that is why we have procedures.

00:05:33.910 --> 00:05:38.170
Procedures tell us how to abide by the policy.

00:05:38.170 --> 00:05:42.510
Standards set the bar of what is acceptable or not.

00:05:42.790 --> 00:05:49.600
And baselines ensure that we set this bar consistently across the organization,

00:05:49.990 --> 00:05:53.600
not that everybody interprets policy in a different way.

00:05:55.180 --> 00:05:58.300
Some examples of common functional policies,

00:05:58.310 --> 00:06:01.670
for example, in an area of concern, include acceptable use.

00:06:02.230 --> 00:06:06.510
Acceptable use policy is something that's kind of interesting because a

00:06:06.510 --> 00:06:10.480
number of years ago when people had a phone on their desk,

00:06:10.480 --> 00:06:13.760
it was often said this phone is for business use only.

00:06:14.640 --> 00:06:16.980
Well, that was a little bit unrealistic.

00:06:16.980 --> 00:06:20.150
And of course, that's by no means realistic in today's world.

00:06:21.060 --> 00:06:24.670
The organizations very often had to change that and

00:06:24.670 --> 00:06:27.970
say this is for business use.

00:06:28.080 --> 00:06:34.720
But depending on your manager, there could be reasonable personal use as well.

00:06:34.720 --> 00:06:39.440
If you have to call your partner and say I'm going to be late

00:06:39.440 --> 00:06:41.770
for dinner because I'm working on something,

00:06:41.930 --> 00:06:44.860
well, that's reasonable personal use.

00:06:44.860 --> 00:06:49.970
And there's where we, again, use a word that is open to interpretation.

00:06:49.970 --> 00:06:51.610
What is reasonable?

00:06:52.330 --> 00:06:54.950
We don't want someone spending the whole day,

00:06:54.960 --> 00:07:00.280
for example, just chatting with friends when they should be doing some work.

00:07:00.760 --> 00:07:04.040
So we often see this with acceptable use of the internet.

00:07:04.570 --> 00:07:07.520
It's there primarily for work purposes.

00:07:07.900 --> 00:07:12.430
But if you have a few minutes and want to check something on the latest news,

00:07:12.430 --> 00:07:15.220
that's reasonable use.

00:07:15.220 --> 00:07:19.660
A lot comes down to the culture of our organization.

00:07:20.450 --> 00:07:23.870
Are we very restrictive, or are we very permissive?

00:07:24.630 --> 00:07:28.670
And this is not to say one is better than the other.

00:07:28.670 --> 00:07:34.170
But obviously, we need to know what the culture of our organization is.

00:07:34.290 --> 00:07:39.060
Very restrictive controls that sort of try to limit everything within a

00:07:39.060 --> 00:07:43.640
very tight set of behaviors or very permissive that says,

00:07:43.640 --> 00:07:45.110
yeah, you can do pretty much anything.

00:07:45.110 --> 00:07:47.790
Just don't do anything that's really bad.

00:07:49.070 --> 00:07:52.890
We see one of the areas here is the area of piracy.

00:07:53.680 --> 00:07:58.030
We see a lot of people try to download something maybe off the internet,

00:07:58.040 --> 00:08:03.390
maybe some cool software or a game or even some type of entertainment.

00:08:03.390 --> 00:08:08.150
That could very much be in violation of piracy laws

00:08:08.480 --> 00:08:10.830
because piracy laws say that, yeah,

00:08:10.830 --> 00:08:16.020
you can't just download this and use it without paying for the use of that,

00:08:16.020 --> 00:08:16.880
for example.

00:08:17.300 --> 00:08:20.860
And it's not good for an organization all of a sudden to find

00:08:20.860 --> 00:08:23.300
out that they have all kinds of pirated,

00:08:23.460 --> 00:08:26.190
say, movies on their servers, for example.

00:08:27.750 --> 00:08:31.760
Another common policy is data handling, an important one.

00:08:32.049 --> 00:08:34.380
We know that data protection is important.

00:08:34.760 --> 00:08:38.730
Data handling tells people how to actually handle the

00:08:38.730 --> 00:08:41.440
data usually based on classification.

00:08:42.049 --> 00:08:45.140
So some of the areas we would address in this would include

00:08:45.990 --> 00:08:48.750
only share data on a need‑to‑know basis.

00:08:49.420 --> 00:08:52.270
We don't need to make data available to someone who

00:08:52.270 --> 00:08:54.650
doesn't need it in order to do their job.

00:08:55.360 --> 00:08:58.760
And when it comes to sharing data, can you share it with a customer,

00:08:58.760 --> 00:09:02.390
can you share with a business partner, with a coworker?

00:09:02.920 --> 00:09:05.100
And of course, if it's law enforcement,

00:09:05.100 --> 00:09:09.030
what are the conditions under which that sharing must take place?

00:09:10.230 --> 00:09:13.290
One of the things we have to worry about is inference.

00:09:13.770 --> 00:09:18.790
If I tell a person one thing, they could maybe guess something else.

00:09:19.290 --> 00:09:23.770
So that's why we have to be careful with what data we share and don't share.

00:09:24.380 --> 00:09:26.600
If we turn around and say to somebody no,

00:09:26.600 --> 00:09:28.460
I'm not allowed to tell you about that,

00:09:28.460 --> 00:09:31.510
they can infer that there's something there that is

00:09:31.510 --> 00:09:33.410
important that they are not aware of.

00:09:33.410 --> 00:09:36.600
They can infer that or deduce that.

00:09:36.600 --> 00:09:40.880
Closely related to that is the area of aggregation.

00:09:41.190 --> 00:09:46.290
Aggregation is where, if I give a person a bunch of small pieces of information,

00:09:46.540 --> 00:09:49.470
they could collect them together, aggregate them,

00:09:49.470 --> 00:09:52.050
to learn something that I didn't want them to know.

00:09:52.660 --> 00:09:56.100
So when we're dealing with data, we have to be careful with these things.

00:09:56.350 --> 00:10:01.420
What data are we sharing that could lead to someone learning something they

00:10:01.420 --> 00:10:04.900
shouldn't know through either inference or aggregation?

00:10:05.840 --> 00:10:10.920
We should have a clear policy on how data must be destroyed at end of life.

00:10:10.920 --> 00:10:14.680
Does it need to be shredded, or can it just be deleted?

00:10:15.190 --> 00:10:17.150
Is overwriting good enough?

00:10:17.160 --> 00:10:17.370
Well,

00:10:17.370 --> 00:10:20.300
that doesn't work too well in the cloud where we can't guarantee

00:10:20.300 --> 00:10:22.830
it will even overwrite the same physical location.

00:10:23.500 --> 00:10:28.210
Many organizations have policies such as the clean desk and clear screen.

00:10:28.750 --> 00:10:30.230
If you're not at your desk,

00:10:30.490 --> 00:10:35.030
there should be nothing on your desk or on your screen which is confidential.

00:10:35.030 --> 00:10:35.990
Lock it,

00:10:36.260 --> 00:10:39.660
put things away so that nobody walking by could see

00:10:39.660 --> 00:10:42.230
something that should not be then disclosed.

00:10:43.900 --> 00:10:46.610
Most organizations have a password policy,

00:10:46.610 --> 00:10:50.370
and there's been a lot of debate over the years about

00:10:50.370 --> 00:10:53.730
passwords and what's a sufficiently long password.

00:10:53.730 --> 00:10:57.750
Should we require people to change their password?

00:10:58.340 --> 00:11:01.020
We're not going to try and tell you what you should do.

00:11:01.020 --> 00:11:01.880
But certainly,

00:11:01.880 --> 00:11:05.440
you should have a policy so at least people know what

00:11:05.440 --> 00:11:07.830
is expected in your organization.

00:11:08.020 --> 00:11:10.280
How long does the password need to be?

00:11:10.590 --> 00:11:12.590
A minimum of eight characters?

00:11:12.600 --> 00:11:14.090
A minimum of four?

00:11:14.170 --> 00:11:19.980
Well, we at least tell people what the baseline is, say it's eight characters.

00:11:20.370 --> 00:11:23.800
You can have a 12 or 15‑character password if you want,

00:11:23.880 --> 00:11:27.100
but it must be at least eight characters, as an example.

00:11:27.920 --> 00:11:31.200
Does it have to include upper and lowercase?

00:11:31.480 --> 00:11:35.780
Is it something that has to have numbers or special characters?

00:11:36.400 --> 00:11:39.870
That is the complexity of the password that can make it much more

00:11:39.870 --> 00:11:42.760
difficult for someone to guess that password.

00:11:43.150 --> 00:11:47.330
And of course, we quite often say don't base it just on a common word.
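To make that kind of baseline concrete, here is a minimal sketch in Python of how a password baseline like the one just described (minimum eight characters, upper and lowercase, a digit or special character) might be checked. The specific rules here are illustrative assumptions, not any organization's actual policy.

```python
import re

def meets_password_baseline(password: str, min_length: int = 8) -> bool:
    """Check a candidate password against an illustrative baseline:
    minimum length, mixed case, and at least one digit or special
    character. These rules are examples only -- the organization's
    own policy sets the real bar."""
    if len(password) < min_length:
        return False
    if not re.search(r"[a-z]", password):      # at least one lowercase letter
        return False
    if not re.search(r"[A-Z]", password):      # at least one uppercase letter
        return False
    if not re.search(r"[0-9]|[^a-zA-Z0-9]", password):  # digit or special char
        return False
    return True

print(meets_password_baseline("Sunlit#Harbor9"))  # True
print(meets_password_baseline("short"))           # False
```

A longer password (12 or 15 characters) passes just as well; the check only enforces the minimum bar, consistent with the idea of a baseline.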

00:11:48.490 --> 00:11:49.750
Should it expire?

00:11:50.160 --> 00:11:52.060
There's been a lot of debate about this.

00:11:52.490 --> 00:11:54.620
Even the security community leaders,

00:11:54.620 --> 00:11:58.800
people like Bruce Schneier, often say don't make passwords expire.

00:11:59.590 --> 00:12:03.660
He says you have to think of what do we say to our users every day.

00:12:03.660 --> 00:12:08.120
Choose a password that's impossible to remember, but never write it down.

00:12:08.250 --> 00:12:10.610
Well, of course, that's unrealistic.

00:12:10.970 --> 00:12:15.640
So we have to try to make security possible and simple for people.

00:12:15.640 --> 00:12:19.020
And maybe we have them expiring, maybe not.

00:12:19.450 --> 00:12:23.240
It depends on what is the risk and what is the way our

00:12:23.240 --> 00:12:25.860
organization has chosen to deal with this,

00:12:25.860 --> 00:12:27.670
but then communicate that.

00:12:28.770 --> 00:12:29.450
And of course,

00:12:29.450 --> 00:12:33.170
we should have guidelines and policy that say: can you

00:12:33.170 --> 00:12:35.020
share your password with somebody else?

00:12:35.610 --> 00:12:37.300
IT is famous for this.

00:12:37.720 --> 00:12:38.920
IT, very often,

00:12:38.920 --> 00:12:43.130
we have a number of people sharing an account known as administrator

00:12:43.130 --> 00:12:46.880
all with the same user account name and password.

00:12:47.360 --> 00:12:49.840
And maybe that's something that we should,

00:12:49.840 --> 00:12:56.770
in our password policy, certainly try to prohibit or discourage,

00:12:56.770 --> 00:12:58.680
but it's hard to do.

00:12:59.220 --> 00:13:01.670
Take, for example, the executive management team.

00:13:01.670 --> 00:13:02.660
Very often,

00:13:02.660 --> 00:13:05.590
you'll find that the senior executives share their

00:13:05.590 --> 00:13:10.170
password with their assistants and so on so they can get

00:13:10.170 --> 00:13:12.210
into their email accounts and so on.

00:13:12.220 --> 00:13:14.130
So right at the top,

00:13:14.130 --> 00:13:19.530
we have a failure to abide by what an organization's policy might be.

00:13:20.670 --> 00:13:21.310
And of course,

00:13:21.310 --> 00:13:23.980
one of the things that's important here is not to use the

00:13:23.980 --> 00:13:26.100
same password for multiple systems.

00:13:26.550 --> 00:13:31.530
This is where single sign‑on is good so you don't have to have many passwords.

00:13:31.530 --> 00:13:36.210
But let's say you have a password on your social media account.

00:13:36.210 --> 00:13:41.140
You think who cares if somebody gets into my Facebook

00:13:41.140 --> 00:13:43.630
or Twitter or Signal or WhatsApp.

00:13:43.640 --> 00:13:44.690
Who cares?

00:13:45.150 --> 00:13:45.970
But,

00:13:45.970 --> 00:13:50.410
if that's the same password I use on some of the very sensitive systems

00:13:50.410 --> 00:13:54.400
in the company and someone learns my Twitter password,

00:13:54.520 --> 00:13:57.730
does that mean they can now get into the other systems

00:13:57.730 --> 00:14:00.380
that we really do want protected?

00:14:00.880 --> 00:14:03.480
So that's why we have to be very careful and provide

00:14:03.480 --> 00:14:07.500
guidance on whether or not you should ever use the same

00:14:07.500 --> 00:14:10.430
password on multiple types of systems.

00:14:11.010 --> 00:14:15.060
And of course, should we ever share our passwords with anybody else?

00:14:16.380 --> 00:14:20.280
One of the things that's certainly been emerging in

00:14:20.280 --> 00:14:24.890
the last few years is this BYOD, bring your own device.

00:14:24.890 --> 00:14:29.180
And, in fact, in some companies, it's actually called choose your own device.

00:14:29.190 --> 00:14:33.350
Here's a series of different, say, phones you could choose from.

00:14:33.660 --> 00:14:35.810
You can't just choose any phone,

00:14:35.810 --> 00:14:40.790
but you can choose from these three or four the one you would think is best.

00:14:40.790 --> 00:14:46.140
And many organizations have moved towards this BYOD.

00:14:46.140 --> 00:14:51.610
Use the equipment you're comfortable with, the laptop, the phone, whatever.

00:14:52.310 --> 00:14:57.950
But then we should certainly ensure that those devices are properly configured.

00:14:58.460 --> 00:15:03.050
Does that mean, for example, it would have things like antispam,

00:15:03.060 --> 00:15:06.660
a firewall, and antimalware on it?

00:15:07.070 --> 00:15:10.390
And what happens if there's a problem?

00:15:10.390 --> 00:15:13.680
If we have to do an investigation,

00:15:14.510 --> 00:15:18.830
does the company have the right to seize that phone and look at it?

00:15:19.320 --> 00:15:23.000
Well this is where we get into a little bit of a legal problem in some

00:15:23.000 --> 00:15:27.480
cases because it could well be that if I have a phone that has both my

00:15:27.480 --> 00:15:30.410
personal data and corporate data on it,

00:15:30.780 --> 00:15:34.580
the company does not necessarily have

00:15:34.580 --> 00:15:36.940
the right to look at my personal data.

00:15:37.730 --> 00:15:42.380
And so this is something we should certainly address in policy as well,

00:15:42.380 --> 00:15:45.880
the right to review or audit those devices.

00:15:45.880 --> 00:15:49.460
If I'm going to keep sensitive corporate data,

00:15:49.460 --> 00:15:52.130
organizational data on my phone,

00:15:52.130 --> 00:15:56.400
do I have then some type of mobile device management,

00:15:56.400 --> 00:16:01.260
which would create a sandbox on it so that all of the company's

00:16:01.260 --> 00:16:05.400
data is kept in a virtual machine that maybe even could be remotely

00:16:05.400 --> 00:16:08.210
wiped if that phone was lost or stolen.

00:16:09.610 --> 00:16:11.440
Another, of course, is privacy,

00:16:11.440 --> 00:16:15.290
having a policy here that is compliant with the laws and

00:16:15.290 --> 00:16:18.510
regulations and the concept of need to know.

00:16:18.510 --> 00:16:21.750
This may require that we obfuscate data.

00:16:21.750 --> 00:16:25.160
We make data so that, if we are displaying it,

00:16:25.170 --> 00:16:27.680
you can't see the whole credit card number.

00:16:27.680 --> 00:16:29.340
You can only see the last four digits.

00:16:29.340 --> 00:16:33.330
A person types in their password and all we see is a series of dots,

00:16:33.330 --> 00:16:34.200
for example.

00:16:34.200 --> 00:16:38.400
That's a type of masking or obfuscation of the data.

00:16:38.800 --> 00:16:42.330
So someone who is shoulder surfing or looking over your shoulder

00:16:42.330 --> 00:16:45.520
while you're typing your password wouldn't actually be able to see

00:16:45.520 --> 00:16:47.640
what the password characters were.
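As a concrete sketch of the masking just described, the following Python snippet keeps only the last four digits of a card number for display. It is an illustrative example of obfuscation for display purposes, not any standard's prescribed method.

```python
def mask_card_number(card_number: str, visible: int = 4, mask_char: str = "*") -> str:
    """Obfuscate a card number for display: drop spaces and dashes,
    mask everything except the last `visible` digits."""
    digits = "".join(ch for ch in card_number if ch.isdigit())
    return mask_char * (len(digits) - visible) + digits[-visible:]

print(mask_card_number("4111 1111 1111 1234"))  # ************1234
```

The same idea underlies the row of dots shown in a password field: the real value is still processed, but what is displayed reveals nothing useful to a shoulder surfer.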

00:16:48.930 --> 00:16:51.210
It is very good to use screen filters.

00:16:51.560 --> 00:16:56.360
I've seen this as I've flown a lot in airplanes and somebody opens up

00:16:56.360 --> 00:16:59.250
their laptop and starts working on something for work.

00:16:59.250 --> 00:17:02.950
And here I am squeezed in the seat beside them and I can

00:17:02.950 --> 00:17:06.770
see confidential corporate data that really they shouldn't

00:17:06.770 --> 00:17:09.020
be showing in a public place.

00:17:09.020 --> 00:17:13.089
Buy a screen filter so that a person who's not

00:17:13.089 --> 00:17:15.470
directly behind the screen can't read it,

00:17:15.470 --> 00:17:19.260
especially, of course, when you're dealing with sensitive information

00:17:19.450 --> 00:17:23.310
on our phones and our laptops and iPads and so on.

00:17:24.450 --> 00:17:28.410
We also have to address the area of investigation.

00:17:28.780 --> 00:17:30.460
If there's something that's gone wrong,

00:17:30.940 --> 00:17:34.800
we need to make sure that everything about that investigation is kept

00:17:34.800 --> 00:17:38.890
quiet until the investigation is then closed off.

00:17:39.190 --> 00:17:42.590
We don't want to reveal the fact that we're looking at something

00:17:42.830 --> 00:17:45.680
because it could well be that there is nothing there.

00:17:46.070 --> 00:17:48.170
And even if the suspicion turns out to be unfounded,

00:17:48.400 --> 00:17:54.320
it can easily be that that black mark never leaves that person.

00:17:54.320 --> 00:17:57.590
He was under investigation for that, for example.

00:17:58.410 --> 00:18:02.230
So these are examples of some of the functional policies

00:18:02.510 --> 00:18:05.000
that many organizations put in place.

00:18:05.070 --> 00:18:10.910
So they have now taken the concepts of security and turned them

00:18:10.910 --> 00:18:14.840
into real practical operations and administration.
