Category Archives: Security Awareness

Applying Science To Cyber Security

How do we know something works?  The debate about security awareness training continues to drag on, with proponents citing remarkable reductions in losses when training is applied and detractors pointing out that training doesn’t actually stop employees from falling victim to common security problems.  Why is it so hard to tell if security awareness training “works”?  Why do we continue to have this discussion?

My view, as I’ve written previously, is that cyber security is an art, not a science.  We collectively “do stuff” because we think it’s the right thing to do.  One night last week, over dinner, I was talking to my friend Bob, who works for a large company.  His employer recently performed a phishing test of all employees after each had received training on identifying and avoiding phishing emails.  Just over 20% of all employees fell for the test after being trained.  I asked Bob how effective his company found the training to be at reducing the failure rate.  He didn’t know, since no test was performed prior to the training.  That’s a significant lost opportunity to gain insight into the value of the training.

Bob’s company spent a considerable amount of money on the training and the test, but they don’t know whether the training made a difference, and if so, by how much.  Would 60% of employees have fallen for the phishes prior to training?  If so, that would likely indicate the training was worthwhile.  Or would only 21% have fallen for them, meaning the money spent on the training would have been much better spent on some other program to address the risks associated with phishing?  Should Bob’s employer run the training again this year?  If they do, at least they will be able to compare the test results to last year’s results and hopefully derive some insight into the effectiveness of the program.

But that is not the end of the story.  We do not have only two options available to us: to train or not to train.  There are many, many variations: the content of the training, the delivery mechanism, the frequency, and the duration, to name a few.  Security awareness training seems to be a great candidate for randomized controlled trials.  Do employees who are trained cause fewer security-related problems than those who are not trained?  Are some kinds of training more effective than other kinds of training?  Do some kinds of employees benefit from training, or from specific types of training, more than other types of employees?  Is the training effective against some kinds of attacks and not others, indicating that the testing approach should be more comprehensive?
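As a rough illustration of what such a comparison could look like, here is a minimal Python sketch using scipy.  The group sizes and failure counts are entirely made up for the example; the point is only to show how a randomized split into a trained group and an untrained control group lets you test whether the difference in phishing failure rates is more than chance.

```python
# A minimal sketch of analyzing a randomized phishing test:
# employees are randomly split into a trained group and an
# untrained control group, then both receive the same phishing test.
# The counts below are hypothetical, purely for illustration.
from scipy.stats import chi2_contingency

trained_failed, trained_total = 104, 500      # 20.8% clicked after training
control_failed, control_total = 152, 500      # 30.4% clicked with no training

# 2x2 contingency table: rows = group, columns = failed / did not fail
table = [
    [trained_failed, trained_total - trained_failed],
    [control_failed, control_total - control_failed],
]

chi2, p_value, dof, expected = chi2_contingency(table)

print(f"Trained failure rate: {trained_failed / trained_total:.1%}")
print(f"Control failure rate: {control_failed / control_total:.1%}")
print(f"p-value for the difference: {p_value:.4f}")
# A small p-value suggests the difference between the groups is unlikely
# to be due to chance alone -- i.e., the training appears to matter.
```

None of this is exotic statistics; it is the same kind of analysis routinely applied to marketing campaigns and drug trials, and it would answer the question Bob’s company spent money to leave open.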

I don’t know, because we either don’t do this kind of science, or we don’t talk about it if we do.  Instead, we impute benefits from tangentially related reports and surveys interpreted by vendors who are trying to convey the importance of having a training regimen, or by vendors who are trying to convey the importance of a technical security solution.

My own view, by the way, which is fraught with biases but based on experience, is that security awareness training is good for reducing the frequency of, but not eliminating, employee-induced security incidents.  Keeping this in mind serves two important purposes:

  1. We understand that there is significant risk which must be addressed despite even the best security training.
  2. When an employee is the victim of some attack, we don’t fall into the trap of assuming the training was effective and the employee simply wasn’t paying attention or chose to disregard the training delivered.

We wring our hands about so many aspects of security: how effective is anti-virus, and is it even worth the cost, given its poor track record?  Does removing local administrator rights really reduce the number of security incidents?  How fast do we need to patch our systems?

These are all answerable questions.  And yes, the answers often depend, at least in part, on specific attributes of the environment in which the controls operate.  But we have to know to ask the questions.

Human Nature And Selling Passwords

A new report by Sailpoint, indicating that one in seven employees would sell company passwords for $150, has garnered a lot of news coverage in the past few days.  The report also finds that 20% of employees share passwords with coworkers.  The report is based on a survey of 1,000 employees from organizations with over 3,000 employees.  It isn’t clear whether the survey was conducted using statistically valid methods, so we must keep in mind the possibility of significant error when evaluating the results.
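For a sense of scale, pure sampling error on a survey of 1,000 people is modest.  The short sketch below computes a rough 95% confidence interval for the “one in seven” figure, assuming (and this is only an assumption, not something the report confirms) a simple random sample.

```python
# Rough 95% confidence interval for the "one in seven" (~14.3%) figure,
# assuming a simple random sample of 1,000 respondents -- an assumption
# the report itself does not confirm.
import math

p = 1 / 7          # reported proportion willing to sell a password
n = 1000           # reported sample size
z = 1.96           # ~95% confidence

margin = z * math.sqrt(p * (1 - p) / n)
print(f"Estimate: {p:.1%} +/- {margin:.1%}")
# Roughly 14.3% +/- 2.2 percentage points -- so sampling error alone
# doesn't change the headline; methodology and question wording might.
```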

While one in seven seems like an alarming number, what isn’t stated in the report is how many would sell a password for $500 or $1,000.  Not to mention $10,000,000.  The issue here is one of human nature.  Effectively, the report finds that one in seven employees is willing to trade $150 for a spin of a roulette wheel where some spaces result in termination of employment or the end of a career.

Way back in 2004, an unscientific survey found that 70% of those surveyed would trade passwords for a chocolate bar, so this is by no means a new development.

This is the control environment we work in as security practitioners.  The problem here is not one of improper training, but rather the limitations of human judgement.

Incentives matter greatly.  Unfortunately for us, the potential negative consequences associated with violating security policy, risking company information and even being fired are offset by more immediate gratification: $150, or helping a coworker by sharing a password.  We shouldn’t be surprised by this: humans sacrifice long-term well-being for short-term gain all the time, whether by smoking, drinking, eating poorly, not exercising and so on.  Humans know the long-term consequences of these actions, but generally act against their own long-term best interest for short-term gain.

We, in the information security world, need to be aware of the limitations of human judgement.  Our goal should not be to give employees “enough rope to hang themselves”, but rather to develop control schemes that accommodate the limitations of human judgement.  For this reason, I encourage those in the information security field to become familiar with the emerging studies under the banner of cognitive psychology and behavioral economics.  By better understanding the “irrationalities” in human judgement, we can design better incentive systems and security control schemes.

Day 2: Awareness of Common Attack Patterns When Designing IT Systems

One of the most common traits underlying the worst breaches I’ve seen, and indeed many that are publicly disclosed, is an external attacker gaining access to a server that is joined to the organization’s Active Directory domain.

It seems that many an IT architect or Windows administrator is blind to the threat this poses.  An application vulnerability, a misconfiguration and the like can provide an attacker with a foothold from which to take over essentially the entire network.

This is just one example, but it’s a commonly exploited weakness.  Staff members performing architecture-type roles really need some awareness and understanding of common attacker tactics in order to intelligently weigh the design points in an IT system or network.
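As a concrete, and entirely hypothetical, illustration of the kind of check an architect could bake into a design review, the sketch below scans a simple asset inventory and flags internet-facing servers that are joined to the AD domain.  The inventory records and field names are invented for the example; the idea is only that this question gets asked before the design ships.

```python
# Hypothetical design-review check: flag internet-facing servers that are
# joined to the Active Directory domain. The inventory records and field
# names here are invented purely for illustration.
inventory = [
    {"host": "web01", "internet_facing": True,  "domain_joined": True},
    {"host": "web02", "internet_facing": True,  "domain_joined": False},
    {"host": "app01", "internet_facing": False, "domain_joined": True},
]

risky = [
    asset["host"]
    for asset in inventory
    if asset["internet_facing"] and asset["domain_joined"]
]

if risky:
    print("Domain-joined hosts exposed to the internet:", ", ".join(risky))
    # Each of these is a potential foothold: compromising the host may hand
    # an external attacker domain credentials and a path into the network.
else:
    print("No internet-facing, domain-joined hosts found.")
```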

Cyber Security Awareness Month

Tomorrow starts National Cyber Security Awareness Month. Many different organizations will be posting security awareness information to help your employees not get cryptolockered and to help your friends and family keep their private selfies private.

I’m going to take a different path with this site for the month of October.  I’m going to talk about security awareness for US: IT and infosec people.

Crazy, right?

I have been working in this field for a long time.  I have seen stunningly bad decisions by IT behind the worst incidents I’ve been involved in.  These decisions weren’t malicious, but rather demonstrated a lack of awareness of how spectacularly IT infrastructures can fail when they are not designed well, when we misunderstand the limitations of technology, and when we’re simply careless while exercising our administrative authority.