Wednesday, March 22, 2017

SXSW Day 4 session 4: A new normal: user security in an insecure world

Panel discussion with Alina Selyukh (NPR technology reporter), Bob Lord (Yahoo's chief information security officer, or CISO), and Christopher Kirchhoff (previously assistant to the Director of the Joint Chiefs of Staff at the Pentagon)

First off, the moderator addressed the most interesting question - the Yahoo attack and how Bob Lord handled it.  He said it was most likely a nation-state sponsored attack, which he called "the new norm".  He said that while in the past nation-state attacks were primarily directed at government or military targets, today many corporations are also attacked by nation states - for industrial espionage, to give their own corporations an advantage, or even for revenge (e.g. the Sony hack attributed to North Korea).  He said that many corporations do not understand the meaning and impact of having a nation-state attacker - the dedication of resources, time, money and people involved.
Nation-state attacks differ from regular hacks in that they are very well funded and can be planned and executed over many years.

Similarly, Kirchhoff was asked about his experience - he is the one who had to deliver the news of the Snowden leaks to the Joint Chiefs of Staff.

Bob Lord was asked about what he does at Yahoo to improve security.  He mentioned a number of things:
1. Red team/blue team exercises - like many companies, Yahoo conducts red team exercises, where a group of hackers tries to penetrate the company's own systems (the red team) while another team tries to detect and stop them (the blue team).  He says the red team always wins; the blue team was never able to stop them.  He recommended not building the red team from the people who are in charge of security at your company, as they may fall into certain patterns of thinking and make assumptions based on their knowledge of the security defenses.
2. Phishing exercises - IT sends out phishing emails to employees to see if any of them click on the included links.  He said he usually uses this technique to test how effective the security orientation was.  One of the lessons he learned from these exercises was that the security orientation for new employees was too detailed - with so much information, people didn't remember much of it.  He now prefers more focused sessions on the key points he wants employees to remember, so they're not overwhelmed with information.
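The mechanics of such an internal phishing test are simple to sketch: give each recipient a unique, unguessable token embedded in the email's link, then see which tokens come back.  This is a minimal illustrative sketch (hypothetical names and URL, not a description of Yahoo's actual tooling):

```python
import secrets

def make_campaign(recipients, base_url="https://training.example.internal/lp"):
    """Assign each recipient a unique token and build the tracking
    link to embed in that person's test phishing email."""
    tokens = {secrets.token_urlsafe(16): email for email in recipients}
    links = {email: f"{base_url}?t={token}" for token, email in tokens.items()}
    return tokens, links

def record_click(tokens, clicked_token, clicked=None):
    """Map a token seen in the web logs back to the employee who
    clicked; tokens we didn't issue are ignored."""
    if clicked is None:
        clicked = set()
    email = tokens.get(clicked_token)
    if email is not None:
        clicked.add(email)
    return clicked
```

Per-recipient tokens (rather than one shared link) are what let the exercise report who needs a refresher rather than just an aggregate click rate.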

He said that most failures are procedural - not using proper existing protocols, not making sure updates are applied immediately - a lot of human error.  He said that security problems are not just technical problems, they are also cultural ones.

On the subject of security culture, Kirchhoff described how the Navy developed a "high reliability" culture, to be used in places where small mistakes can have big impacts (such as nuclear submarines).  He mentioned it had 5 principles and named two of them - forceful backup (he gave the example of sending two people to do a critically delicate job, even if only one is needed for it), and integrity (if you make a mistake, speak up regardless of the consequences).

[As a side note, I think he confused "high reliability" with "operational discipline"; the two pillars he mentioned come from the latter (the five pillars of operational discipline are Questioning Attitude, Level of Knowledge, Forceful Watch-team Backup, Formality and Integrity).  High-reliability organizations have different characteristics (preoccupation with failure, reluctance to simplify interpretations, sensitivity to operations, commitment to resilience, and deference to expertise).]

Lord mentioned that adding security after the fact cannot work; security has to be designed into whatever is being developed, as it's being developed.  For regular users, rather than corporations, he mentioned these as some of the more important steps to take to improve personal security:

  1. Keep things patched and up to date
  2. Use two factor authentication
  3. Shut down old accounts - hackers know that a lot of people reuse their passwords, so the more accounts you have out there, the better the chance a hacker can stumble across one of your passwords (oh, and don't reuse passwords).
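The two-factor authentication he recommends is usually implemented as TOTP - the rotating six-digit codes generated by authenticator apps, standardized in RFC 6238 on top of RFC 4226's HOTP.  As a sketch of how little machinery is involved, here is a minimal standard-library implementation (SHA-1 variant, the common default):

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over the 8-byte counter, then
    'dynamic truncation' down to a short decimal code."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, at=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HOTP where the counter is the current
    30-second time window."""
    t = int(time.time()) if at is None else at
    return hotp(secret, t // step, digits)
```

Because the code is derived from a shared secret plus the current time window, a phished password alone is not enough to log in - which is exactly why it pairs well with the phishing concerns above.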
Asked about biometrics, he was not enthusiastic.  He said they can sometimes be captured passively (people leave fingerprints everywhere) and then used against you.  Unlike passwords or digital certificates, you can't revoke and reissue your biometric markers.

He said that the average time between penetration and detection is 200 days, so you need to have your teams ready to do research that far back.  He also mentioned that two thirds of breaches are discovered only when someone outside calls the company to tell it that it was hacked.

What are the top things they learned?
  • Log retention - because of the long detection time, it is important to retain logs going back very far.  This is very expensive, but worth it when a hack happens.
  • Conduct red team/blue team exercises
  • Get top management familiar with the risks and the issues up front, and include them in exercises, so that when something happens they are ready to deal with it, rather than having to be educated in the middle of a crisis.
