Habitu8 Webinar Outline: 5 Key Metrics for Building Security Awareness Programs with Jason Hoenich, Founder
This webinar aired on May 23, 2018 (watch/listen here). Jason’s notes are as follows:
A. Proving Your Program Has Value
Metrics can be used to prove the value of your program and show a return on your investment. It’s important to take the time to identify which metrics will best support your story so you can set them up and start to gather data ahead of time.
B. Two Approaches to Using Metrics
Reactive: More common approach
- Data is pulled from dashboard reports
- Stories are told based on historical data that doesn’t align with a goal
- No baseline period for comparisons
Proactive: Better approach
- Metrics should drive your strategy and your annual program plan
- Metrics should be used to tell if your strategy is on track or off track
- Relying only on metrics from an automated dashboard doesn't let you see the bigger picture or choose from all available data points.
- Start with a goal
- Create a baseline
- This allows you to tell your story from the beginning
- Example: Proving that you need to increase your headcount by tracking the number of projects and their expected completion dates. With more people, the projects could be completed sooner. Automated dashboard metrics can't tell that story.
C. Quality vs Quantity
One good metric is better than five lower-quality metrics. It's not just about the primary metric but the other dimensions of the data. Any time you see a metric that doesn't make sense, there's an opportunity to discuss it and dissect the data further to understand what is going on.
Example: Percentage of endpoints that are fully patched. You start with 1,000 endpoints that are perfectly patched. If you wait a month, you have 1,000 endpoints that are each missing a month of patches. If you wait two months, the data is even more out of date. Without looking at other dimensions of this data, we wouldn't know that field operatives can only patch when they are physically in the office, so the metric is stale for those machines.
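The staleness problem above can be made concrete with a small sketch (the inventory data, field names, and 30-day threshold are all hypothetical, just to show the idea of pairing a headline metric with the dimension that explains it):

```python
from datetime import date, timedelta

# Hypothetical endpoint inventory: each record carries the extra
# dimension (last office check-in) needed to interpret the metric.
endpoints = [
    {"host": "hq-01",    "fully_patched": True,  "last_checkin": date(2018, 5, 20)},
    {"host": "hq-02",    "fully_patched": False, "last_checkin": date(2018, 5, 21)},
    {"host": "field-01", "fully_patched": True,  "last_checkin": date(2018, 3, 1)},
]

today = date(2018, 5, 23)

# Headline metric: share of endpoints reporting as fully patched.
patched = sum(1 for e in endpoints if e["fully_patched"])
patch_rate = patched / len(endpoints)

# Second dimension: flag records whose patch status is too old to
# trust (e.g. field devices that only patch when in the office).
stale = [e["host"] for e in endpoints
         if today - e["last_checkin"] > timedelta(days=30)]

print(f"patch rate: {patch_rate:.0%}")
print(f"stale data: {stale}")
```

Here the headline rate alone looks fine, but the staleness check shows that part of it rests on data that hasn't been refreshed in months.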
Quality of the Data
- How clean is your data? Can you tell an accurate story?
- Does your uploaded data include all the attributes you need to track?
- Problem areas:
- HRIS systems
- Identity management programs and authorization logic
- Someone else inputting the fields
- Solution: You may need to build your own data or work with the teams who input the data.
D. Debunking Common Guidance in the Industry
There are problems with each of the following metrics. Don't be misled; they won't set your program up to be successful.
- Training completion rates: don't read too much into them
- The data is only as good as your app or platform AND your ability to enforce it
- Measuring the completion rate of a mandatory activity like training is a very poor engagement metric
- If you don't have the relationships and the processes on the back end, the data isn't complete
Phishing Metrics - Click Rate
- Doesn’t tell a fair story of your activities
- Track the Report Rate
- Click rates can be manipulated
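As a concrete illustration of why report rate is the better headline number, here is a minimal sketch with made-up simulation numbers; both rates come from the same campaign data but tell very different stories:

```python
# Hypothetical results of one phishing simulation campaign.
emails_sent = 500
clicked     = 40   # users who clicked the simulated phish
reported    = 120  # users who reported it to security

# Click rate is easy to manipulate (e.g. by pre-warning users),
# so a low number says little about real engagement.
click_rate = clicked / emails_sent

# Report rate captures the behavior you actually want: people
# actively noticing and flagging the phish.
report_rate = reported / emails_sent

print(f"click rate:  {click_rate:.1%}")
print(f"report rate: {report_rate:.1%}")
```

A rising report rate over successive campaigns, against a baseline, is the kind of narrative the notes below recommend building.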
Look at the Ownership of the Data
- For example: Tracking monthly computer viruses
- There are many contingencies that determine whether malware ever reaches a computer
- Lost or stolen devices
- Ask yourself: Can I claim ownership of the data? Are there dependencies that will muddy this in some way?
- Intranet sites are mostly a required resource for visibility and compliance
- Unless you have a full content-marketing team, you will likely see very little activity on it
Tip: Focus on Behavior Change
- Pick three areas of focus
- Do you have a clear definition of an incident?
- Is there an easy way to report it?
- Is there friction to reporting?
E. What Metrics Are Must Haves (Training-Based)
- If annual training: Training completion rate
- Compliance policy sign-off tracking
- If phishing training: 1-2 phishing based metrics
- Click rate, with background
- Report rate is the star
- Talk about the narrative behind a rise in report rate
F. Reporting Stuff
- Plan from the beginning of your program strategy
- Metrics should be built in from the beginning
- Create a theory
- What initiatives should drive this behavior?
- What results are expected?
- When you gather data:
- Clear definition of an incident
- Make reporting easy (remove friction) so friction doesn't skew results negatively
- Know what will have a negative impact on the results
G. The Sizzle
- Ways to track engagement
- Hours spent learning voluntarily (people opt to attend a seminar)
- Great base metric
- Shows engagement
- Requests for training
- You know you are having an impact
- Shows engagement
- Requests for Security Services
- Third party assessments
- Proactive measures
Tips on getting good responses:
- Get feedback from coworkers and users
- Keep it short (25 questions is too long)
- Base it on a single behavior you want to change, such as locking up social media accounts
- Anonymity is a must.
- Keep the survey to 3-5 questions:
- What did you get out of this?
- What’s one action you are going to take based on this?
- Anecdotes can be very powerful responses
Question: Is it possible that an increase in the amount of time spent in training will be viewed negatively? Will it seem like people are spending too much time training and not enough time doing their jobs?
Answer: If you get that question, it is a cultural issue, and there’s a need to build awareness of your program.
Question: What about measuring net promoter score?
Answer: Transactional NPS can be useful. The timing of the survey matters, and keeping it anonymous is vital for accurate responses. Also measure end to end: the whole journey of that user interacting with the security apparatus, not just the responses from the training.
Question: How do we take it to the next level? What will the next webinar be on?
Answer: We throw the question back out to you.
Question: How do you convince non-security peers that the checkbox training approach doesn’t work? How do you get people to move past the basic mandatory training and focus on the more advanced training that you know is valuable?
Answer: Discover the issues that concern them, and turn it back on them. Tell them that you can help them get security out of the way. Anticipate what they are going to want to do in terms of a time commitment and the structure of the training.
Question: I’ve faced situations where management thought that the awareness programs were enough to build a “human firewall” and neglected to invest in tools for detection and response. What would you recommend to avoid this situation?
Answer: I hate the phrase "human firewall". Humans are not firewalls. They are not your first line of defense. They are people. They should not be treated as part of the security fabric. If you prepare for an increase in report rate, it will put stress on incident response. Also, I spend time educating my leadership about what the awareness program is and aligning expectations for the program. I also do a roadshow with the departments.