My experience with developer security training
I've been busy this past year, which has resulted in almost no updates to this site. Consider this one of many rants/posts about my experiences in the industry during that time. This post covers a topic I think many people implement poorly: security training targeting developers.
How most people implement developer-focused security training
I've developed security training programs at several large/medium enterprises, and have spent a lot of time reviewing existing training materials. I've found most of them:
- Focus primarily on learning the OWASP top ten
- Teach how to perform attacks
- Point people to third-party material that is almost entirely attack focused
- Tell people what not to do, but not what to do instead
- Fail to connect the material to internally discovered issues
- Don't discuss the consequences of a breach that occurred at a real company due to a software flaw
When I see training following this pattern, it's obvious it was created by an author with a security-only background.
Realities
"Developers care as much about security, as security cares about learning more about legal and compliance."
What do I mean by this statement exactly? Well, we know that legal and compliance are important, and if asked by either department to support them in some way, we will. Does this mean we're going to start buying books on law or compliance and turn them into our latest hobby? A small fraction of us might, but most of us will just see it as something that needs to be done, and we'll be glad to get it over with so we can focus on other, more interesting things. The reality is that most people in development know security is important, but will never consider a career in it, and aren't going to share the passion you might have for it.
A better approach: Six things to make security training more memorable
Accept that, at best, a double-digit percentage of the people taking your training will remember most of it. When building material, assume this will be true, and ensure that when you are trying to demonstrate a concept or teach a point, you present it in the best way for that audience. Don't drown the student in information; teach them only what they need to know. I've found the following approaches to be far more memorable to a developer than the attack-focused style of training. This is based on dozens of interviews with developers at several companies I've worked at.
- Tell them where to go for help: Repeating where to go for help is essential and must remain a main theme in your training. This can apply to patching, design decisions, vulnerability fixes, and more. Even if they don't remember the training, they do remember that the training tells them where to go for help. If they can't remember where to go for help, they can at least pull up the training material and look it up.
- Associate security concerns with development activities: A developer doesn't think about XML injection; they think about which XML parser to use and how to configure it. Split your training into topics such as file IO and permissions, transporting and storing data (XML, serialization, SQL, etc.), input and output handling, cryptography and data integrity, and authorization and authentication components. Within each category, mention the 'attacks' that can occur if certain considerations are not handled (see the parser sketch after this list). This brings us to the next item.
- Tell devs what they SHOULD do and which tools to use: Before rolling out a security training program, evaluate the data access layer, encryption solutions, networking libraries, and authorization/authentication libraries, and ensure they have no obvious gaps. Next, incorporate the use of these libraries into each section and let development know 'if you use xyz in this way, abc vulnerabilities should not occur' (see the query sketch after this list). If they find themselves needing something non-standard, tell them where to go for help (see the first point). Most people will fail to remember the attacks/weaknesses and how to mitigate them, but if you tell them that using X prevents a whole class of security issues, it tends to stick with them better. The focus should be on which APIs/libraries, design patterns, and features to use to solve a problem.
- Show them actual vulns in your code: For each section of your training, attempt to identify actual vulnerabilities in your product. Pull up the bug ID in your tracker and show the before-and-after code, along with the possible consequences had it been attacked. When no internal examples exist, find external examples of how a flaw embarrassed a company and focus on the damage it caused.
- Talk about 'vulnerabilities' as defects: Developers don't look at vulnerabilities; they look at bugs/defects. When speaking about 'vulnerabilities', I'd advise saying 'security defect' instead, and providing a mechanism within your bug tracker to flag whether an issue is security related. It makes you come off as less of an outsider. More information on bug tracker changes can be found in an article I wrote on QASEC.
- Create a training incentive program: One can argue people should just learn security because they want to be better developers, but the reality is this doesn't scale. Adobe was one of the first companies to promote its security incentive program (slide 3), where developers could earn 'levels' and gain respect internally. I was fortunate enough to speak with Adobe and other companies about how they run their programs and have incorporated those lessons into my own. I've adapted Adobe's belt format and injected my own modifications, such as prizes (security-labeled lanyards, mugs, Raspberry Pis, hacker DVDs, etc.) and additional training opportunities (such as a trip to DEF CON/Black Hat), with great success. It's also a great stepping stone for developers wanting to get into the security space.
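To make the "teach the parser and its configuration, not the attack" point from the second bullet concrete, here's a minimal Python sketch. It assumes the third-party defusedxml package is available, and the inline document is purely illustrative; treat it as one possible way to frame the lesson, not the canonical one.

```python
# Sketch only: demonstrate a safe-by-default XML parser configuration
# rather than walking developers through an XXE attack.
# Assumes the third-party defusedxml package (pip install defusedxml).
import defusedxml.ElementTree as SafeET

# Illustrative document that declares an external entity.
UNTRUSTED_DOC = """<?xml version="1.0"?>
<!DOCTYPE data [<!ENTITY x SYSTEM "file:///etc/passwd">]>
<data>&x;</data>"""

try:
    SafeET.fromstring(UNTRUSTED_DOC)
except Exception as exc:  # defusedxml raises EntitiesForbidden here
    print("parser rejected the document:", exc)

# A plain, trusted document still parses normally.
print(SafeET.fromstring("<data><item>ok</item></data>").find("item").text)
```

The training takeaway is simply "use this parser wrapper and entity-based tricks are off the table", which is far easier to remember than the attack itself.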
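Likewise, as a sketch of the "if you use xyz in this way, abc vulnerabilities should not occur" guidance from the third bullet, here's an assumed before/after pair using Python's standard sqlite3 module; the schema and data are hypothetical.

```python
# Sketch only: contrast string-built SQL with a parameterized query.
# The table and data below are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

def find_user_unsafe(name: str):
    # Before: concatenation lets attacker-controlled input rewrite the
    # query, which is the classic SQL injection defect.
    return conn.execute(
        "SELECT email FROM users WHERE name = '" + name + "'"
    ).fetchall()

def find_user_safe(name: str):
    # After: the driver binds the value, so input can never change the
    # query structure. The lesson to teach is "always use placeholders".
    return conn.execute(
        "SELECT email FROM users WHERE name = ?", (name,)
    ).fetchall()

print(find_user_safe("alice"))
```

Framed this way, the before/after pair also doubles as the internal bug walkthrough described in the fourth bullet.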
Conclusions
I expect some people will disagree with me. If you do disagree, and haven't rolled out a security training program before (to 100s-1000s of developers), then don't bother bitching. I can be found @robertauger on Twitter if you want to share your own experiences. These views do not represent those of my employer.
Excellent six points! I'd add two more:
- _DO_ use actual exploits to demonstrate vulnerabilities and let the attendees perform these exploits themselves. Nothing sticks better than seeing one of those theoretical weaknesses become an actual avenue for data loss/unauthorised access.
- Avoid using "insecure" and "secure" to describe functionality. Instead discuss security bugs in the context of a risk continuum. Have the attendees debate things like likelihood, impact and ease of exploitation. Not every security bug needs to be fixed, and they should be equipped for that debate.
Agree wholeheartedly with the approach that trainers should speak developers' language, and I'd say that the points you raise could be applied to security testing too. Security tests should pass, fail and error; they should be defined up-front. In short they should align more with QA and unit testing processes.
Posted by: Stephen de Vries | Jan 21, 2015 12:52:41 AM
Hi Robert,
interesting article.
We do many security trainings for developers. Our original agenda, some time ago, was 50% theory and 50% examples (nothing practical).
After a few years we modified the agenda, and now we focus on performing tests on real code and on how to fix it.
Real code to fix is the key to a successful knowledge transfer!
The Adobe approach is interesting for motivating the students (there is an error in the figure on page 3; it seems that brown and black belts are at the same level).
Cheers,
Mat
Posted by: Matteo Meucci | Jan 21, 2015 10:33:03 AM
One thing that also helps devs appreciate and maybe internalise secure coding is learning about falsification bias.
Posted by: Bedirhan Urgun | Jan 23, 2015 4:31:55 AM
"Even if they don't remember the training, they do remember that the training tells them where to go for help."
Excellent point! And if they remember who they can ask for help, they might be less likely to wade around and try to "fix" it on their own, which could mean they don't actually fix anything.
Posted by: Jessica Dodson | Jan 30, 2015 12:21:16 PM