
The security industry needs to re-align its training expectations for QA

I've been involved in the security community for over 10 years and have worked for small, medium, and large companies. I have also worked in Quality Assurance, and I base my comments here on my experience as a QA tester and on speaking with QA testers as an outsider. I've seen advice in articles and at conferences discussing the need for security training for development and, in the last 4-5 years, for quality assurance.

QA understands the business use cases provided to them and focuses on ensuring that those use cases work (positive testing). Good QA people add negative testing to this mix, typically to generate errors or crash things and confirm the platform is fairly stable. The majority of QA people aren't interested in becoming security engineers or in developing a thorough understanding of vulnerabilities such as SQL injection, OS commanding, or HTTP response splitting. You may be lucky at your company and have a few who do care about these details, but as a general rule they are in short supply and rarely sustainable.

Good training programs should use wording that makes sense to QA. For example, most security-related input validation testing would be classified as negative testing. Something as small as terminology can go a long way toward communicating why an organization needs to test for a given issue.
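
To illustrate the framing point, here is a minimal sketch of a positive test and the same input validation check written as a QA-style negative test. The endpoint, parameter name, and pass/fail criteria are hypothetical and would come from your own application's requirements.

```python
# Hypothetical sketch (pytest-style tests using requests); the endpoint and
# field name are assumptions, not a real application.
import requests

BASE_URL = "http://qa-env.example.com"  # assumed QA environment

def test_search_accepts_valid_term():
    # Positive test: the business use case works as specified.
    resp = requests.get(f"{BASE_URL}/search", params={"q": "widgets"})
    assert resp.status_code == 200

def test_search_handles_hostile_input_gracefully():
    # Negative test: malformed/hostile input should produce a controlled
    # error, not a crash (5xx) and not an unencoded echo of the raw input.
    payload = "' OR '1'='1"
    resp = requests.get(f"{BASE_URL}/search", params={"q": payload})
    assert resp.status_code < 500
    assert payload not in resp.text
```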

Much of the QA-focused security training discussed in the industry involves training QA on the weaknesses/attacks/vulnerabilities specified in a top 10/25 list. While top 25 lists can provide good insight into the issues you may be concerned with, I don't think this is always the best approach. A better approach, in my opinion, is to identify the top 10/25/x attacks/weaknesses/vulnerabilities that are likely to affect your own organization and to:

- Identify which issues require a human to identify
    - Can this be identified by QA?
    - Can this only be identified by development?
- Identify which ones can be tested in a repeatable, automated fashion (see the sketch after this list)
    - Using existing QA tools
    - Using security tools in the QA department
    - Identify which vulns are automatically identifiable in a reliable fashion based on the tools you have available to you / after a proper tool evaluation
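
As a rough illustration of the "repeatable, automated fashion" item above, the sketch below shows one way a small payload list could be driven through an existing QA harness. The payloads, endpoint, and flag-for-review heuristics are assumptions you would replace after your own tool evaluation.

```python
# Hypothetical sketch: data-driven negative tests in an existing QA harness
# (pytest). The endpoint, field name, and detection heuristics are assumptions.
import pytest
import requests

BASE_URL = "http://qa-env.example.com"  # assumed QA environment

PAYLOADS = [
    "<script>alert(1)</script>",   # reflected XSS probe
    "'; ping -c 1 127.0.0.1; '",   # OS commanding probe
    "<foo><![CDATA[bar]]></foo>",  # XML injection probe
]

@pytest.mark.parametrize("payload", PAYLOADS)
def test_comment_field_does_not_reflect_or_crash(payload):
    resp = requests.post(f"{BASE_URL}/comments", data={"body": payload})
    # These are flag-for-review heuristics, not proof of exploitability.
    assert resp.status_code < 500, "Server error: flag for security review"
    assert payload not in resp.text, "Input reflected verbatim: flag for security review"
```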

For manual issues, having good test plan templates (when possible) for certain classes of flaws can go a long way. QA testers speak test plans, and test plans cover steps and expected behaviors. I am a firm believer that many of the vulnerabilities in software that pen testers gloat about being able to find are really rather trivial and can be taught to anyone. Writing sample test cases for identifying OS commanding, reflective XSS, or XML injection is achievable. Keep in mind that 'average' QA testers won't necessarily be able to exploit a flaw, but given the right instruction they can flag something as needing further review. A rough example of such a template follows.
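
The template below is purely illustrative - the test case ID, steps, and expected results are hypothetical and would be adapted to your own application and environment.

```
Test case:       TC-SEC-014 (hypothetical) - Reflected XSS in the search field
Preconditions:   Tester can reach the search page in the QA environment
Steps:
  1. Navigate to the search page
  2. Enter <script>alert('qa-test')</script> in the search box and submit
  3. View the results page and its HTML source
Expected result: The input is rejected, or displayed encoded (e.g. &lt;script&gt;)
Flag for review: A JavaScript alert appears, or the string <script> appears
                 unencoded in the page source - log a defect and notify your
                 security contact for further review
```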

Last but not least, let QA know who in your organization is available for questions about these security tests.

Comments welcome :)

Comments






so true, so true.


There is no standard SQA/SQC organization or individual. There is barely even a standard for developers. I know of at least two major SQE certifications that differ wildly. There are at least two major Java developer certifications, but really only one .NET and one PHP (although multiple Oracle and MySQL ones).

All of these are better than Software Security Engineering, which has just about nothing -- even worse than nothing if you include CSSLP or GSSP.

There are a multitude of issues that arise when comparing developer-testing to quality testing to security testing, all of which are highly dependent on several factors. I would argue that since security is a requirement, if a quality test group exists, then that group is somewhat responsible for testing for those security properties -- and that this would obviously be considered positive testing (not negative testing). However, this is a very popular argument, and somewhat obvious.

This is really uninteresting unless you are talking about case studies. Microsoft Press has a book, "How We Test Software at Microsoft". They have a lot of standards around development, testing, security testing, et al. Many other organizations are not as mature as Microsoft. They are somewhere else, and there is a lot of variety, even between market horizontals -- ISVs and Enterprises -- as well as industry verticals -- Financial, E-Commerce, Health, Energy, Government, et al.

Some organizations don't have a well-defined software test group, or it's completely outsourced. Some, like Google, call their developer-testers SQEs, when really they are developers. I'm not sure that you can make the generalizations you do.

Certainly, there is a lot of overlap between quality test automation tools and security testing tools, just like there is a lot of overlap between development tools and quality testing tools / security testing tools. For example, you'd see many a web developer using FireBug, but you'd equally see it in the hands of a software security engineer (even if it's used a bit differently).

I consider the quality testing tools to be better than a lot of the security testing tools today. For example, I much prefer Sahi, Selenium, Ghost Train, or Chickenfoot to the Ajax testing engines in AppScan, WebInspect, or OWASP Sprajax. However, it's probably much easier to use a tool like RATS or Fortify on the Javascript source code.

In the same way, fuzz testing tools such as Sulley or PaiMei (even file fuzzing) work a lot less efficiently than using Klocwork K7 (defunct?) would. Even when using lcov or other code coverage statistics, it's much better to go directly to the source code for the bug hunt (or at least have it available to utilize these kinds of code comprehension techniques).

No, quality testing can and should focus on whatever developer-testing and/or security testing is missing. Of course it should be mostly positive, functional testing based on the use cases. However, using quality testers to augment or replace developer-testers and/or software security engineers is not something that I recommend.

Keep the three groups working on their separate goals, although they most certainly should be working together and cross-training on the areas/expertise where they do overlap. In other words, SDET's, SQE's, and SSE's all perform basically the same function and require very similar technical/tactical skills. But SDET's are rewarded for defect reduction before integration, while SQE's are rewarded before acceptance testing, and SSE's are also rewarded before acceptance testing -- but only on security-related defects. Strategically, they are working on completely different goals and business drivers -- and each requires its own set of background experience and knowledge that is vastly different than their neighbor.

Tools have nothing to do with this. There shouldn't be a DevInspect, QAInspect, WebInspect, etc. There should only be one. Can SDET's, SQE's, and SSE's all use FindBugs? Of course. Coverity? Yes, definitely. Canoo WebTest? Why not?

As you can see, I don't disagree with you, but I just think differently. Top X "bad stuff" lists are usually great for the tools, not the individuals or teams. Instead, those teams should index their findings by method/person/tool (probably using some sort of reputation system), and decide on their own which set of process, talent, and technology to put towards any given defect-prevention cycle or bug hunt.


"Developers by nature are detail oriented and typically (the good ones anyways) have a deep understanding of flows, and processes from start to finish" - so is a good tester

"The majority of QA people aren't interested in becoming security engineers or having a thorough understanding of vulnerabilities such as sql injection, os commanding, or http response splitting" - I wish I could disagree with this as a lot of the testers I hang around love knowing the techy details. Sadly though they are in the minority and I keep coming across questions from people wanting to know how to do security testing wihtout knowing even the basics

Interesting article apart from the phrase "QA tester" which I hate...


I'm going to echo my colleague Dennis Hurst's "so sad" comment - but I'm going to go a little deeper.

I've written some articles and have been giving [educational, not sales] presentations to QA teams for just over a year now and feel strongly (as you seem to) that QA doesn't necessarily need to understand security... but therein lies the problem of compartmentalization of issues.

Software "security" has typically been the *security team's* problem, and with that every time someone in a different group hears security they automatically assume security will take care of the problem. Unfortunately, those of us IN security understand that's simply not possible. This leaves us with a bit of a problem - but I'll address that when I publish my next article... for now I'll make 2 points

1) When you talk about terminology remember to use "defect" instead of "vulnerability". I've discussed this at length so I won't re-state but ask yourselves how those words are perceived by someone who is a QA engineer.
2) Consider workflow-based "security defects"... which 'scanner' products are historically tragic at finding (for good reason). QA testing is critical-path here... as they understand the possible use-case paths through the application.

Anyway, without getting too long-winded (oops) you're 100% correct. There needs to be more on this topic, and we need to reach QA departments with more than just tools and "gotta-do" orders.

Cheers
./Raf


@Phil

I don't disagree that good testers exist and are capable. I was making a point about the culture/skills/drive of 'average' QA people in my experiences.


We have been doing this kind of training for QA people for around 8 years now with good/excellent results, so I'm a bit surprised this needs realigning. We are a Fortune Global 200 company (I don't want to be more specific, as it would reveal the company).


@Anonymous
I never said that nobody was doing this correctly. The general vibe from the security industry is to teach QA people a list of vulnerabilities/attacks from some 3rd-party top ten list and either say they've done their job or, if it doesn't work out, blame QA for not absorbing the material as they'd hoped.

Gunnar posted an entry today echoing this:

"'Good training programs should use wording that makes sense to QA' - totally agree, they are not going to read the whole OWASP guide as much as we might like them to."

at http://1raindrop.typepad.com/1_raindrop/2009/02/allies-are-where-you-find-them.html .

Your company should talk about this as not many people are. Use the contact form to contact me directly so we can chat.


Having worked for companies that are supposed to be meeting DoD, FAA, FCC, or FDA standards, I can tell you little is being done to implement "Structured Testing" (Google "Structured Testing" +NIST to find it), much less data-driven testing or fault injection testing - and absolutely no mathematical modeling. Every manager I have worked with (or interviewed with) has said something to the effect of "Those things are way over our heads here - maybe someday...".

Since there is not a single QA/QC organization that requires an engineering degree to join, I see the situation as hopeless (IEEE/CS started a “Certified Software Engineer” program, but got scared and dropped “Engineer”). Good thing we have all the Six Sigma Black Belts to save the day (look at the MOT and BAC stock prices for the last few years to see how that worked out). Read up on Deming’s rules about “Drive fear from the organization” (20/70/10 rule) and “No slogans” (Six Sigma).
