Software Security - Application Security Verification Project

OWASP Open Source Review Project

In groups of about 4 students, you do a security code review of a small web application, using the OWASP Application Security Verification Standard (ASVS), available as a PDF, and using static analysis tools to support the manual code review.

Groups

See the current list of groups here!

Goals

Goals of the project include getting hands-on experience with security code review, with the ASVS, and with static code analysis tools. We realise that this is throwing you in at the deep end: some of you will have more experience with web applications, PHP, etc. than others. Finding security vulnerabilities in the application is less important than having a sensible & well-argued opinion about the ASVS, static code analysis tools, and the quality and design of the code in the end.

Whether you find all or indeed any security vulnerabilities is not so important, so resist the temptation to get ideas from other groups. Indeed, we are hoping to use this experiment to get some empirical data on code reviews -- how many eyeballs make for secure code?

Practicalities

As a guide: the goal is that everyone spends around 50 hours in total on this project. You still have around 12 weeks, so plan on spending a morning or afternoon per week.

The first time you get together with your group, fill in the questionnaire [odt] [doc] together and email it to Erik Poll.

What to do

For the given application, check the Verification Requirements listed in OWASP ASVS [PDF] EXCEPT FOR V13 and V17.
Begin with a Level 2 verification (i.e. Standard Level); if you have time, move on to a Level 3 verification (i.e. Advanced Level).
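
Many of these requirements boil down to looking for specific patterns in the code or its configuration. As a purely hypothetical illustration (not taken from the application), a session-management requirement about cookie attributes could be checked by looking for PHP session configuration along these lines:

    <?php
    // Hypothetical example of the session cookie settings that several
    // session-management requirements ask for (PHP 7.3+ array syntax):
    session_set_cookie_params([
        'lifetime' => 0,        // session cookie, discarded when the browser closes
        'secure'   => true,     // only send the cookie over HTTPS
        'httponly' => true,     // hide the cookie from client-side JavaScript
        'samesite' => 'Strict', // do not send the cookie on cross-site requests
    ]);
    session_start();

If the application never sets such options anywhere, in the code or in its configuration, that is a quick and easy observation with which to motivate a verdict.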

Some things you may run into:

What to report

At the end, you have to provide a report which gives a motivated verdict for all the verification requirements, and which provides some reflection on the whole process, including the ASVS and the use of static code analysis tools. This report MUST be in PDF and mention your names and group number on the front page.

A suggested organisation of the report is given in the sections listed below, but feel free to diverge from this if you think that makes sense. In this organisation the section giving the verdict would be the longest, simply because it has to list all the ASVS security requirements. Describing the organisation or process of the review might only take half a page or so, and the reflection would probably be somewhat longer than that.

  1. Organisation
    Briefly describe the way you organised the review.

    E.g. did you split the work by files in the code, or by category of security requirements? Did you double-check important findings? Or did you use several pairs of eyeballs on the same code / security requirement, in the hope that more eyeballs spot more problems? (How) did you integrate the static code analysis tools into this process? Did you use other tools and methods?
    Did you try to run the application? (If so, was this useful? Did running the application help you review the code, by giving you a better understanding of its functionality? You might want to discuss this in the Reflection section.)

  2. Verdict
    Give your judgement for each of the verification requirements, with a short motivation.

    The judgement could be PASS or FAIL, but don't hesitate to say DON'T KNOW if you cannot tell.

    If need be, come up with other sensible judgements besides PASS, FAIL, DON'T KNOW, etc.

    With respect to motivation: ideally you give a brief, concise justification for your verdict on a verification requirement, of say one short sentence. If a verification requirement is very clear and it is straightforward how one would check it, it may not be worthwhile to write up anything. Conversely, it can be quite hard to argue that some verification requirement is met: the violation of a requirement is often easier to motivate (namely with a counterexample, as in the sketch at the end of this item) than its satisfaction.

    If you resorted to sampling to judge some requirement (or group of requirements), or if you considered some configuration aspects out of scope, that would be worth mentioning too. You could also say that in Section 1, if that makes more sense.
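
    As an illustration of such a counterexample (hypothetical, invented here and not taken from the application): a FAIL on an injection-related requirement is convincingly motivated by a code fragment like

        <?php
        // Hypothetical counterexample: user input flows straight into a SQL
        // query, so the injection-related requirement FAILs. ($db is assumed
        // to be an open mysqli connection.)
        $id = $_GET['id'];
        $result = mysqli_query($db, "SELECT * FROM users WHERE id = " . $id);

        // For contrast, a prepared statement with a bound parameter:
        $stmt = mysqli_prepare($db, "SELECT * FROM users WHERE id = ?");
        mysqli_stmt_bind_param($stmt, "i", $id);
        mysqli_stmt_execute($stmt);

    Arguing the PASS direction would require convincing yourself that no such fragment exists anywhere in the code, which is much more work.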

  3. Reflection
    Reflect on the whole process, including the ASVS and the use of static code analysis tools. For example, questions to consider are:

  4. (optional) Appendix: vulnerabilities
    Instead of describing vulnerabilities that you came across as part of the motivation for your verdict in Section 2, you could also move the details of these vulnerabilities to an appendix, say in a numbered list, which you can then refer to from the 'Verdict' section.

On June 19 we'll have some discussion in class to compare results.

Pointers to code analysis tools


Misc sources of information about PHP


Documentation generation tools

The tools below automatically generate some documentation and API information from source code.
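
For PHP, such tools typically work from structured docblock comments. As a sketch (assuming phpDocumentor-style @param/@return tags; the function below is made up for illustration), a function annotated like this would show up in the generated API documentation:

    <?php
    /**
     * Check whether the given user may view the given document.
     *
     * @param int $userId     ID of the user requesting access
     * @param int $documentId ID of the requested document
     * @return bool           true if access should be granted
     */
    function mayViewDocument(int $userId, int $documentId): bool
    {
        // The body is irrelevant to the documentation tool, which only
        // reads the docblock; this is just a placeholder check.
        return $userId > 0 && $documentId > 0;
    }

Generated API overviews like this can help you find your way around unfamiliar code before diving into the security review itself.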