Software Security - Application Security Verification Project
OWASP ASVS Source Code Review Project
In groups of 4 or 5 students, we do a security code review of a small
web application, using the
OWASP ASVS (Application Security Verification Standard),
available as
ASVS 3.0.1 (PDF)
and using static analysis tools to support the manual code review.
Groups
See here the current list of groups
Initial findings using the static analysis tools
Goals
Goals of the project include
- Experiencing the process of doing a code review, where you struggle with limited time, people, and knowledge - and too much code without good design documentation;
- Finding out about the capabilities and limitations of code analysis tools;
- Finding out how good current best practices for doing security code reviews are, in particular the OWASP ASVS; the ASVS is one of the mature documents about (web-)application security.
- Getting some idea of how security design decisions should be made and documented to facilitate security reviews.
We realise that this is throwing you in at the deep end. Some of
you will have more experience with web applications, PHP, etc.
than others. Finding security vulnerabilities in the
application is less important than having a sensible &
well-argued opinion about the ASVS, static code analysis tools,
and the quality and design of the code in the end.
Whether you find all or indeed any security vulnerabilities is
not so important, so resist the temptation to get ideas from
other groups.
Practicalities
As a guide, the goal is that everyone spends around 30 hours in
total for this project. You still have around 10 weeks till
Xmas, so spend a morning or afternoon a week.
The first time you get together with your group, fill in the questionnaire
[odt]
[docx]
together and email it to Erik Poll.
What to do
For the given application, check Verification Requirements V2 up to and incl. V9 and V11 in the
OWASP ASVS 3.0.1 [PDF]. We'll skip requirements 2.21, 2.26, 4.5, 4.15, 4.16, 5.1, 7.9, 9.5, 9.10, 11.3, because they are not applicable and/or not in scope if we only look at the code and not at a particular installation.
Use this Excel sheet for collecting
your results.
All the requirements to skip are already marked there.
- As a very first step, look at the output of the automated code
scan (see info in Brightspace) and map the tool warnings to the ASVS requirements.
Make a list of the requirements that the tool produced a
warning about, and for each of these list how many warnings you get
(and, if you already know, whether these are true or false positives).
We'll compare preliminary findings November 16.
This provides a rough indication of the kind of things that
automated code scanners can help with. Of course, there may
be false positives here, and there will be plenty of false
negatives. Also, not all warnings may map easily to requirements.
- Then dig deeper: begin with a Level 1 verification (i.e. Opportunistic);
if you have time, move on to a Level 2 verification (i.e. Standard).
The findings of the tools are one place to start, but once you've
confirmed them as true positives (or dismissed them as false positives),
this still leaves the remaining requirements to check.
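Mapping warnings to requirements and counting them can be done by hand, but a small script helps keep the tally consistent across the group. Below is a minimal sketch in Python; the input format, the warning categories, and the mapping to ASVS requirement numbers are all made-up placeholders that you would replace based on the actual output of your tool.

```python
# Sketch: tally static-analysis warnings per ASVS requirement.
# The warning categories and the mapping below are illustrative placeholders;
# build the real mapping from your tool's documentation and the ASVS text.
from collections import Counter

WARNING_TO_ASVS = {
    "sql-injection": "5.10",   # made-up mapping, for illustration only
    "xss": "5.16",
    "hardcoded-password": "2.29",
}

def tally(rows):
    """Count warnings per ASVS requirement; unknown categories go to 'unmapped'.

    `rows` can be any iterable of dicts, e.g. a csv.DictReader over the
    tool's exported report (assumed columns: file, line, category)."""
    counts = Counter()
    for row in rows:
        counts[WARNING_TO_ASVS.get(row["category"], "unmapped")] += 1
    return counts

sample = [
    {"file": "login.php", "line": "42", "category": "sql-injection"},
    {"file": "view.php", "line": "10", "category": "xss"},
    {"file": "view.php", "line": "31", "category": "xss"},
]
print(tally(sample))  # Counter({'5.16': 2, '5.10': 1})
```

The same counts can go straight into the Excel sheet, together with your true/false-positive judgement per warning.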
Some things you may run into:
- The need for sampling?
Some verification requirements will be too labour intensive
to check exhaustively for the entire codebase. In such cases
you might resort to sampling, i.e. only check the requirements
in a few places, and then base your verdict on that sample.
Different strategies are possible for this:
(i) you could choose a random sample (e.g. randomly pick some
files of source code, or pick some part of the interface of the
application to investigate in detail), or (ii) you could decide to
focus on the most security-critical parts of the code,
or (iii) use a combination of these approaches.
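Strategy (i) can be as simple as a seeded random pick of files, so that every group member reviews the same sample. A sketch (the directory name, file extension and sample size are placeholders):

```python
# Sketch of sampling strategy (i): a seeded random pick of source files,
# so every group member reviews the same sample.
import random
from pathlib import Path

def sample_files(root, n, seed=0):
    """Return up to n randomly chosen .php files under root.

    Sorting first and seeding the RNG makes the sample reproducible,
    which matters if the group wants to split or re-check the work."""
    files = sorted(Path(root).rglob("*.php"))
    random.Random(seed).shuffle(files)
    return files[:n]

# Example usage: pick five files of the codebase to review in depth.
# for path in sample_files("testcms/", 5):
#     print(path)
```

Whichever strategy you choose, record it, so the report can say what your verdict is based on.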
- Code vs configuration?
At some places you may run into issues which depend not so much
on the code but rather on the configuration of a particular
installation of the code. The border between code and
configuration is sometimes hard to draw; moreover, one can debate
about how defensive code should be when it comes to
facilitating/enforcing good and secure configuration.
The bulk of the requirements that are about configuration are
already excluded from the list we look at.
If you come across such issues, it is better to concentrate on the code
first and not to get distracted into considering configuration
issues. It is then good to document where you drew the line
in doing the review. If you note that there are or may be important
configuration issues that impact security, it is of course useful
to flag these in your report.
- Finding one vs all security flaws?
A single security flaw of a certain kind can already show
that some verification condition is broken. So if you find one
flaw, it is not
necessary to then go hunting for more flaws of the same type.
If you do notice that some problem occurs multiple times,
then it is good to record this, as it suggests the verification
condition is broken consistently, and that you have not just stumbled
on the one place where the problem occurs.
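To illustrate why one flaw suffices for a FAIL verdict: a single spot like the first query below (a hypothetical example in Python with SQLite, not taken from the application under review) already violates the ASVS requirement on parameterised queries, no matter how careful the rest of the codebase is.

```python
# Hypothetical illustration: one injectable query is already a counterexample.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "' OR '1'='1"

# FAIL: string concatenation lets the input rewrite the query,
# so this one line is already enough to break the requirement.
rows = conn.execute(
    "SELECT secret FROM users WHERE name = '" + attacker_input + "'"
).fetchall()
print(rows)  # leaks the secret despite the bogus name

# PASS: a parameterised query treats the input as data, not SQL.
rows = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(rows)  # [] -- no user with that literal name
```

Finding one such spot settles the verdict; noting whether the pattern recurs is extra evidence, not a prerequisite.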
- Relevance/applicability
Some of the requirements are not applicable or
relevant for a given application. Mark these as being
'trivially' passed.
- Unclarity
For some security requirements you may be unable to provide
a judgement, not only because of the issues above, but also
because the security requirement may not be clear to you
or because it is unclear to you if the code meets the
requirement as you understand it, or both.
What to report
At the end, you have to provide
- a summary of your verdict in this Excel checklist
(to quickly compare results in class.)
This is due in December.
- a report which gives a motivated
verdict for all the verification requirements, and which
provides some reflection on
the whole process, including the ASVS and the use of static code
analysis tools. This report MUST be in PDF and mention your
names, RU student numbers, and group number on the front page.
This is due in January.
A suggested organisation of the report would be in the sections
as listed below, but feel free to diverge from this if you think
that makes sense. In the organisation below the section giving the
verdict would be the longest, simply because it has to list all
the ASVS security requirements. Describing the organisation or
process for the review might only take half a page or so, and the
reflection would probably be longer than that.
- Organisation
Briefly describe the way you organised the review.
E.g. did you split the work by files in the code, or by
category of security requirements? Did you double-check important
findings? Or did you use several pairs of eyeballs on the same
code / security requirement, in the hope that more eyeballs spot
more problems? (How) did you integrate using the static code
analysis tools into this process? Did you use other tools and
methods?
Have you tried to run the application? (If so, was this
useful? Did you find that running the application helped you
review the code, by making its functionality easier to understand?
You might also want to discuss this in the Reflection section.)
- Verdict
Give your judgement for each of the verification
requirements, with a short motivation.
The judgement could be
- PASS
- FAIL
- DON'T KNOW,
if you are unsure of whether the code meets the requirement,
if you are unsure what the ASVS requirement means, or
if you did not have time to look at it.
- NOT RELEVANT,
if a given requirement does not apply, and therefore you don't
need to look at it. So it is effectively a PASS, for the
trivial reason that it does not apply.
If you need additional judgements besides these, feel free to
introduce them, but -- as always -- do make sure your use
of terminology is consistent.
With respect to motivation: ideally you give a brief
justification, of say one short sentence, for your verdict on a
verification requirement. But if a verification requirement is
very clear and it is straightforward to check, it might not be
worthwhile to write up anything.
Conversely, sometimes it may be quite hard to argue that some
verification requirement is met: the violation of a requirement
is often easier to motivate (namely with a counterexample) than
the satisfaction.
If you resorted to sampling to judge some requirement (or group of
requirements) or if you considered some aspects out
of scope (e.g. because it is related to the configuration rather
than the code), that would be something to mention too. You could
also say that in Section 1, if that makes more sense.
- Reflection
Reflect on the whole process, including
- the ASVS,
- the use of static code analysis tools,
- the way you organised the process,
- and possibly also the TestCMS code.
For example, questions to consider are:
- How good (useful, clear, ...) is the ASVS? How could it be
improved?
- How useful were code analysis tools? How could they be
improved?
How did you experience the rates and amounts of false and true
positives? How might that be improved?
- What were the bottlenecks in doing the security review in your
experience?
- Maybe in the points above you can distinguish different
(types of) security flaws or verification requirements.
E.g., are some (categories of) verification requirements
easier to check than others?
- If you had to do something like this again, what would you do
differently? E.g. about organising things within the group:
in retrospect, what do you think the
best approach is to organise and divide the work in a team? (Dividing the verification requirements over the
team members? Or by dividing the code? Or letting everyone
look at everything, because different people will spot different
things? Or work in pairs where one person confirms the findings
of the other? ...)
- About the TestCMS code: are there important aspects that could (or
should) be changed to improve security? Or aspects that could be
changed to facilitate doing a security review? (E.g. in certain
design decisions you'd like to see explicitly described in
documentation accompanying the code.)
- (optional) Appendix: vulnerabilities
Instead of describing vulnerabilities that you came across as
part of the motivation in your verdict in Section 2, you could
also move the details of these vulnerabilities to an appendix,
say in a numbered list, that you can then refer to in the `Verdict'
section.
At the last lecture we'll have a discussion in class to compare results.
Static code scanning
Info about this is in Brightspace.
Tool support for the ASVS
If you want, you can use the
Security Knowledge Framework (also available here), a tool that supports using the ASVS in development, with pointers to more background knowledge, code samples, etc. It is geared more towards development than code reviews, but the pointers to background info on the ASVS may be useful. For more info, there is a screencast walking you through the online demo.
Misc sources of information about PHP
Documentation generation tools
The tools below automatically generate some documentation and API
information from source code, which might be useful to browse the
code. Of course, your favourite IDE for PHP might also have
support for this.