Use O2 on Owasp Benchmark

Dinis Cruz

Aug 19, 2015, 3:09:17 AM
to O2 Platform

That would be a nice way to show O2 capabilities :)

See thread below for more details

---------- Forwarded message ----------
From: "johanna curiel curiel" <johanna...@owasp.org>
Date: 19 Aug 2015 01:29
Subject: Re: Project Reviews -New volunteers introduction
To: "Jim Manico" <jim.m...@owasp.org>
Cc: "me" <timog...@runbox.com>, "projects-...@owasp.org" <projects-...@owasp.org>

Hi Jim

I'm just brainstorming what it will require to use this properly. For example:

We must have a vulnerable web application and know all its vulnerabilities beforehand (like WebGoat or Security Ninja). That's what Dave did (using WebGoat):
  • Run the tools against the vulnerable app
  • Evaluate the results of the tool against the real bugs
  • Evaluate false positives and accuracy based on these results
  • Evaluate how accurately Benchmark hit the spot 😉
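The comparison in the steps above boils down to matching a tool's reported findings against the known vulnerability list. A minimal sketch of that scoring (all test-case names and numbers are made up for illustration; this is not the Benchmark's actual scoring code):

```python
# Sketch: score a tool's findings against a known-vulnerability list.
# All identifiers and data here are hypothetical, for illustration only.

def score_tool(known_vulns, tool_findings):
    """known_vulns and tool_findings are sets of (test_case, category) pairs."""
    true_pos = tool_findings & known_vulns    # real bugs the tool found
    false_pos = tool_findings - known_vulns   # findings that are not real bugs
    false_neg = known_vulns - tool_findings   # real bugs the tool missed
    tpr = len(true_pos) / len(known_vulns) if known_vulns else 0.0
    precision = len(true_pos) / len(tool_findings) if tool_findings else 0.0
    return {"true_positive_rate": tpr,
            "precision": precision,
            "false_positives": len(false_pos),
            "missed": len(false_neg)}

# Example: 4 known bugs; the tool reports 3 findings, of which 2 are real.
known = {("tc01", "sqli"), ("tc02", "sqli"), ("tc03", "xss"), ("tc04", "cmdi")}
reported = {("tc01", "sqli"), ("tc03", "xss"), ("tc99", "xss")}
print(score_tool(known, reported))
```

The set operations keep the logic obvious: anything reported but not known is a false positive, anything known but not reported is a miss.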
For example, its results show that ZAP is not so good at scanning and finding bugs automatically (SQL and Command Injection were ZAP's best scores at 32%; the rest were 0%).

I think it is like testing Benchmark and the rest of the tools at the same time.

I like the project, but the Benchmark results can be biased depending on what kind of vulnerable app it is testing and how well the tool executes its analysis attack against it. We have a couple of things here. The Benchmark could be good for testing tools that promote themselves as vulnerability scanners in the SAST and DAST categories.

The following OWASP tools fall under the SAST and DAST categories:


And the Benchmark only counts for automated-attack benchmarks.

I would like to have a chat with Dave to understand this better

So if there are volunteers that want to try it, please come forward and let us know.

regards

Johanna

On Tue, Aug 18, 2015 at 8:04 PM, Jim Manico <jim.m...@owasp.org> wrote:
So far, the following.

The Benchmark can generate results for the following tools:

Free Dynamic Analysis Tools (DAST):

If you have access to other DAST Tools, PLEASE RUN THEM FOR US against the Benchmark, and send us the results file so we can build a scorecard generator for that tool.

Free Static Analysis Tools (SAST):

Note: We looked into supporting Checkstyle and Error Prone, but neither of these free static analysis tools has any security rules, so they would both score all zeroes, just like PMD.

Commercial Static Analysis Tools (SAST):



More like, I suggest you evaluate it to see if it will help your cause. It might not be the right thing for what you are doing. But an application security benchmarking tool seems very cool. It has limited categories to test from, but for the categories they do have - the number of tests is quite large so it looks promising.

No, I do not know how accurate it is, but they have published a lot of data so far. Very interesting.

Yes, it will require volunteer efforts. Of course....

Again, just throwing an idea over the fence. You can delete and discard if you want, all good!

Aloha,
Jim




On 8/18/15 2:00 PM, johanna curiel curiel wrote:
Ok, so far which OWASP tools have been tested by Dave using Benchmark?

>And please Johanna, this is just a suggestion to consider. I do not want to force any process or tool onto you and your team.

I'm all ears, I just want to understand your suggestion and how much you know of the tool.

Like I said, if I understand you well:
- You suggest that we use Benchmark since it has multiple test cases
- We do not know yet how accurate Benchmark is, of course (it is an incubator project and still being tested)
- It will require volunteers to use Benchmark and test some of the projects that fall into this category

Agree?




On Tue, Aug 18, 2015 at 7:56 PM, Jim Manico <jim.m...@owasp.org> wrote:
No.... It's an evaluation test suite for any IAST, SAST, or DAST tool.

"The OWASP Benchmark for Security Automation (OWASP Benchmark) is a test suite designed to evaluate the speed, coverage, and accuracy of automated vulnerability detection tools and services (henceforth simply referred to as 'tools')."

" You can use the OWASP Benchmark with Static Application Security Testing (SAST) tools, Dynamic Application Security Testing (DAST) tools like OWASP ZAP and Interactive Application Security Testing (IAST) tools. The current version of the Benchmark is implemented in Java. Future versions may expand to include other languages."


On 8/18/15 1:51 PM, johanna curiel curiel wrote:
But the only OWASP tool tested here is ZAP, correct me if I'm wrong...

On Tue, Aug 18, 2015 at 7:49 PM, Jim Manico <jim.m...@owasp.org> wrote:
All I am saying is that IF you want to evaluate tools for their security ability, the OWASP Benchmark has the following test cases and capabilities. I am just mentioning this as something to consider.

Vulnerability Area           Number of Tests   CWE Number
Command Injection            2708              78
Weak Cryptography            1440              327
Weak Hashing                 1421              328
LDAP Injection               736               90
Path Traversal               2630              22
Secure Cookie Flag           416               614
SQL Injection                3529              89
Trust Boundary Violation     725               501
Weak Randomness              3640              330
XPATH Injection              347               643
XSS (Cross-Site Scripting)   3449              79
Total Test Cases             21,041
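For context on how a tool is scored against a table like this: the Benchmark's published scorecards compare, per category, a tool's true positive rate (real bugs found) against its false positive rate (safe test cases wrongly flagged). A minimal sketch of that style of per-category score (the numbers below are hypothetical, not real tool results):

```python
# Sketch of a Benchmark-style per-category score: true positive rate minus
# false positive rate. The counts below are made up for illustration.

def category_score(tp, fn, fp, tn):
    """tp/fn: outcomes on truly vulnerable test cases; fp/tn: on safe ones."""
    tpr = tp / (tp + fn) if (tp + fn) else 0.0  # fraction of real bugs found
    fpr = fp / (fp + tn) if (fp + tn) else 0.0  # fraction of safe cases flagged
    return tpr - fpr  # 1.0 is ideal; 0.0 is no better than flagging everything

# Hypothetical SQL Injection run: 1200 of 1800 vulnerable cases detected,
# 300 of 1729 safe cases flagged incorrectly.
print(round(category_score(1200, 600, 300, 1429), 3))
```

Subtracting the false positive rate matters because a tool that simply flags every test case would reach a 100% true positive rate while being useless in practice.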

OWASP Benchmark Project

The OWASP Benchmark for Security Automation (OWASP Benchmark) is a test suite designed to evaluate the speed, coverage, and accuracy of automated vulnerability detection tools and services (henceforth simply referred to as 'tools'). Without the ability to measure these tools, it is difficult to understand their strengths and weaknesses, and compare them to each other. The OWASP Benchmark contains over 20,000 test cases that are fully runnable and exploitable.

You can use the OWASP Benchmark with Static Application Security Testing (SAST) tools, Dynamic Application Security Testing (DAST) tools like OWASP ZAP and Interactive Application Security Testing (IAST) tools. The current version of the Benchmark is implemented in Java. Future versions may expand to include other languages.



On 8/18/15 1:44 PM, johanna curiel curiel wrote:
@Timo
> ...is to also verify that a project (if it is a software project), can actually perform the function that is supposed to and if it makes any claims in terms of its abilities to verify if these claims are true.

I agree, but this mostly depends on how much time reviewers have to use and test the tool.
Mario suggested that we set as criteria that projects have a demo installation video, which is a great idea and makes the process of reviewing easier, but it does not per se tell us whether the tool works or not.

You need to install it and test it. In other words, we are testers; so far the problem has been that people do not have time to test at this level.


@Jim, you suggest that we (the reviewers, aka testers) use the Benchmark tool as test cases?



On Tue, Aug 18, 2015 at 9:40 AM, me <timog...@runbox.com> wrote:
I don't know if I saw this mentioned in Johanna's email, but another
important thing in my opinion as part of reviews is to also verify that
a project (if it is a software project) can actually perform the
function that it is supposed to, and if it makes any claims about its
abilities, to verify whether these claims are true.

So for example, if you make a tool that can find SQL injection in a web
application, you should test it on a vulnerable app and see if it can
actually find SQLi.

Regards.
Timo

On 13/08/2015 14:42, johanna curiel curiel wrote:
> Hi All
>
> Thank you for reacting to this call. Take the time to read carefully the
> following text ;-).
>
> We are looking for volunteers to help us review the current project
> inventory, especially Code & Tool projects.
>
> For that purpose, in 2008 and later in 2013, a similar group was
> gathered to help review new upcoming projects and also to check their
> progress. That group worked for a period of 6 months to create
> health & quality criteria to help evaluate the projects.
>
> 2008 criteria
> https://www.owasp.org/index.php/Category:OWASP_Project_Assessment
>
> 2013 criteria
> https://www.owasp.org/index.php/Category:OWASP_Project#tab=Project_Assessments
> https://docs.google.com/spreadsheets/d/1upIyG0L-P-myUM6EPg0aJmCTDvJrdqaVdnjdNBME9is/edit#gid=1
>
> Our goal is to observe each project's development, observe its adoption by the
> community, and gather some basic information to safeguard a minimum
> quality, such as:
>
> _For Code & Tool projects_ holding Flagship & LAB status, we closely
> monitor their health every 6 months on the following, among other key
> indicators:
>
>   * Can the project be built correctly?
>   * Has the project had any activity (commits) in the last 6 months?
>   * Has the project had any releases in the last 6 months?
>   * Have the project leaders updated their wiki or website to reflect
>     the latest releases?
>
>
> As you can see, that requires you to:
>
>   * Get the source code and latest release installation executable (if
>     available)
>   * Check if it builds and installs without issues
>   * Take screenshots and notes of the issues found
>   * Determine how well documented the project is for new devs/volunteers
>     willing to join that project
>
> _For documentation projects_
> we check the health criteria. Quality of text can be quite subjective, and
> we have not found this easy to assess, so we base our judgment on minimum
> health criteria and the reactions from the
> community. (https://docs.google.com/spreadsheets/d/1upIyG0L-P-myUM6EPg0aJmCTDvJrdqaVdnjdNBME9is/edit#gid=1)
>
>
> Another check I do is to see how the document is being referred to in
> publications (through Google Books, Amazon, and Safari). If the project is
> heavily used in major and important books, that already tells us
> about its adoption in mainstream publications.
>
>
> The report results of some major reviews are found here:
> https://www.owasp.org/index.php/LAB_Projects_Code_Analysis_Report
>
>
> We communicate first with the project leaders to clarify any issues
> encountered during this process. Based on their reaction, a final
> report is released.
>
> As you can see, this is quite a lot of work: taking the time to get the
> code, read documentation, build the project, and smoke test it or do some
> research.
>
> Now that you know what this process involves:
>
>   * Take the time to first read all the wiki pages regarding Assessments
>   * I'm open to suggestions, but please be practical. Most volunteers do
>     not have a lot of time and these reviews require a lot of work
>   * Determine if you have time
>   * Decide which kinds of projects you are interested in reviewing
>   * Decide how many projects you would like to review in the upcoming 3 months
>   * Get back to me (cc project-task-force) and let us know your
>     thoughts and whether you are able to contribute
>   * Questions: I'm available through the day on Hangouts or Skype.
>
> Thank you for your time and consideration
>
>
> Best regards
>
> Johanna Curiel
> Team leader Project Review Task force
>
>
>
>
>
> --
> You received this message because you are subscribed to the Google
> Groups "OWASP Projects Task Force" group.
> To unsubscribe from this group and stop receiving emails from it, send
> an email to projects-task-f...@owasp.org
> <mailto:projects-task-f...@owasp.org>.
> To post to this group, send email to projects-...@owasp.org
> <mailto:projects-...@owasp.org>.
> To view this discussion on the web visit
> https://groups.google.com/a/owasp.org/d/msgid/projects-task-force/CACxry_1DPrTXp%3D2uh_5g%3DCtUobq7J%3D%3DFjx%2B7PX89vOKrt83Jgg%40mail.gmail.com
> <https://groups.google.com/a/owasp.org/d/msgid/projects-task-force/CACxry_1DPrTXp%3D2uh_5g%3DCtUobq7J%3D%3DFjx%2B7PX89vOKrt83Jgg%40mail.gmail.com?utm_medium=email&utm_source=footer>.


-- 
Jim Manico
Global Board Member
OWASP Foundation
https://www.owasp.org
Join me at AppSecUSA 2015!
