[SAMM] Penetration tests

fabian....@optimabit.com

Sep 12, 2011, 3:58:39 AM
to OpenSAMM
Hello,

as part of my Bachelor's Thesis I was pondering on the subject of
penetration tests, the result of which you can see here:

http://code.google.com/p/opensamm/issues/detail?id=8

Basically, the tests as they are described in ST1B are not "penetration
tests" as described by e.g. BSIMM, and they also do not match my
experience working for a security company. Rather, I see the OpenSAMM
"penetration tests" as simple security tests.

By trying to stay repeatable, they lack an explorative approach to
vulnerability detection: "Once specified, security test cases can be
executed by security-savvy quality assurance or development staff"

My question is: why were they omitted from SAMM? The only "discussion"
of the topic I could find was this post on the list:

https://lists.owasp.org/pipermail/samm/2010-September/000217.html

I can see the arguments made there, but I don't think they warrant a
total exclusion of the concept of penetration tests from OpenSAMM.

Any opinions?

Fabian Streitel

--
Start using GPG! (http://www.gnupg.org/)


_______________________________________________
SAMM mailing list
SA...@lists.owasp.org
https://lists.owasp.org/mailman/listinfo/samm

Christian Heinrich

Sep 13, 2011, 6:33:42 PM
to fabian....@optimabit.com, Software Assurance Maturity Model (SAMM)
Fabian,

On Mon, Sep 12, 2011 at 5:58 PM, <fabian....@optimabit.com> wrote:
> Basically, the tests as they are described in ST1B are not "penetration
> tests" as they are described by e.g. BSIMM and they do also not match my
> experience working for a security company. Rather, I see the OpenSAMM
> "penetration tests" as simple security tests.
>
> By trying to stay repeatable, they lack an explorative approach to
> vulnerability detection: "Once specified, security test cases can be
> executed by security-savvy quality assurance or development staff"

The subtle difference with "penetration testing" in OpenSAMM is that
it is undertaken before release, within the vendor's environment (i.e.
possibly dev or UAT), is similar to
https://www.owasp.org/index.php/How_to_bootstrap_your_SDLC_with_verification_activities
and is repeatable, e.g. measuring the entropy of session cookies.

What you are referring to as "penetration testing" takes place once
the software has shipped and been installed in the end user's
environment.
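[Editorial note: Christian's example of a repeatable test, measuring the entropy of session cookies, could be sketched roughly as follows. This is an illustrative sketch only; the function names and toy cookie values are hypothetical, and a real assessment would sample thousands of freshly issued cookies from the application under test.]

```python
import math
from collections import Counter

def shannon_entropy_bits(samples):
    """Estimate the Shannon entropy, in bits, of a list of observed values."""
    counts = Counter(samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def entropy_per_position(cookies):
    """For equal-length session-cookie strings, estimate the entropy
    contributed by each character position; constant positions score 0."""
    return [shannon_entropy_bits([c[i] for c in cookies])
            for i in range(len(cookies[0]))]

# Toy illustration: the "AB" prefix is fixed (0 bits per position),
# while the last character is uniform over four values (2 bits).
per_pos = entropy_per_position(["AB0", "AB1", "AB2", "AB3"])
```

Because the same measurement can be re-run against every release, it is the kind of repeatable check ST1B seems to have in mind.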


--
Regards,
Christian Heinrich
http://www.owasp.org/index.php/user:cmlh

Justin Clarke

Sep 14, 2011, 3:32:00 AM
to Software Assurance Maturity Model (SAMM)
Another thing worth noting is that SAMM covers the development
process. Penetration testing as it is traditionally practiced is
external to the development process.

Justin




fabian....@optimabit.com

Sep 14, 2011, 6:27:39 AM
to OpenSAMM
Hello,

In reply to Christian:

> > The subtle difference with "penetration testing" in OpenSAMM is that
> > it is undertaken before release the vendor's environment i.e. maybe
> > dev or UAT and is similar to
> > https://www.owasp.org/index.php/How_to_bootstrap_your_SDLC_with_verification_activities
> > and they are a repeatable e.g. measuring the entropy of session
> > cookies.

Are you sure you posted the right link? It seems to me this document is
specific to design review, although parts of it could also conceivably
be applied to penetration tests.

> >
> > What you are referring to as "penetration testing" is once the
> > software has shipped and installed within the end user's environment.

This is only partly true. Being as close as possible to a production
environment is crucial for penetration testing, yet a test environment
is of course preferable to a live environment, since damage to live
data can always occur.

I see OpenSAMM's "penetration testing" as a form of security testing,
since a) it doesn't cover the problems that arise once software is
deployed in its live environment and b) it does not offer room for
explorative searching for vulnerabilities.

Seeing that BSIMM places a strong emphasis on both and that it is common
practice to conduct such tests (at least in my experience), I find it
strange that OpenSAMM does not mention the concept at all.


In reply to Justin:

On Wed, Sep 14, 2011 at 08:32:00AM +0100, Justin Clarke wrote:
> Another thing worth noting is that SAMM covers the development
> process. Penetration testing as it is traditionally practiced is
> external to the development process.

I disagree, for three reasons:

1. OpenSAMM does indeed have activities that are not directly
related to the development process, especially in the Deployment
business function, e.g. "Identify and deploy relevant operations
protection tools".

2. I don't think excluding operations and deployment is desirable for a model
that addresses security, since they form the basis of a secure
application. This may be less true for some kinds of applications but is
especially true, for example, for web applications.

3. Penetration testing may itself not be part of the development process
in practice (though one could argue that it should be), but its results
are fed back into it, so it has some relevance for it.

Regards,

Pravir Chandra

Sep 14, 2011, 10:10:08 AM
to OpenSAMM
Just to chime in on this thread, I think there's a ton of overloaded terms here. The main one, "penetration testing", can mean a wide variety of things. In my career, I've seen it used (at least) for security testing of software releases, certifications of production deployments, and even "red-team" style activities with specific attacker goals. In SAMM, we're talking about the first of those, and SAMM describes it as follows (per ST1B): "Penetration testing cases should include both application-specific tests to check soundness of business logic as well as common vulnerability tests to check the design and implementation."

Now, in ST1B, we also allow for various roles to be the "owners" of the pen testing activities, but we do say that security auditors should closely monitor it if they aren't the ones conducting the testing themselves. We might be able to make that clearer in the activity descriptions if folks feel it isn't clear as it is.

As for the comparisons to the BSIMM testing activities, I have yet to see a strong argument for a difference between "pen testing" and "security testing" that doesn't simply boil down to who's doing the testing or when the testing is performed. In SAMM, we do not split up practices or activities based on these factors. SAMM aims to capture *what* should be performed and leaves room for organizations to specify (within some reasonable bounds) the who and the when. So, describe to me what the difference is in terms of goals, or, stated differently, what SAMM says organizations should be doing.

Fabian, just some specific thoughts on what you stated: I disagree that production environments are "crucial" to pen testing as you state (based on my clarification on the term "pen testing" above). I also disagree that SAMM's definition does "not offer room for explorative searching for vulnerabilities"... what in SAMM led you to believe that? It seems we may need to clarify the descriptions if folks pick up this sentiment in general, since it's certainly not the intention.

p.

fabian....@optimabit.com

Sep 14, 2011, 12:01:04 PM
to Software Assurance Maturity Model (SAMM)
Thanks for the clarification, Pravir.

On Wed, Sep 14, 2011 at 07:10:08AM -0700, Pravir Chandra wrote:
> Just to chime in on this thread, I think there's a ton of overloaded
> terms here. The main one, "penetration testing", can mean a wide
> variety of things. In my career, I've seen it used (at least) for
> security testing of software releases, certifications of production
> deployments, and even "red-team" style activities with specific

> attacker goals. In SAMM, we're talking about the first of those, [...]


>
> As for the comparisons to the BSIMM testing activities, I have yet
> to see a strong argument for a difference between "pen testing" and
> "security testing" that doesn't simply boil down to who's doing the
> testing or when the testing is performed.

> [...]


> So, describe to me what the difference is in terms of goals, or
> stated differently, what SAMM says organizations should be doing.

My argument is that there are no activities in SAMM that prescribe or
allow explorative testing of an application. Coming mainly from web
applications, I can say that not all vulnerabilities can be checked for
by specifying a security or penetration test case in advance. It is
always hard to fully specify what should happen and impossible to
specify everything that shouldn't happen.

Please correct me if I'm wrong, but SAMM's penetration tests do not
allow for an explorative approach. Each test case is predefined:
"Using the set of security test cases identified for each project,
penetration testing should be conducted to evaluate the system's
performance against each case"
Judging by that, the penetration tests are the security tests
specified in ST1A, which were specified before the test; there is no
additional testing based on exploration of the application.

This is what I see as the difference between pen testing and security
testing: pen testing is explorative, while security testing is
"prescriptive" (for lack of a better word), with all the test cases
known before the test.

> Fabian, just some specific thoughts on what you stated: I disagree
> that production environments are "crucial" to pen testing as you
> state (based on my clarification on the term "pen testing" above). I

Indeed, with your definition of the term, they are not. Yet in my
opinion it is preferable to perform an explorative test on an application
that has been deployed in a situation that mimics the production
environment as closely as possible, since that is an effective way to
catch errors that only surface in the deployed application. It's hard
to test for deployment issues with a security test.

> also disagree that SAMM's definition does "not offer room for
> explorative searching for vulnerabilities"... what in SAMM led you
> to believe that? It seems like we may need to clarify the
> descriptions if folks pick up this sentiment in general since it's
> certainly not the intention.

I hope I made clear above why I think that explorative pen testing is
not allowed within ST1B.

Regards,
Fabian

Pravir Chandra

Sep 14, 2011, 12:40:39 PM
to Software Assurance Maturity Model (SAMM)
I agree on the value of exploratory pen testing and agree with much of what you said, but I'm still not seeing how you arrived at the opinion that SAMM doesn't allow for this. Do you not interpret the sentence from ST1B that I quoted in my previous mail as calling for such testing?

p.

fabian....@optimabit.com

Sep 14, 2011, 1:08:24 PM
to Software Assurance Maturity Model (SAMM)
Pravir,

On Wed, Sep 14, 2011 at 12:40:39PM -0400, Pravir Chandra wrote:
> I agree on the value of exploratory pen testing and agree with much of what you said, but I'm still not seeing how you have the opinion that SAMM doesn't allow for this. Do you not interpret the sentence from ST1B that I quoted in my previous mail as calling for such testing?

Your quote was:

Penetration testing cases should include both application-specific tests to
check soundness of business logic as well as common vulnerability tests to check the
design and implementation.

Which appears after my quote:

Using the set of security test cases identified for each project,
penetration testing should be conducted to evaluate the system's
performance against each case.

That tells me that the security tests should fulfill the properties of
your quote, not that I should conduct explorative tests that fulfill
your quote. Nowhere in ST1B is there any mention of free testing; it's
all limited to the set of security tests from ST1A by that first
sentence.

Maybe we should reword that sentence, or add another that explicitly
states that one should also conduct explorative testing?

Regards,
Fabian

Christian Heinrich

Sep 14, 2011, 6:52:23 PM
to fabian....@optimabit.com, Software Assurance Maturity Model (SAMM)
Fabian,

On Wed, Sep 14, 2011 at 8:27 PM, <fabian....@optimabit.com> wrote:
> Are you sure you posted the right link? It seems to me this document is
> specific to design review. Although parts of it could also conceivably
> be applied to penetration tests.

ASVS has produced a number of diagrams in relation to the SDL and
"penetration testing",
https://www.owasp.org/index.php/How_to_bootstrap_your_SDLC_with_verification_activities
was the first one that I had at hand since it is directly referenced
on https://www.owasp.org/index.php/ASVS

On Wed, Sep 14, 2011 at 8:27 PM, <fabian....@optimabit.com> wrote:
> This is only partly true. Being as close as possible to a production
> environment is crucial for penetration testing, yet a test environment
> is of course preferrable to a live environment, since damage to live
> data can always occur.

This would require the developer to have access to their customers'
live environment, which might not be possible with COTS - in this case
http://www.eweek.com/c/a/Windows/Microsoft-Takes-LSD-to-Test-Vista-Security/
(as an example) would apply as "penetration testing" prior to
shipping.

On Wed, Sep 14, 2011 at 8:27 PM, <fabian....@optimabit.com> wrote:
> I see OpenSAMMs "penetration testing" as a form of security testing,
> since a) it doesn't cover the problems that arise once a software is
> deployed in its live environment and b) they do not offer room for
> explorative searching for vulnerabilities.

OpenSAMM is related to the maturity of the developer only, not the consumer.

Furthermore, the consumer might not be able to reverse engineer the
binaries due to license restrictions, nor would they be provided
(read-only) access to the vendor's SCM to repeat the (security) unit
tests, etc.

On Wed, Sep 14, 2011 at 8:27 PM, <fabian....@optimabit.com> wrote:
> Seeing that BSIMM places a strong emphasis on both and that it is common
> practice to conduct such tests (at least in my experience), I find it
> strange that OpenSAMM does not mention the concept at all.

BSIMM is based on the first (public release) beta of OpenSAMM, and
hence the two were not constructed completely in parallel once their
respective final releases were published.

Pravir Chandra

Sep 14, 2011, 7:12:27 PM
to Software Assurance Maturity Model (SAMM)
I'm cool with that in principle. Want to take a crack at the language and add it to the issue tracker?

p.

fabian....@optimabit.com

Sep 15, 2011, 7:19:56 AM
to OpenSAMM
Pravir,

I have tried so here:
http://code.google.com/p/opensamm/issues/detail?id=18
Any comments are greatly appreciated.

Christian,

> This would require the developer to have access to their customers
> live environment which might not possible with COTS - in this case
> http://www.eweek.com/c/a/Windows/Microsoft-Takes-LSD-to-Test-Vista-Security/
> (as an example) would apply as "penetration testing" prior to
> shipping.

Indeed that is not always possible, but the reverse is also true. SAMM
is not only useful for software vendors but also for big companies that
produce and host their own software in-house. And if that is not the
case, you can still try and simulate the environment as closely as
possible.
I see no problem with the kind of penetration testing as shown in the
link you provided.

> OpenSAMM is related to the maturity of the developer only, not the consumer.

I'm sorry if this was unclear, but my intention is not to have consumers
of software perform penetration tests. It's for the developing company
(which might in some cases also be the consumer) to simulate a consumer
environment as closely as possible and conduct a test there.

> BSIMM is based on the first (public release) beta of OpenSAMM and
> hence they were not constructed completely in parallel once their
> respective final releases were published.

Indeed, and I would not have you think I imply that SAMM should be 100%
equal to BSIMM. Yet it strikes me as odd that BSIMM can spend a whole
practice on the topic while SAMM only mentions it in passing.

Regards,
Fabian

Christian Heinrich

Sep 16, 2011, 1:12:39 AM
to fabian....@optimabit.com, Software Assurance Maturity Model (SAMM)
Fabian,

On Thu, Sep 15, 2011 at 9:19 PM, <fabian....@optimabit.com> wrote:
> Indeed that is not always possible, but the reverse is also true. SAMM
> is not only useful for software vendors but also for big companies that
> produce and host their own software in-house. And if that is not the
> case, you can still try and simulate the environment as closely as
> possible.
> I see no problem with the kind of penetration testing as shown in the
> link you provided.

The intent of OpenSAMM is to be applicable to both COTS and in-house.

Even an in-house development environment does not reflect the
production implementation, and rebuilding the development
implementation to reflect the production environment would not be
accepted as an internal recharge.

The main concern is: are you expecting the developers to replicate
DNS, e-mail, etc. (which is outside their expertise), or that the
application be rejected if the developer is compromised with a
client-side exploit, i.e. something that is not a direct fault of the
application?

On Thu, Sep 15, 2011 at 9:19 PM, <fabian....@optimabit.com> wrote:
> I'm sorry if this was unclear, but my intention is not to have consumers
> of software perform penetration tests. It's for the developing company
> (which might in some cases also be the consumer) to simulate a consumer
> environment as closely as possible and conduct a test there.

This is open to exploitation if a vendor has to hire an external
auditor and they have an agreement to pay a referral fee in the case
of COTS.

A consumer would prefer
http://www.eweek.com/c/a/Windows/Microsoft-Takes-LSD-to-Test-Vista-Security/
(even though its ulterior motive is marketing).

On Thu, Sep 15, 2011 at 9:19 PM, <fabian....@optimabit.com> wrote:
> Indeed, and I would not have you think I imply that SAMM should be 100%
> equal to BSIMM. Yet it strikes me as odd that BSIMM can spend a whole
> practice on the topic while SAMM only mentions it in passing.

BSIMM reflects the implementation of the top ~32 companies globally
for Software Assurance. In my opinion BSIMM is not as low-level as I
would prefer, i.e. fuzzing, SAST, DAST, etc., but this may be due to
Cigital and Fortify being accused of an ulterior motive in selling
services.

The differences between BSIMM and OpenSAMM for this specific Security
Practice have been documented by Pravir and Brian Chess within
http://www.opensamm.org/2011/03/bsimm-activities-mapped-to-samm/

Christian Heinrich

Sep 19, 2011, 6:57:04 PM
to fabian....@optimabit.com, Software Assurance Maturity Model (SAMM)
Fabian,

On Mon, Sep 19, 2011 at 7:53 PM, <fabian....@optimabit.com> wrote:
> You misunderstood. I suggest that a testing environment (or call it
> staging environment if you will) can be used to simulate the production
> environment. To give a concrete example:
>
> Say you develop a web application that you deploy in-house. Then your
> production team (i.e. admins etc.) know the setup of your web-server and
> can replicate that closely on a second server, which we call the testing
> server. On that server, the application can be deployed as it would be in
> the production environment and tests can be conducted on it.

I have never seen a UAT/development build reflect the production
build, unless the UAT environment is transitioned to production and is
then rebuilt (in production) somewhere in the future.

Not all organisations have the spare hardware to support this, or the
technical people to design the specific build (i.e. in addition to the
post-installation) from the SOE. Furthermore, this extends to
virtualised implementations.

On Mon, Sep 19, 2011 at 7:53 PM, <fabian....@optimabit.com> wrote:
> I expect that to be done as a cooperation between the developers and the
> production people in case of an in-house deployment. If we are talking
> about COTS, the developers should at least be able to get their own
> software running on a production-like system, say install it on a PC or
> on a simple server system.

This is not possible for a COTS developer as each SOE (per customer)
is different.

It is only feasible if the in-house developer tests on the SOE and
the associated post-installation procedure for the (generic) web
server.

On Mon, Sep 19, 2011 at 7:53 PM, <fabian....@optimabit.com> wrote:
> The idea is to be as close to production as
> is managable, not to force exact compliance with an ideal production
> environment without looking at costs.

This has a substantial cost associated with it, and the environment
has to be rebuilt after each test.

On Mon, Sep 19, 2011 at 7:53 PM, <fabian....@optimabit.com> wrote:
> To your second point:
> I never asked for a penetration test of the production system. If your
> production environment is vulnerable to say a virus, that is of course
> not in scope for the penetration test. But if your application itself is
> vulnerable, because it reacts differently in the production environment
> than while in development then that is in-scope.

The production environment is always different to the development environment.

The intended outcome is flawed if you don't test the production environment.

ASVS (which I am not endorsing) excludes tests below the application
i.e. Operating System and network.

>> On Thu, Sep 15, 2011 at 9:19 PM,  <fabian....@optimabit.com> wrote:
>> This is open to exploitation if a vendor has to hire an external
>> auditor and they have an agreement to pay a referral fee in the case
>> of COTS.
>>
>> A consumer would prefer
>> http://www.eweek.com/c/a/Windows/Microsoft-Takes-LSD-to-Test-Vista-Security/
>> (even though it's ulterior motive is marketing).
>

> I'm sorry, but I don't understand your point here. Could you elaborate
> that?

As far as the consumer is concerned, if they have hired external
auditors who found some vulnerabilities in the application, then they
have undertaken their duty of care.

fabian....@optimabit.com

Sep 26, 2011, 7:42:45 AM
to OpenSAMM
Christian,

> I have never seen a UAT/development build reflect the production
> build, unless the UAT is transition to production and then it is
> rebuild (in production) somewhere in the future.

I didn't talk about UAT.

> All organisations do not have spare hardware to support this or
> technical people to design the specific build (i.e. in addition to the
> post installation) from the SOE. Furthermore, this extends to
> virtualised implementations.

But some do. Doesn't mean we should exclude it for that reason.

> This is not possible for a COTS developer as each SOE (per customer)
> is different.

But as you said, SAMM is for in-house as well as COTS.

> It is only feasible if the in-house developer tests on the SOE and
> associated the post-installation procedure for the (generic) web
> server.

Indeed, I don't see a problem there.

> This has a substantial cost associated with it and it has to be
> rebuild after each test.

As I said, as close as is reasonable. That doesn't imply 100% compliance
with the production environment, but 80% compliance can be achieved
rather easily. Most of it is needed to get the application running in
the first place anyway.

> The production environment is always different to the development environment.
>
> The intended outcome is flawed if you don't test the production environment.

My point exactly.

> As far as the consumer is concerned if they have hired external
> auditors who found some vulnerabilities in the application then they
> have undertaken their duty of care.

As I said, I don't want consumers to do the penetration testing, it's
the developing side that has to do it.

I think we may have gotten a little side-tracked with our discussion and
it seems a bit one-sided that only the two of us argue about it. Aren't
there any other people who have an opinion?

I'd really like some feedback on my proposed changes to ST1B:
* http://code.google.com/p/opensamm/issues/detail?id=18
* http://code.google.com/p/opensamm/issues/detail?id=8

Regards,
Fabian

--
Start using GPG! (http://www.gnupg.org/)

Christian Heinrich

Sep 26, 2011, 10:35:09 PM
to fabian....@optimabit.com, Software Assurance Maturity Model (SAMM)
Fabian,

On Mon, Sep 26, 2011 at 9:42 PM, <fabian....@optimabit.com> wrote:
> I didn't talk about UAT.

Mature organisations have UAT.

On Mon, Sep 26, 2011 at 9:42 PM, <fabian....@optimabit.com> wrote:
> But some do. Doesn't mean we should exclude it for that reason.

OpenSAMM is intended for all types of developers and not a specific subset.

On Mon, Sep 26, 2011 at 9:42 PM, <fabian....@optimabit.com> wrote:
> But as you said, SAMM is for in-house as well as COTS.

Therefore the "activities" of OpenSAMM have to be applicable to both
COTS and in-house to avoid excluding one over the other.

On Mon, Sep 26, 2011 at 9:42 PM, <fabian....@optimabit.com> wrote:
> Indeed, I don't see a problem there.

The problem is that you are expecting a COTS developer to test against
the nuance(s) of each customer's web server build.

On Mon, Sep 26, 2011 at 9:42 PM, <fabian....@optimabit.com> wrote:
> As I said, as close as is reasonable. That doesn't imply 100% compliance
> with the production environment, but 80% compliance can be achieved
> rather easily. Most stuff is needed to get the application running in
> the first place anyways.

80% is then misleading, given the intent of the metric that you are proposing.

On Mon, Sep 26, 2011 at 9:42 PM, <fabian....@optimabit.com> wrote:
> As I said, I don't want consumers to do the penetration testing, it's
> the developing side that has to do it.

As a customer, you don't trust the vendor, but rather an independent third party.


--
Regards,
Christian Heinrich
http://www.owasp.org/index.php/user:cmlh

fabian....@optimabit.com

Sep 27, 2011, 5:10:07 AM
to OpenSAMM
Christian,

On Tue, Sep 27, 2011 at 12:35:09PM +1000, Christian Heinrich wrote:
> On Mon, Sep 26, 2011 at 9:42 PM, <fabian....@optimabit.com> wrote:
> > But some do. Doesn't mean we should exclude it for that reason.
>
> OpenSAMM is intended for all types of developers and not a specific subset.

And yet it includes activities that do not fit every organization, which
is a good thing! You can't expect OpenSAMM to be the minimal common
ground between all software organizations. That would not make for a
good model, or for very secure software for that matter. You have to
include activities that only fit a subset of the organizations in
order to respect their differences. Deployment may not be an issue for
some developing companies, yet OpenSAMM covers it. Compliance may not
be an issue for some other companies, yet OpenSAMM covers it.
Penetration testing may not be an issue for some companies; that
doesn't mean OpenSAMM shouldn't cover it.

>
> On Mon, Sep 26, 2011 at 9:42 PM, <fabian....@optimabit.com> wrote:
> > But as you said, SAMM is for in-house as well as COTS.
>
> Therefore the "activities" of OpenSAMM have to be applicable to both
> COTS and in-house to avoid excluding one over the other.

Same argument as above.

>
> On Mon, Sep 26, 2011 at 9:42 PM, <fabian....@optimabit.com> wrote:
> > Indeed, I don't see a problem there.
>
> The problem is that you are expecting a COTS developer to test against
> the nuance(s) of each customers' web server build.
>
> On Mon, Sep 26, 2011 at 9:42 PM, <fabian....@optimabit.com> wrote:
> > As I said, as close as is reasonable. That doesn't imply 100% compliance
> > with the production environment, but 80% compliance can be achieved
> > rather easily. Most stuff is needed to get the application running in
> > the first place anyways.
>
> 80% is misleading then based on the intent of the metric that you are proposing.

Alright, point taken. Maybe we should reword these lines then:

This should preferably be done in a test environment that mimics the
production situation as closely as possible. If there is no such environment,
test the system in production, but ensure its safety and integrity.

And instead have something along these lines:

This should preferably be done in a test environment that mimics
the production situation as closely as is possible and reasonable.
As a rule of thumb, aim for roughly 80% compliance with the
production environment.

How about that?

>
> On Mon, Sep 26, 2011 at 9:42 PM, <fabian....@optimabit.com> wrote:
> > As I said, I don't want consumers to do the penetration testing, it's
> > the developing side that has to do it.
>
> As a customer you don't trust the vendor, rather an independent third party.

If a penetration test finds
vulnerabilities, those should probably be fixed. Whether an independent
third party found them or not is not of importance. If your customers
demand that, so be it. We don't have to prescribe anything in that
matter, though, and if you look at my wording in
http://code.google.com/p/opensamm/issues/detail?id=8, I didn't (first
sentence).

Regards,
Fabian

Christian Heinrich

Sep 27, 2011, 5:13:14 AM
to fabian....@optimabit.com, Software Assurance Maturity Model (SAMM)
Fabian,

On Tue, Sep 27, 2011 at 7:10 PM, <fabian....@optimabit.com> wrote:
> And yet it includes activities that do not fit every organization, which
> is a good thing! You can't expect OpenSAMM to be the minimal common
> ground between all software organizations. That would not make for a
> good model, or for very secure software for that matter.
> You have to include activities that only fit a subset of the
> organizations as well in order to respect their differences.
> Deployment may not be an issue for some
> developing companies, yet OpenSAMM covers it. Compliance may not be an
> issue for some other companies, yet OpenSAMM covers it. Penetration
> testing may not be an issue for some companies; doesn't mean OpenSAMM
> shouldn't cover it.

This is the intent of the case studies and of BSIMM, whose next
release, I hear on the grapevine, is RSN.

That stated, the consumer wants a baseline with which to measure the
maturity of vendor x against vendor y.

On Tue, Sep 27, 2011 at 7:10 PM, <fabian....@optimabit.com> wrote:
> Alright, point taken. Maybe we should reword these lines then:
>
>    This should preferably be done in a test environment that mimics the produc-
>    tion situation as closely as possible. If there is no such environment,
>    test the system in production, but ensure its safety and integrity.
>
> And instead have something along those lines:
>
>    This should preferrably be done in a test environment that mimics
>    the production situation as closely as is possible and reasonable.
>    As a rule of thumb, aim for roughly 80% compliance with the
>    production environment.
>
> How about that?

How are you intending to measure 80% conformance?

Reaching an agreement on what the excluded 20% of
http://cisecurity.org/ would be is a political argument that should be
avoided if the intent is raising maturity.

On Tue, Sep 27, 2011 at 7:10 PM, <fabian....@optimabit.com> wrote:
> If a penetration test finds
> vulnerabilities, those should probably be fixed. Whether an independant
> third party found them or not is not of importance. If your customers
> demand that, so be it. We don't have to prescribe anything in that
> matter, though, and if you look at my wording in
> http://code.google.com/p/opensamm/issues/detail?id=8, I didn't (first
> sentence).

You didn't consider that vulnerabilities are easier (i.e. less cost)
to fix prior to release.

--
Regards,
Christian Heinrich
http://www.owasp.org/index.php/user:cmlh

Christian Heinrich

Oct 13, 2011, 12:04:01 AM
to fabian....@optimabit.com, Software Assurance Maturity Model (SAMM)
Fabian,

Two more counterpoints to consider:

1. A WAF will reduce the total number of vulnerabilities confirmed
within the final penetration test.

2. Since account password lockout is implemented in production, there
will be a significant increase in the time required to brute-force or
spider the web application, as a result of the session cookies
expiring and then locking the test accounts.

Christian Heinrich

Oct 18, 2011, 7:00:21 PM
to fabian....@optimabit.com, Software Assurance Maturity Model (SAMM)
Fabian,

Also, consider that the X.509 Server Side Certificate won't be valid
(under test) until the application is moved to the production host.
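[Editorial note: Christian's point, that a certificate issued for the production hostname will not validate on a test host, can be illustrated with a deliberately simplified matcher. This is a hypothetical sketch only; the hostnames are made up, and real X.509 name checking (per RFC 6125) is stricter than shell-style globbing.]

```python
import fnmatch

def hostname_matches(cert_name, hostname):
    """Simplified check of whether a certificate subject name covers a
    hostname. Real X.509 matching (RFC 6125) restricts wildcards to a
    single left-most label; this sketch uses plain glob matching."""
    return fnmatch.fnmatch(hostname.lower(), cert_name.lower())

# A cert issued only for the production host fails on the test host,
# while a wildcard cert would cover both:
prod_only = hostname_matches("www.example.com", "test.example.com")  # False
wildcard  = hostname_matches("*.example.com", "test.example.com")    # True
```

This is one concrete reason a pre-production environment can behave differently from production during a test.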
