
ICSA Certification vs VB Comparative Testing


LHigdon
Apr 3, 2000
Maybe someone can help me understand. Some products which are ICSA
Certified fail VB Comparative Testing, routinely. I don't want to give
examples, for obvious reasons, but there are some well known names.
Supposedly, ICSA Certification is given for products that detect
and/or disinfect 100% of the ITW Viruses. Products that fail to do so
are given 7 days to submit a corrected version that does meet the
tests, or lose certification. Now, perhaps, I misunderstand ICSA
Certification. Can someone help me sort this out? I've been to the
ICSA web-site and am still quite confused.

kurt wismer
Apr 4, 2000

what you've said so far alone should be enough to explain what's going
on... vb100 (assuming you mean vb100, since you can't really 'fail' a
comparative test, it's not a pass/fail test) may well not give developers
the 'second chance' that icsa does...

nick posted some details of a new test criterion as well, which may not be
in practice at icsa and could mean that the icsa test is easier to pass...

--
". . . and i was looking so good, shamoo took a shining to me. and they're
so smart those things, you know, they got all these human emotions. love,
lust, green hundred year old eyed jealousy. barthalamoo - was *livid*.
unbeknownst to me, i can't hear a god damned thing underwater."


Nick FitzGerald
Apr 4, 2000
LHigdon <lhigd...@mindspring.com> wrote:

> Maybe someone can help me understand. Some products which are ICSA
> Certified fail VB Comparative Testing, routinely. ...

I'll try...

> ... I don't want to give
> examples, for obvious reasons, but there are some well known names.

Spoil-sport! 8-)

> Supposedly, ICSA Certification is given for products that detect
> and/or disinfect 100% of the ITW Viruses. Products that fail to do so
> are given 7 days to submit a corrected version that does meet the
> tests, or lose certification. Now, perhaps, I misunderstand ICSA
> Certification. Can someone help me sort this out. I've been to the
> ICSA web-site and am still quite confused.

What you describe is, as I understand it, the ICSA
testing protocol (although I thought it was a tad longer
than seven days for removal from the certified list).

Anyway, as Kurt suggested, the VB tests do not allow
resubmission and re-testing against failures. It's a
case of "you pay your money and you take your chance".
VB tests are free, unlike ICSA's or Secure Computing's,
and some of the test criteria are tougher (like the ItW
test-set will be matched to the latest WLO list --
usually released two weeks before product submission
date for the testing).

Remember the ICSA tests follow a protocol developed in
consultation with representatives of the AV companies
whose products are to be tested and there has to be a
very high level of cross-vendor acceptance of the
protocol. In the past this has meant that ICSA tests
have used WildLists at least two months old (and sometimes
quite a bit older) as the basis of their certifications.
Some of the more worrying flaws of such "tests" have
now been removed, but also note that the high cost of
entry to the AVPD consortium (a pre-requisite for
having your products tested at all) plus the high costs
of actually having your product tested have kept some
of the best products out of ICSA tests for much of
their history because of inability to meet (or at least
justify) the expense. (This is also true of the Secure
Computing tests.)

I'll stop now...


--
Nick FitzGerald

Arthur Kopp
Apr 4, 2000
On Tue, 04 Apr 2000 10:54:49 GMT, "Nick FitzGerald"
<ni...@virus-l.demon.co.uk> wrote:

>Anyway, as Kurt suggested, the VB tests do not allow
>resubmission and re-testing against failures.

It has always struck me that this is the proper way to do it. Why let
lower quality vendors off the hook by allowing them to make
corrections based upon independent tests? I want to know which vendors
can pass realistic tests without having their hands held.

Art


Randy Abrams
Apr 4, 2000
George Wenzel <gwe...@telusplanet.net> wrote in message
news:MPG.1353a60f1...@news.edmonton.telusplanet.net...
> In article <38e925a8...@nntp.mindspring.com>, lhigdon115
> @mindspring.com says...

> >Maybe someone can help me understand. Some products which are ICSA
> >Certified fail VB Comparative Testing, routinely.
>
<snip>

>
> >Products that fail to do so
> >are given 7 days to submit a corrected version that does meet the
> >tests, or lose certification.

Which, of course, results in a better product for the consumer. The "carrot"
of certification forces improvement.

> Sounds right - inferior products can fail the ICSA certification, then
> resubmit their product and have it pass and obtain the certification.
> VB doesn't do that - you submit your product and it does well or it
> doesn't do well. Period.

Superior products can, as well. No one is perfect, but there are trends that
will yield meaningful information. I think the ICSA (if they don't already)
should provide data as to which products require retests. If a product
requires a retest to pass on one occasion that's one thing, if it is a
regular procedure that tells you something else.

> Certifications are simply a baseline measure of a product's quality -
> they say that a given product has a minimum level of detection ability.
> It doesn't say whether a product was well above the minimum level, or
> just barely above it.

True. The example I use is the US Department of Transportation (DOT). The
USDOT "certifies" vehicles for roadworthiness (safety issues). Does that
mean a Mercedes S class, a Rolls-Royce, and a Ford Escort are equivalent
cars? Of course not.

>
> Comparative reviews (such as the ones done by Virus Bulletin) compare
> the relative quality of products, allowing for more useful analysis.

Virus Bulletin does comparative reviews. You have to get the periodical to
see them. The VB100 award is a form of certification. The chart of VB100
historical results is a comparative.

Regards,

Randy
--
The opinions expressed in this message are my own personal views
and do not reflect the official views of the Microsoft Corporation.

Nick FitzGerald
Apr 4, 2000
Arthur Kopp <art...@mindsprung.com> wrote:

> It has always struck me that this is the proper way to do it. Why let
> lower quality vendors off the hook by allowing them to make
> corrections based upon independent tests? I want to know which vendors
> can pass realistic tests without having their hands held.

I, of course, agree.

Remember that the ICSA test protocol is "industry approved"
whereas VB's is "enforced". Some vendors who do not fancy
their chances of looking good (some based on quite a
history of dismal results in VB tests) choose not to submit
product for VB testing (though they may be happy to tip in
their tens of thousands a year for an AVPD membership and
ICSA testing fees).

I know Larry does not like the suggestion that ICSA
certifications are "bought" -- ICSA has de-certified
products and so on -- but the reality is several "high-
profile in the market" products cannot, on a fairly regular
basis, pass tests of ItW detection within two weeks of new
WildLists coming out. Given that many of the "new" viruses
on the WildList have actually been "in the wild" for a
month or more before making a list (read my VB'99 paper for
one example) that means many products "certified" by ICSA
(and Secure Computing/West Coast Labs) are actually quite
inadequate (assuming that you accept the very low standard
of 100% ItW detection as a grade of "adequacy").


--
Nick FitzGerald

Nick FitzGerald
Apr 4, 2000
Randy Abrams <ran...@microsoft.com> wrote:

> > >Products that fail to do so
> > >are given 7 days to submit a corrected version that does meet the
> > >tests, or lose certification.
>
> Which, of course results in a better product for the consumer. The "carrot"
> of certification forces improvement.

True, but doesn't it make you wonder, about those products
that have persistent re-tests, what must pass for QA and
product testing within those companies?

I think we should be careful about confusing "detection
rates" with "product quality". A good product must have a
high detection rate, but a high detection rate does not
mean it is a good product.

> > Sounds right - inferior products can fail the ICSA certification, then
> > resubmit their product and have it pass and obtain the certification.
> > VB doesn't do that - you submit your product and it does well or it
> > doesn't do well. Period.
>
> Superior products can, as well. no one is perfect, but there are trends that
> will yield meaningful information. I think the ICSA (if they don't already)
> should provide data as to which products require retests. If a product
> requires a retest to pass on one occasion that's one thing, if it is a
> regular procedure that tells you something else.

ICSA did have a product submission, testing and re-testing
record on its web site. I've not looked for a while, but
assume it is still there. It was a long overdue step when
they finally introduced it. Such a record, viewed over time,
is very useful in gauging some product development quality
issues.

<<snip good car/DOT analogy>>


> > Comparative reviews (such as the ones done by Virus Bulletin) compare
> > the relative quality of products, allowing for more useful analysis.
>
> Virus Bulletin does comparative reviews. You have to get the periodical to
> see them. The VB100 award is a form of certification. The chart of VB 100
> historical results is a comparative.

Yep. You should not take the fact that a product received a
VB100 award as meaning much. Look at how it has done through
time. Read the full text of several reviews of the product on
the same platform *and* across platforms looking for recurring
themes of product reliability, stability and testing problems.
(Remember that testing a product with thousands of viruses is,
in one sense, not very "realistic" but can uncover problems
that very heavy use may throw up -- like the nightmare
scenario that your whole company becomes heavily infested with
a completely new virus that you will have to detect and
eliminate "after the fact" rather than stop it before it becomes
a problem.)


--
Nick FitzGerald

kurt wismer
Apr 4, 2000
On Tue, 4 Apr 2000, Arthur Kopp wrote:

> On Tue, 04 Apr 2000 10:54:49 GMT, "Nick FitzGerald"
> <ni...@virus-l.demon.co.uk> wrote:
>
> >Anyway, as Kurt suggested, the VB tests do not allow
> >resubmission and re-testing against failures.
>

> It has always struck me that this is the proper way to do it. Why let
> lower quality vendors off the hook by allowing them to make
> corrections based upon independent tests? I want to know which vendors
> can pass realistic tests without having their hands held.

then bookmark http://www.virusbtn.com... and maybe even get a subscription
(if it's worth that much to you)...

Norman Hirsch
Apr 4, 2000
On Mon, 03 Apr 2000 23:20:57 GMT, lhigd...@mindspring.com (LHigdon)
wrote:

>Maybe someone can help me understand. Some products which are ICSA
>Certified fail VB Comparative Testing, routinely. I don't want to give
>examples, for obvious reasons, but there are some well known names.
>Supposedly, ICSA Certification is given for products that detect
>and/or disinfect 100% of the ITW Viruses. Products that fail to do so
>are given 7 days to submit a corrected version that does meet the
>tests, or lose certification. Now, perhaps, I misunderstand ICSA
>Certification. Can someone help me sort this out. I've been to the
>ICSA web-site and am still quite confused.

ICSA is and always was (even when it was NCSA) and will no doubt
continue to be simply a money-making organization despite its name
(which implies some kind of governmental group). The public likes to
see some authoritative sticker on the box to give them a level of
comfort that the product has passed some test. It is simply this
fact that enables companies such as ICSA to charge A/V companies to
"certify" them. Obviously when a company is paying a 5 figure sum
for certification, they want something in return. Hence it is not
surprising to find the relatively independent up-to-date VB tests
different from the "paid" for tests. The paid tests are merely an
advertising cost designed to give an A/V company something to say to
its potential customers and a sticker to put on the box. I give
little credibility to the ICSA or Secure Computing tests. I'd much
rather give credit to forums such as this where real people can
discuss real viruses and how the respective products work.

My hat is off to any and all companies who can get a 100% VB score and
NOT pay ICSA or Secure Computing for their so called "certifications".

Norman Hirsch
nhi...@nha.com

LHigdon
Apr 5, 2000
Thanks for the explanation. I had not considered the cross-vendor
acceptance of the protocol.

On Tue, 04 Apr 2000 10:54:49 GMT, "Nick FitzGerald"
<ni...@virus-l.demon.co.uk> wrote:

>LHigdon <lhigd...@mindspring.com> wrote:
>
>> Maybe someone can help me understand. Some products which are ICSA
>> Certified fail VB Comparative Testing, routinely. ...
>
>I'll try...
>
>> ... I don't want to give
>> examples, for obvious reasons, but there are some well known names.
>
>Spoil-sport! 8-)
>
>> Supposedly, ICSA Certification is given for products that detect
>> and/or disinfect 100% of the ITW Viruses. Products that fail to do so
>> are given 7 days to submit a corrected version that does meet the
>> tests, or lose certification. Now, perhaps, I misunderstand ICSA
>> Certification. Can someone help me sort this out. I've been to the
>> ICSA web-site and am still quite confused.
>
>What you describe is, as I understand it, the ICSA
>testing protocol (although I thought it was a tad longer
>than seven days for removal from the certified list).
>
>Anyway, as Kurt suggested, the VB tests do not allow

Peace


Lee Higdon
Fayetteville, GA. USA
email to: lthi...@mciworld.com
lhigd...@mindspring.com

LHigdon
Apr 5, 2000
Thanks.

On Tue, 04 Apr 2000 14:11:47 GMT, George Wenzel
<gwe...@telusplanet.net> wrote:

>In article <38e925a8...@nntp.mindspring.com>, lhigdon115
>@mindspring.com says...
>
>>Maybe someone can help me understand. Some products which are ICSA
>>Certified fail VB Comparative Testing, routinely.
>
>No surprise here.
>
>>Products that fail to do so
>>are given 7 days to submit a corrected version that does meet the
>>tests, or lose certification.
>
>Sounds right - inferior products can fail the ICSA certification, then
>resubmit their product and have it pass and obtain the certification.
>VB doesn't do that - you submit your product and it does well or it
>doesn't do well. Period.
>
>Certifications are simply a baseline measure of a product's quality -
>they say that a given product has a minimum level of detection ability.
>It doesn't say whether a product was well above the minimum level, or
>just barely above it.
>
>Comparative reviews (such as the ones done by Virus Bulletin) compare
>the relative quality of products, allowing for more useful analysis.
>
>Regards,
>
>George Wenzel
>--
>George Wenzel, B.A. (Criminology) E-Mail: <gwe...@telusplanet.net>
>President & Webmaster, U of A Karate Club - http://www.ualberta.ca/~karate/

Larry Bridwell
Apr 5, 2000
Well, after reading the conversation thread (even the part past this
point) I choose Nick to respond to :-). Actually I have not responded
earlier because I have been unable to get a connection on which I could
respond for the last day or so. For this reason I may well respond to
things above and below this point :-).

Nick FitzGerald wrote:


>
> Randy Abrams <ran...@microsoft.com> wrote:
>
> > > >Products that fail to do so
> > > >are given 7 days to submit a corrected version that does meet the
> > > >tests, or lose certification.
> >

> > Which, of course results in a better product for the consumer. The "carrot"
> > of certification forces improvement.

That was the design of the program from the beginning, to test products
in a "real-world" type testing environment (or at least as close as you
can in a lab) against a publicly vetted set of criteria. Yes,
developers who wish to achieve certification would pay for the testing,
but that is what they pay for, one year of testing. They do not
automatically receive a certification. However, as Randy stated, when
we first tested products (way back on the first ever test in '94 or so)
only about 33% of the products passed the tests. Because of the
pressure of the certification, in about three months all products (that
were submitted) passed. Again in '95 when the standards were raised,
and everyone said it wasn't hard enough... NO ONE passed the first time
through; within a couple of months the market pressure drove them to
improve the products.

>
> True, but doesn't it make you wonder, about those products
> that have persistent re-tests, what must pass for Q&A and
> product testing within those companies?
>
> I think we should be careful about confusing "detection
> rates" with "product quality". A good product must have a
> high detection rate, but a high detection rate does not
> mean it is a good product.

I am not aware of any objective certification program that can "certify"
quality. Let's face it, quality is fairly subjective. Look at the
threads on this list when someone new asks about the "best" av product.
You always get differing answers from some very reputable folk. A "Seal
of Approval" or a "certification" only speaks to some objective fact. I
know quite a few CISSPs (Certified Information Systems Security
Professionals) and CISAs (Certified Information Systems Auditors). All
of them passed some testing (which they paid for) that was to
objectively determine their knowledge and ability. It does not
guarantee that all CISSPs or CISAs are of the same quality. That is
generally determined by references, as AV products are in this forum.

>
> > > Sounds right - inferior products can fail the ICSA certification, then
> > > resubmit their product and have it pass and obtain the certification.
> > > VB doesn't do that - you submit your product and it does well or it
> > > doesn't do well. Period.
> >

> > Superior products can, as well. no one is perfect, but there are trends that
> > will yield meaningful information. I think the ICSA (if they don't already)
> > should provide data as to which products require retests. If a product
> > requires a retest to pass on one occasion that's one thing, if it is a
> > regular procedure that tells you something else.
>
> ICSA did have a product submission, testing and re-testing
> record on its web site. I've not looked for a while, but
> assume it is still there. It was a long overdue step when
> they finally introduced it. Such a record, viewed over time
> is very useful in gauging some product development quality
> issues.

Yes it is. And it is being revamped somewhat for an easier read. It
was long overdue when it happened, but it was the first in any
certification program and it does show all failures (even those not
associated with detection).

>
> <<snip good car/DOT analogy>>


> > > Comparative reviews (such as the ones done by Virus Bulletin) compare
> > > the relative quality of products, allowing for more useful analysis.

At least some usefulness :-). Actually, I would put more value in a VB
review than in most magazine or even academic reviews. At least VB has
contact with the industry and the real world. Most of the other
producers of comparative reviews do not.

> >
> > Virus Bulletin does comparative reviews. You have to get the periodical to
> > see them. The VB100 award is a form of certification. The chart of VB 100
> > historical results is a comparative.

And it does not certify the quality of a product. It only certifies that
for that month and that month alone, the product passed the objective test
criteria. A good thing.

It is a different type of certification with different criteria; that is
all.

>
> Yep. You should not take the fact that a product received a
> VB100 award as meaning much. Look at how it has done through
> time. Read the full text of several reviews of the product on
> the same platform *and* across platforms looking for recurring
> themes of product reliability, stability and testing problems.
> (Remember that testing a product with thousands of viruses is,
> in one sense, not very "realistic" but can uncover problems
> that very heavy use may throw up -- like the nightmare
> scenario that your whole company becomes heavily infested with
> a completely new virus that you will have to detect and
> eliminate "after the fact" rather then stop before it becomes
> a problem.)

Good points, Nick. Choose the products you use if they meet your
needs. Of course if it cannot be certified to protect you from viruses,
detect them if/when they are present on your systems or media, and
provide a means of recovery from the unavoidable, occasional infection,
then you may not want to consider the product any further :-).

Regards,
Larry Bridwell
ICSA Labs

Larry Bridwell
Apr 5, 2000
George Wenzel wrote:
>
> In article <38e925a8...@nntp.mindspring.com>, lhigdon115
> @mindspring.com says...
>
> >Products that fail to do so
> >are given 7 days to submit a corrected version that does meet the
> >tests, or lose certification.
>
> Sounds right - inferior products can fail the ICSA certification, then
> resubmit their product and have it pass and obtain the certification.
> VB doesn't do that - you submit your product and it does well or it
> doesn't do well. Period.

So who are the superior ones? Not one company/developer has been free
from missing a wild virus or otherwise failing an initial test or a spot
test (done a minimum of every 60 days) at some point in time. I repeat:
NOT ONE.

>
> Certifications are simply a baseline measure of a product's quality -
> they say that a given product has a minimum level of detection ability.
> It doesn't say whether a product was well above the minimum level, or
> just barely above it.

Absolutely correct.

>
> Comparative reviews (such as the ones done by Virus Bulletin) compare
> the relative quality of products, allowing for more useful analysis.

Good point. But the posted results and longevity of ICSA's "continuous"
certification can also give some clues as to usefulness and ability to
respond to new threats and even bugs :-). Every product ICSA certifies
is tested at a minimum of every 60 days to see if the criteria it was
tested under have been maintained. I am not sure how often other testing
programs do this.

Larry Bridwell
Apr 5, 2000
Norman Hirsch wrote:
>
> On Mon, 03 Apr 2000 23:20:57 GMT, lhigd...@mindspring.com (LHigdon)
> wrote:
>
<snip original question>

>
> ICSA is and always was (even when it was NCSA) and will no doubt
> continue to be simply a money making organization despite it's name
> (which implies some kind of governmental group).

Yep, and I hope we continue to make a little money (or at least start); I
need a paycheck :-). Of course it helps to do good quality testing when
you have enough money to buy equipment, do independent research, hire
and keep quality personnel, etc. Since 1996 (when I came on board)
ICSA/NCSA has never made any other claim than to be a For Profit
company. My understanding is that has always been the case. Of course
I cannot speak with authority since I was not around then :-).

> The public likes to
> see some authoritative sticker on the box to give them a level of
> comfort that the product has passed some test. It is simply this
> fact that enables companies such as ICSA to charge A/V companies to
> "certify" them.

Again, since 1996, ICSA/NCSA has never charged for certification. It
has a contract with security product developers for a year of testing
individual products against publicly derived and vetted criteria. If
these products eventually pass the test they are granted certification
through the life of the testing contract. The testing is dynamic and
continuous. In the case of AV, that means each product is tested at least
every 60 days (without prior knowledge of the developer) with the latest
PUBLICLY available updates against the latest test suite.

> Obviously when a company is paying a 5 figure sum
> for certification, they want something in return.

And they get it. They get the results of the tests and why they
passed or failed. If they pass they are certified; if not they are
given time to upgrade their product and get a PUBLICLY available patch
to us. If they get it to us and it works they retain certification; if
not they are suspended and ultimately de-certified. Of course I wish
each certification fee was 5 figures :-).

> Hence it is not
> surprising to find the relatively independent up-to-date VB tests
> different from the "paid" for tests. The paid tests are merely an
> advertising cost designed to give an A/V company something to say to
> it's potential customers and a sticker to put on the box.

All tests are paid for by someone for some reason. I do not doubt for a
minute that most companies who purchase testing from ICSA hope to
achieve certification and use marketing dollars to pay for it. However,
to imply that all products pass and are ultimately certified is
presumptuous and in fact false. Also to imply that all developers
pay for testing with marketing dollars is wrong. Several developers use
engineering dollars because it helps them with their QA.

> I give
> little credibility to the ICSA or Secure Computing tests. I much
> rather give credit to forums such as this where real people can
> discuss real viruses and how the respective products work.

I am sorry to hear that you give ICSA little credibility. Of course, I
do not know how you can make such a statement since you have not spoken
to me or anyone in the AV labs here during my tenure. That must mean
you are making such statements at best on little or no factual
experience with ICSA, or at worst with 4-5 year old information. That
would greatly surprise me from someone with your credibility and
knowledge in the security arena. I would hope you would contact me or
someone here at ICSA personally and see what we are about and up to these
days :-).

>
> My hat is off to any and all companies who can get a 100% VB score

I agree!

> and
> NOT pay ICSA

Obviously do not agree :-).

> or Secure Computing

Better not comment ;-).

Best Regards,
Larry Bridwell
ICSA Labs

PS - Drop me a line and lets talk.
lmb

Ian Whalley
Apr 5, 2000
[Larry Bridwell wrote:]

>Again, since 1996, ICSA/NCSA has never charged for certification. It
>has a contract with security product developers for a year of testing
>individual products against a publicly derived and vetted criteria.

Oh, come on :-). This is pure sophistry. ICSA charges for
certification -- the money is merely paid yearly, as opposed
to on a per-certification basis.


>But the posted results and longevity of ICSA's "continuous"
>certification can also give some clues as to usefulness and ability
>to respond to new threats and even bugs :-). Every product ICSA certifies
>is tested at a minimum of every 60 days to see if the criteria it was
>tested under has been maintained.

It's actually quite hard to verify that this 60 day retesting limit
has been maintained over the last few months for each and every
certified product. Looking at the ICSA web site, I see
<http://www.icsa.net/html/communities/antivirus/lab/index.shtml>,
which offers month-by-month reports on which products were tested,
but I can't find how to determine the certification history on a
per-product basis, other than by combing through those pages.

So, to find out more, I went to
<http://www.icsa.net/html/communities/antivirus/certification/> (a
page which features links which attempt to open files on my C: drive,
very impressive, in addition to other broken links elsewhere). I
quote from this page:

Certification Maintenance (CM) testing is done at ICSA’s
discretion, but at least quarterly.

'Quarterly' is not 60 days (365 / 4 == 91 1/4).

In addition, the lab section (first URL above) implies that no
testing has been done since December 1999 (somewhat more than both
60 days and 92 days).
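
Ian's arithmetic here can be sketched out directly. This is a minimal illustration only; the test dates below are invented, standing in for per-product dates combed out of the monthly lab reports, purely to show why a quarterly cycle cannot satisfy an "at least every 60 days" claim:

```python
# Invented dates illustrating the gap Ian points out: quarterly testing
# (~91.25 days apart) cannot satisfy an "at least every 60 days" claim.
from datetime import date

REQUIRED_MAX_GAP = 60          # days, per the "every 60 days" claim
print(365 / 4)                 # 91.25 -- the average quarterly interval

# Hypothetical test dates for one product:
tests = [date(1999, 8, 15), date(1999, 11, 15), date(1999, 12, 15)]
gaps = [(later - earlier).days for earlier, later in zip(tests, tests[1:])]
print(gaps)                    # [92, 30]
print([g for g in gaps if g > REQUIRED_MAX_GAP])   # [92] -- claim broken
```

Any quarterly gap lands above 60 days, so a product tested only quarterly necessarily violates the stated 60-day maximum.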

However, it is true that ICSA has done a lot to improve certain
things in the anti-virus game other than its own balance sheet --
ICSA certification has progressed extremely well, and since it
was revised to be ItW-based in the mid-90s, it has expanded well
to encompass new tests and new criteria. I still (as I always
did) have a problem with the fact that vendors pay to get certified,
but ICSA needs to pay the bills, and the money has to come from
somewhere.

Best;

inw

P.S. icsa.net also features spelling/punctuation errors aplenty,
and is (in general) a delight for proof-readers worldwide:
ON-ACCES
False/Positive
Lotus Note

--
Ian Whalley
<first name> @ <last name> . org

Robert
Apr 5, 2000
In article <38EB4F42...@mindspring.com>, Larry Bridwell
<lbri...@mindspring.com> wrote

>George Wenzel wrote:
>>
>> In article <38e925a8...@nntp.mindspring.com>, lhigdon115
>> @mindspring.com says...
>
>> >Products that fail to do so
>> >are given 7 days to submit a corrected version that does meet the
>> >tests, or lose certification.
>>
>> Sounds right - inferior products can fail the ICSA certification, then
>> resubmit their product and have it pass and obtain the certification.
[snip]

>
>So who are the superior ones?

That's easy - those who fail least often.

> Not one company/developer has been free
>from missing a wild virus or otherwise failing an initial test or a spot
>test (done a minimum of every 60 days) at some point in time.

Which in no way contradicts the fact that products can fail then
resubmit. One suspects that some go this route more than others.
--
Robert
It's not because I am paranoid
That the whole world *isn't* against me

Nick FitzGerald
Apr 6, 2000
Ian Whalley <i...@whalley.org> wrote:

<<snip>>


> P.S. icsa.net also features spelling/punctuation errors aplenty,
> and is (in general) a delight for proof-readers worldwide:
> ON-ACCES
> False/Positive
> Lotus Note

And some people say *I* am anal... 8-)


--
Nick FitzGerald

rod
Apr 6, 2000

"Nick FitzGerald" <ni...@virus-l.demon.co.uk> wrote in message
news:01bf9f7f$274109e0$0500000a@mobilenick...

I read somewhere that over 90% of middle class
Americans see an anal-yst regularly.

The mind boggles!

Norman Hirsch
Apr 6, 2000
On Wed, 05 Apr 2000 11:05:40 -0400, Larry Bridwell
<lbri...@mindspring.com> wrote:


>I am sorry to hear that you give ICSA little credibility. Of course, I
>do not know how you can make such a statement since you have not spoken
>to me or anyone in the AV labs here during my tenure. That must mean
>you are making such statements at best on little or no factual
>experience with ICSA, or at worst with 4-5 year old information. That
>would greatly surprise me from someone with your credibility and
>knowledge in the security arena. I would hope you would contact me or
>some here at ICSA personally and see what we are about and up to these
>days :-).

Ok, Larry, tell me if I'm wrong: The ICSA A/V area has a lot of humpty
dumpty words about your testing but in actuality, you are doing
nothing other than testing the A/V programs against a group of viruses
you have collected as being "in the wild".

Looking at your own web site, the latest test was December 1999, where
everyone passed. In November, there were two viruses that a few missed.
In September and October there were apparently no tests done. In August,
everyone passed. In July, a few missed the same two viruses as above.
In June, there were a few failures by Panda, with no reason or virus given.
May was the same story with Panda. April showed multiple failures on one
virus, and one other virus failure. In March, one product was tested.
In February, one failure with no reason given, and one product passed
despite missing one virus! And in January an eval product failed where
the registered product passed.

So for the whole year, with the exception of the one eval program
which failed a bunch although the registered product passed, there
were only 4 viruses found missed in all the products you tested for
the entire year. And these companies paid how much in total for
that.

Each test could be done in 5 minutes or less by any company if they
had the exact viruses you had. The list of viruses the ICSA
maintains at any point in time is questionable, and its availability to
anyone, especially non-paying ICSA members, is questionable.

You charge 5 figures per year to do this testing and have the A/V
companies by the balls because they want some certification on their
box.

This is a lucrative business so you've spread the model to firewalls,
etc where, from what I've been told, is much less valid and credible
than A/V.

L DeHaan

Apr 6, 2000
On Thu, 6 Apr 2000 15:45:48 +1000, "rod" <r...@st.net.au> wrote:

>

>
>I read somewhere that over 90% of middle class
>Americans see an anal-yst regularly.

"somewhere"?


>The mind boggles!

Some minds boggle very easily.


LDH


Ian Whalley

Apr 6, 2000
Norman Hirsch <NOSPAM...@nha.com> wrote:

>You charge 5 figures per year to do this testing and have the A/V
>companies by the balls because they want some certification on their
>box.

Well, of course, the cynic in me agrees with Norman. However, the
realist asks 'if anyone can do this, why aren't they? If
NCSA/ICSA/ICSA.net (whatever...) has AV companies "by the balls"
(ahem), why can't anyone else charge a bit less for the same thing and
be in a similar position of power?'

The answer is (at least, I _think_ the answer is) that
NCSA/ICSA/ICSA.net have invested time, effort and (yes) money in
promoting their testing as something which AV products should have.
We can argue that this is out of a cynical desire to make money (well
duh! This is capitalism), but we could also point out that it has
helped to increase the overall quality (at least when it comes to base
detection ability) of AV products.

As with everything, NCSA/ICSA/ICSA.net charges 'what the market will
bear' for their certification. There's nothing wrong with this,
provided readers of the certification results remember who's paying
whom.

As someone who is up to his eyeballs in the housing market right now,
I can provide an analogy (anyone surprised?). When looking to buy a
house in the US, you (the buyer) deal with a single real estate agent.
This agent works out what you want in a house (from your descriptions
and your reactions when you see houses!), and tries to find matches
for you.

It's important to remember, however, that it is the seller that pays
the agent. It's also important to remember that your agent is more
likely to show you houses that his/her agency is selling, as that way
they get to keep all of the commission (as opposed to having to split
it between the seller's and the buyer's agency).

This doesn't mean that you should completely rule out buying a house
that your agent is selling. It just means you need to bear in mind
that your agent has a strong incentive to sell you one of those, over
and above other houses.

It's the same with reviews. Just because someone is paying the
reviewer doesn't necessarily mean that the reviewer is going to force
that someone's products down your throat. It just means you have to
pay attention.

Best;

inw

Larry Bridwell

Apr 6, 2000
Robert wrote:
>
> In article <38EB4F42...@mindspring.com>, Larry Bridwell
> <lbri...@mindspring.com> wrote
> >George Wenzel wrote:
> >>
> >> In article <38e925a8...@nntp.mindspring.com>, lhigdon115
> >> @mindspring.com says...
> >
> >> >Products that fail to do so
> >> >are given 7 days to submit a corrected version that does meet the
> >> >tests, or lose certification.
> >>
> >> Sounds right - inferior products can fail the ICSA certification, then
> >> resubmit their product and have it pass and obtain the certification.
> [snip]
> >
> >So who are the superior ones?
>
> That's easy - those who fail least often

That may well be, but the intimation of the statement I replied to
seemed to be that "superior" products would never fail while those that
did were "inferior", when in fact all have failed at some time.

>
> > Not one company/developer has been free
> >from missing a wild virus or otherwise failing an initial test or a spot
> >test (done a minimum of every 60 days) at some point in time.
>
> Which in no way contradicts the fact that products can fail then
> resubmit. One suspects that some go this route more than others

Absolutely correct. I was not attempting such a contradiction. In fact
that is the reason we began posting the reports that show test results
over two years ago :-).

Regards,
Larry Bridwell
ICSA Labs

Larry Bridwell

Apr 6, 2000
Norman Hirsch wrote:
>
> On Wed, 05 Apr 2000 11:05:40 -0400, Larry Bridwell
> <lbri...@mindspring.com> wrote:
>
> >I am sorry to hear that you give ICSA little credibility. Of course, I
> >do not know how you can make such a statement since you have not spoken
> >to me or anyone in the AV labs here during my tenure. That must mean
> >you are making such statements at best on little or no factual
> >experience with ICSA, or at worst with 4-5 year old information. That
> >would greatly surprise me from someone with your credibility and
> >knowledge in the security arena. I would hope you would contact me or
> >some here at ICSA personally and see what we are about and up to these
> >days :-).
>
> Ok, Larry, tell me if I'm wrong: The ICSA A/V area has alot of humpty
> dumpty words about your testing but in actuality, you are doing
> nothing other than testing the A/V programs against a group of viruses
> you have collected as being "in the wild".

I won't say you are wrong :-). We test products against many samples
and variants of all viruses listed as In the Wild by the WildList
Organization (both "On-Demand" and "On-Access"), and we also test the same
products against the many samples and variants in the ICSA virus
collection (many thousands of samples). So I guess from that point you
could say we test against a "group of viruses you have collected as
being in the wild".

>

> Looking at your own web site, the latest test was December, 1999 where
> everyone passed. in November, there weres two viruses that a few
> missed. September and October there were apparently no tests done.
> August, everyone passed. July a few missed the same two viruses as
> above. June, a few failure by Panda no reason or virus given. May
> the same story w/Panda. April showed multiple failures on one virus
> and one other virus failure. March one product tested. Feb one
> failue, no reason and one passed despite missing one virus! and Jan an
> eval product failed where the registered passed.
>
> So for the whole year, with the exception of the one eval program
> which failed a bunch although the registered product passed, there
> were only 4 viruses found missed in all the products you tested for
> the entire year. And these companies paid how much in total for
> that.
>
> Each test could be done in 5 minutes or less by any company if they
> had the exact viruses you had.

Boy would I like to see that. It could save us many hours of testing
:-). On the other hand, I surely would not want to depend on that kind
of test. It actually takes many hours for even one product to be
tested.

Otherwise thanks for pointing out the data on the site, I will find out
why it is not up to date.

> The list of viruses the ICSA
> maintains at any point in time is questionable and it's availablity to
> anyone, especially non paying ICSA members is questionable.

Ah, I thought you may be making decisions with old information. You
seem not to have any idea about ICSA's collection or its sharing
policies. ICSA's collection of viruses is complete and is thoroughly
maintained by our labs. The collection is available in total to NO
ONE. ICSA Labs will exchange or share viruses with known and reputable
virus researchers with whom we have built a trust relationship. I do
understand that the collection may have been shared in total in the
past, but not since the Fall of 1996, I can assure you.

>
> You charge 5 figures per year to do this testing and have the A/V
> companies by the balls because they want some certification on their
> box.
>

> This is a lucrative business so you've spread the model to firewalls,
> etc where, from what I've been told, is much less valid and credible
> than A/V.

Well, I am not sure whether it is lucrative or not, and I am not the person
to talk about the Firewall program as it is not my area to manage, but find
below a public message from a Firewall mailing list. I have the full
context if needed.


From: Marcus J. Ranum [mailto:m...@nfr.net]
Sent: Thursday, February 03, 2000 5:31 AM
To: firewall...@nfr.net
Subject: Re: Firewalls - ITSEC Rating?

>The ITSEC evaluation says that the product met the requirements documented
>in its "Security Target" document.

Right, .........<snip extraneous>.........but the ICSA firewall product
certification is orders of
magnitude more valuable to real customers than ITSEC evaluation.

mjr.


Regards,
Larry

Larry Bridwell

Apr 6, 2000
Ian Whalley wrote:
>
> [Larry Bridwell wrote:]
> >Again, since 1996, ICSA/NCSA has never charged for certification. It
> >has a contract with security product developers for a year of testing
> >individual products against a publicly derived and vetted criteria.
>
> Oh, come on :-). This is pure sophistry.

Sophistry? I never worked for Sophos, you did. Haven't used that word
since my Master's thesis!

> ICSA charges for
> certification -- the money is merely paid yearly, as opposed
> to on a per-certification basis.

Actually it is true. We charge a fee for testing. I guess you could say
for certification testing, but believe it or not some products have paid
for testing, been tested for a full year and never gained
certification. Granted, it is not often in the AV arena, but it really
has happened.

>
> >But the posted results and longevity of ICSA's "continuous"
> >certification can also give some clues as to usefulness and ability
> >to respond to new threats and even bugs :-). Every product ICSA certifies
> >is tested at a minimum of every 60 days to see if the criteria it was
> >tested under has been maintained.
>
> It's actually quite hard to verify that this 60 day retesting limit
> has been maintained over the last few months for each and every
> certified product. Looking at the ICSA web site, I see
> <http://www.icsa.net/html/communities/antivirus/lab/index.shtml>,
> which offers month-by-month reports on which products were tested,
> but I can't find how to determine the certification history on a
> per-product basis, other than by combing through those pages.

Yep found out it isn't up to date and that will be changed. We are
working on this.

>
> So, to find out more, I went to
> <http://www.icsa.net/html/communities/antivirus/certification/> (a
> page which features links which attempt to open files on my C: drive,
> very impressive, in addition to other broken links elsewhere). I
> quote from this page:
>
> Certification Maintenance (CM) testing is done at ICSA’s
> discretion, but at least quarterly.
>
> 'Quarterly' is not 60 days (365 / 4 == 91 1/4).

No, but our discretion has been 60 days. Come on, you have to have an
escape clause. In fact, we sometimes spot-check more regularly.

>
> In addition, the lab section (first URL above) implies that no
> testing has been done since December 1999 (somewhat more than both
> 60 days and 92 days).

Just saw it. Working on getting it up to date.

>
> However, it is true that ICSA has done a lot to improve certain
> things in the anti-virus game other than its own balance sheet --
> ICSA certification has progressed extremely well, and since it
> was revised to be ItW-based in the mid-90s, it has expanded well
> to encompass new tests and new criteria. I still (as I always
> did) have a problem with the fact that vendors pay to get certified,
> but ICSA needs to pay the bills, and the money has to come from
> somewhere.

Thanks :-)

> P.S. icsa.net also features spelling/punctuation errors aplenty,
> and is (in general) a delight for proof-readers worldwide:
> ON-ACCES
> False/Positive
> Lotus Note

Well, we need to keep someone busy don't we!

Thanks again, I'll see that is corrected.

Ring me up sometime and let's chat!

Larry
ICSA Labs

Randy Abrams

Apr 6, 2000

Norman Hirsch <NOSPAM...@nha.com> wrote in message
news:vi4les8bhpag76qea...@4ax.com...
<snip>

> ICSA is and always was (even when it was NCSA) and will no doubt
> continue to be simply a money making organization despite it's name
> (which implies some kind of governmental group). The public likes to

> see some authoritative sticker on the box to give them a level of
> comfort that the product has passed some test. It is simply this
> fact that enables companies such as ICSA to charge A/V companies to
> "certify" them. Obviously when a company is paying a 5 figure sum
> for certification, they want something in return. Hence it is not
<snip>

The ICSA provides significant services beyond the opportunity for these
companies to have their products tested for certification. If you think that
certification is all the money goes for, you really are not very well
informed about what the ICSA offers.

Norman Hirsch

Apr 6, 2000
On Thu, 06 Apr 2000 14:46:32 -0400, Larry Bridwell
<lbri...@mindspring.com> wrote:


>> Each test could be done in 5 minutes or less by any company if they
>> had the exact viruses you had.
>
>Boy would I like to see that. It could save us many hours of testing
>:-). On the other hand, I surely would not want to depend on that kind
>of test. It actually takes many hours for even one product to be
>tested.

Less than 5 minutes is all that is needed per test.

>> The list of viruses the ICSA
>> maintains at any point in time is questionable and it's availablity to
>> anyone, especially non paying ICSA members is questionable.
>
>Ah, I thought you may be making decisions with old information. You
>seem not to have any idea about ICSA's collection or its sharing
>policies. ICSA's collection of viruses is complete and is thoroughly
>maintained by our labs. The collection is available in total to NO
>ONE. ICSA Labs will exhange or share viruses with know and reputable
>virus reasearches that we have built a trust relationship with. I do
>understand that the collection may have been shared in total in the
>past, but not since the FAll of 1996 I can assure you.
>

Here we go again with the "trust relationship". And which companies
do your "trusted" virus researchers work for? And, then which
companies don't have researchers that you "trust"? There is nothing
new here. Just a new twist to maintain control of the viruses you
have so you can continue to keep the companies on the string.

Those "trusted" get their companies easily certified. You find a
new virus in the wild, and then of course the ones that aren't trusted
can't detect it until they get a sample by their own means. And then
you look good to the naive. To me it's still a scam.

The answer is a real international government organization that
collects viruses and makes them available to bona fide A/V
organizations, in their offices at the very least, at no cost and with
no subjective "trust" preferences. A/V companies would be obligated to
provide any new viruses to this organization.

>> This is a lucrative business so you've spread the model to firewalls,
>> etc where, from what I've been told, is much less valid and credible
>> than A/V.
>
>Well, I am not sure it is lucrative or not and I am not the person to
>talk about the Firewall program as it is not my area to manage, but find
>below a public message from a Firwall mailing list. I have the full
>contxt if needed.
>
>
>From: Marcus J. Ranum [mailto:m...@nfr.net]
>Sent: Thursday, February 03, 2000 5:31 AM
>To: firewall...@nfr.net
>Subject: Re: Firewalls - ITSEC Rating?
>
>>The ITSEC evaluation says that the product met the requirements documented
>>in its "Security Target" document.
>
>Right, .........<snip extraneous>.........but the ICSA firewall product
>certification is orders of
>magnitude more valuable to real customers than ITSEC evaluation.
>
>mjr.
>

Marcus Ranum is THE authority on firewalls so I won't second guess
him. However, I'd be willing to bet he can easily show huge security
holes in every ICSA certified firewall product. I would like to see
what he says specifically of the ICSA firewall product certification
tests. Firewall testing does take time as it's vastly more
complicated and there are always new things found to be considered.
This is not like testing an A/V program against a group of known
viruses, which again is a few-minute task.

kurt wismer

Apr 7, 2000
On Fri, 7 Apr 2000, George Wenzel wrote:

> In article <38ec2485@grissom>, r...@st.net.au says...


> >I read somewhere that over 90% of middle class
> >Americans see an anal-yst regularly.
>
>

> WE PRAISE THE COLORECTAL SURGEON
> MISUNDERSTOOD AND MUCH MALIGNED
> SLAVING AWAY IN THE HEART OF DARKNESS
> WORKING WHERE THE SUN DON'T SHINE
>
> Those who have heard of the Canadian comedy group "Bowser and Blue" will
> know what this means. Those who haven't will think I'm an idiot until
> they take a look at www.bowserandblue.com/lyrics.html. After that, they
> might still think I'm an idiot...

not me... i only wonder why "bum darts" and "writing my name in the
snow" aren't mentioned... maybe they're too old...

Norman Hirsch

Apr 7, 2000
On Fri, 07 Apr 2000 01:04:31 GMT, George Wenzel
<gwe...@telusplanet.net> wrote:

>In article <38ECDB88...@mindspring.com>, lbri...@mindspring.com
>says...


>>Ah, I thought you may be making decisions with old information. You
>>seem not to have any idea about ICSA's collection or its sharing
>>policies. ICSA's collection of viruses is complete and is thoroughly
>>maintained by our labs. The collection is available in total to NO
>>ONE. ICSA Labs will exhange or share viruses with know and reputable
>>virus reasearches that we have built a trust relationship with. I do
>>understand that the collection may have been shared in total in the
>>past, but not since the FAll of 1996 I can assure you.
>

>I don't think the point was that you should share your virus collection
>with anybody. If you did that, your certification would be totally
>useless.

I think they should. Isn't the ultimate goal to eliminate viruses?
ICSA's goal is simply to test whether viruses are eliminated, and to make
money doing it. Imagine if some organization held a bunch of human viruses
and tested whether antibiotics stopped them or not, while at the same
time not allowing these companies to have samples of the viruses to be
sure they and others could eliminate them. Is there something wrong
with that analogy? I think not. ICSA performs, at a high price, at best
a small incremental service for companies that don't maintain a good QA
group. The ultimate goal of preventing viruses ASAP is
actually set back by their desire to keep their money flow. If they
did release their viruses to bona fide A/V companies expeditiously,
wouldn't these companies then be able to detect these viruses and
wouldn't viruses be less prevalent? But this is clearly not ICSA's
goal.

>The point is that it seems that you don't announce _which_ viruses you
>have. Why don't you list the CARO-assigned name for the viruses you
>test against? That way, other (independent) testers can replicate your
>tests provided they have the same test suite.
>
Of course they will do whatever it takes to maintain their control of
the viruses so they can continue to keep the A/V companies on a
string. Remember, their goal is to make money under the guise of a
"certification", not to eliminate viruses.

Ian Whalley

Apr 7, 2000
Larry Bridwell <lbri...@mindspring.com> wrote:
>>>Again, since 1996, ICSA/NCSA has never charged for certification. It
>>>has a contract with security product developers for a year of testing
>>>individual products against a publicly derived and vetted criteria.
>>Oh, come on :-). This is pure sophistry.
>Sophistry? I never worked for Sophos, you did. Haven't used that word
>since my Master's thesis!

Sophistry is a good word!


>>ICSA charges for
>>certification -- the money is merely paid yearly, as opposed
>>to on a per-certification basis.
>Actually it is true. We charge a fee for testing. I guess you could say
>for certification testing, but believe it or not some products have paid
>for testing, been tested for a full year and never gained
>certification. Granted, it is not often in the AV arena, but it really
>has happened

Larry, Larry. There's nothing to be ashamed about. Charging for
testing is a reasonable thing to do :-).


>>P.S. icsa.net also features spelling/punctuation errors aplenty,
>> and is (in general) a delight for proof-readers worldwide:

>Well, we need to keep someone busy don't we!

That someone would be me :-).


>Ring me up sometime and let's chat!

Best;

inw [trying to find Larry's phone number]

Randy Abrams

Apr 7, 2000
Norman Hirsch <NOSPAM...@nha.com> wrote in message
news:4ggres085vcd4rsga...@4ax.com...

> On Fri, 07 Apr 2000 01:04:31 GMT, George Wenzel
> <gwe...@telusplanet.net> wrote:
>
> >In article <38ECDB88...@mindspring.com>, lbri...@mindspring.com
> >says...
> >>Ah, I thought you may be making decisions with old information. You
> >>seem not to have any idea about ICSA's collection or its sharing
> >>policies. ICSA's collection of viruses is complete and is thoroughly
<snip>

> >I don't think the point was that you should share your virus collection
> >with anybody. If you did that, your certification would be totally
> >useless.
>
> I think they should. Isn't the ultimate goal to eliminate viruses.
> ICSA's is simply to test whether viruses are eliminated and make money
> doing it?

No, ICSA's goal is patently obviously not just to simply test viruses and
make money at it. The ICSA also plays an exceptionally important role in
obtaining information that greatly improves the quality of anti-virus
products throughout the world. ICSA also provides training and prevalence
information. You are not very well informed about what the ICSA does with
regard to anti-virus.

> Imagine if some organization held a bunch of human viruses
> and tested whether anti-biotics stopped them or not while at the same
> time not allowing these companies to have samples of the viruses to be
> sure they and others could eliminate them. Is there something wrong
> with that analogy? I think not.

Of course there's something wrong with that analogy. It starts with the
comparison of computer viruses to biological viruses.

> ICSA performs at best a small
> incremental service for companies that don't maintain a good QA group
> at a high price.

You clearly speak from a position of exceptional ignorance.

> The ultimate goal of preventing viruses ASAP is
> actually set back by their desire to keep their money flow.

The impact the ICSA has had upon the improvement of anti-virus products
proves that your statement is not founded in reality.

> If they did release their viruses to bona fide A/V companies expeditiously,
> wouldn't these companies then be able to detect these viruses and
> wouldn't viruses be less prevalent?

Can you show me an anti-virus company that does not have the viruses on the
WildList, or at least have access to them? If you look at the Virus Bulletin
reports you will see that products often fail to detect these. That pretty
much blows out of the water your theory that the products would detect the
viruses if the ICSA simply handed them to the companies.

> But this is clearly not ICSA's goal.

You have been shown wrong several times. Would you care to come clean about
your hidden agenda? Quit beating around the bush and say what's really
eating you.

>
> >The point is that it seems that you don't announce _which_ viruses you
> >have. Why don't you list the CARO-assigned name for the viruses you
> >test against? That way, other (independent) testers can replicate your
> >tests provided they have the same test suite.

If the vendors can't handle the wildlist viruses, that's really not the
fault of the ICSA. If the vendors can't handle 90% of the ICSA's zoo, then
as a customer I think I should know that the company does not really have
the resources to adequately protect my equipment. Respectable researchers
tend to have little problem acquiring samples.

> Of course they will do whatever it takes to maintain their control of
> the viruses so they can continue to keep the A/V companies on a
> string.

Have you tried aluminum foil? It might block the rays. :)

> Remember, their goal is to make money under the guise of a
> "certification", not to eliminate viruses.

Proof? What I've seen is the ICSA playing a key role in enhancing the
quality of anti-virus products and you slinging mud from behind a wall
that's too high for you to see what's going on.

Larry Bridwell

Apr 7, 2000
George Wenzel wrote:
>
> In article <38ECD5EC...@mindspring.com>, lbri...@mindspring.com
> says...
<snip>

> No, that was not what I was getting at. What I was getting at was that
> superior products should never fail, or should fail very rarely. Do you
> consider a "fail" to be a product that fails, resubmits, and passes, or
> just products that fail after resubmission?

I certainly would agree that top quality products should have few
problems, and in my PERSONAL (emphasis, not shouting) opinion, should
never need more than the second test (it is, after all, an open-book test
after the first try :-). The word "failure" is a little vague, actually,
and we have been trying to come up with better terms for the test
reports. If a product misses a wild virus (in any form), or crashes on
load, or fails to log properly, or misses a ton of zoo viruses, or
otherwise does something that causes it to fail a spot test, the 7-day
process is begun and the developer begins the "fix". So technically,
failure is lack of a fix throughout the grace period.

> >Absolutely correct. I was not attempting such a contradiction. In fact
> >that is the reason we began posting the reports that show test results
> >over tow years ago :-).
>

> This is certainly an excellent advance for your certification scheme,
> and should have been done from the start.

Yep. You are right, but some things take time. Hopefully we can
continue to move forward with even better testing and reporting.

Everyone can feel free to keep making suggestions and even contacting me
personally. We can't do everything at once, but I will keep pluggin' away!

Regards,
Larry

Larry Bridwell

Apr 7, 2000
Ian Whalley wrote:
>
> Larry Bridwell <lbri...@mindspring.com> wrote:
> >>>Again, since 1996, ICSA/NCSA has never charged for certification. It
> >>>has a contract with security product developers for a year of testing
> >>>individual products against a publicly derived and vetted criteria.
> >>Oh, come on :-). This is pure sophistry.
> >Sophistry? I never worked for Sophos, you did. Haven't used that word
> >since my Master's thesis!
>
> Sophistry is a good word!

I know! It just seems I have been buried in more technical reading
these days and haven't kept up with the more enjoyable reading that
exposes one to a better vocabulary ;-).


>
> >>ICSA charges for
> >>certification -- the money is merely paid yearly, as opposed
> >>to on a per-certification basis.
> >Actually it is true. We charge a fee for testing. I guess you could say
> >for certification testing, but believe it or not some products have paid
> >for testing, been tested for a full year and never gained
> >certification. Granted, it is not often in the AV arena, but it really
> >has happened
>
> Larry, Larry. There's nothing to be be ashamed about. Charging for
> testing is a reasonable thing to do :-).

Not ashamed, just like to get the last word!

>
> >>P.S. icsa.net also features spelling/punctuation errors aplenty,
> >> and is (in general) a delight for proof-readers worldwide:
> >Well, we need to keep someone busy don't we!
>
> That someone would be me :-).
>
> >Ring me up sometime and let's chat!
>
> Best;
>
> inw [trying to find Larry's phone number]

I'll e-mail it to you.

Larry

Larry Bridwell

Apr 7, 2000
to NOSPAM...@nha.com
Norman Hirsch wrote:
>
> On Thu, 06 Apr 2000 14:46:32 -0400, Larry Bridwell
> <lbri...@mindspring.com> wrote:
>

> less than 5 minutes is all that is needed/test.

I don't agree, but hey why argue over that :-).

>
<snip>

> Here we go again with the "trust relationship". And which companies
> do your "trusted" virus researchers work for? And, then which
> companies don't have researchers that you "trust"? There is nothing
> new here. Just a new twist to maintain control of the viruses you

> have so you can continue to keep the companies on the string.

1. Which companies - there are quite a few actually. Many are part of
our consortium and some are not. I think there may be a couple who do
not even work for AV companies but who are well respected in the
anti-virus arena and have proven themselves trustworthy to the AV
community. Some may even post to this list.

>
> Those "trusted" get their companies easily certified.

See above. ICSA is not a criterion for a trusted relationship.

> You find a
> new virus in the wild an then of course the ones that aren't trusted
> can't detect it until they get a sample by their own means. And then
> you look good to the naive.

Again, you are simply wrong. I am not trying to be belligerent or a
source of aggravation to you, but you simply are making statements which
are not true. ICSA Labs has individuals who report to the WildList and
other lists reserved for AV researchers, and who exchange with other
researchers regardless of whether they have any business relationship
with ICSA.

> To me it's still a scam.

That is certainly your right, to have your own opinion and to express
it. However, to make statements that have no basis in fact as if they
were absolutely true is another issue. You are certainly welcome to
contact me or even visit our labs and discuss these issues. If you have
suggestions on how things could be better, let's work on them together.
I am willing. Just please don't make blanket accusations or assertions
unless you check out the current facts first ;-).

>
> The answer is a real international government organization that
> collects viruses and makes them available to bonafide A/V
> organizations in their offices at the very least at no cost with no
> subjective "trust" preferences. A/V companies would be obligated to
> provide any new viruses to this organization.

This seems reasonable. Therefore, I doubt it will happen :-).
Governments working together? :-)

Of course I am not sure how you would determine a "bona fide" A/V
organization and not have a subjective trust issue. Not saying it could
not be done, just I am not sure how to do it.

I do not think anyone would argue with this. Perfectly good security
products are installed in "perfectly" horrid manners every day. Even AV
products. No product certification (that I know of) certifies the
product to be idiot-proof. Even products certified by government
organizations like ITSEC can be misconfigured and have security
problems.

Well, I am enjoying the exchange, no really, I am.

But I must be finishing another couple of replies and I will not be
checking newsgroups until Monday. I am spending the weekend with two of
the most beautiful, intelligent and well mannered young men to be born.
My two wonderful grandsons, Ryan and Gavin. Now that description may
be just a tad prejudiced, but it is a fact none the less :-).

Regards,
Larry

Larry Bridwell

Apr 7, 2000
Norman Hirsch wrote:
>
> On Fri, 07 Apr 2000 01:04:31 GMT, George Wenzel
> <gwe...@telusplanet.net> wrote:
>
> >In article <38ECDB88...@mindspring.com>, lbri...@mindspring.com
> >says...

<snip>

> >I don't think the point was that you should share your virus collection
> >with anybody. If you did that, your certification would be totally
> >useless.
>
> I think they should. Isn't the ultimate goal to eliminate viruses?
> ICSA's is simply to test whether viruses are eliminated and make money
> doing it? Imagine if some organization held a bunch of human viruses
> and tested whether anti-biotics stopped them or not while at the same
> time not allowing these companies to have samples of the viruses to be
> sure they and others could eliminate them. Is there something wrong
> with that analogy?

Yep. There is. Do you honestly think that putting all the computer
viruses known to man (not just the ones we have) in a large collection
and giving them to every AV company, AV researcher, University research
lab, etc. would actually eliminate viruses? Come on! Every AV company
in the world has had hundreds of copies of Happy99 for over a year and it
is still running around the globe, every week!

Besides, there are so many new viruses written weekly and variants found
weekly, no one could keep up.

Furthermore, ICSA used to allow all consortium members access to the
collection and folk like yourself yelled at us for that! It seems we
get less yelling this way :-).

> I think not.

Your privilege.

> ICSA performs at best a small
> incremental service for companies that don't maintain a good QA group
> at a high price.

A real quick list of those who certify products would seem to belie the
statement about companies who don't maintain good QA. Unless of course,
you place some of the largest and most well respected companies in the
US, Europe, and the Pac Rim in that arena.

> The ultimate goal of preventing viruses ASAP is
> actually set back by their desire to keep their money flow. If they
> did release their viruses to bona fide A/V companies expeditiously,
> wouldn't these companies then be able to detect these viruses and
> wouldn't viruses be less prevalent?

In a word, NO.

> But this is clearly not ICSA's
> goal.
>

> >The point is that it seems that you don't announce _which_ viruses you
> >have. Why don't you list the CARO-assigned name for the viruses you
> >test against? That way, other (independent) testers can replicate your
> >tests provided they have the same test suite.

Good point, I will see that we begin work on possibly doing that.

> >
> Of course they will do whatever it takes to maintain their control of
> the viruses

We sure will! :-)

> so they can continue to keep the A/V companies on a
> string.

but not for this reason :-(

>Remember, their goal is to make money

Yep.

> under the guise of a
> "certification",

Not under the guise of, but actually doing, certification testing.

> not to eliminate viruses.

And that too!

Norman Hirsch

Apr 7, 2000

Have fun with them, Larry. Nothing is better than that!

Best regards,

Norman Hirsch

Nick FitzGerald

Apr 8, 2000
Norman Hirsch <NOSPAM...@nha.com> wrote:

> >>Ah, I thought you may be making decisions with old information. You
> >>seem not to have any idea about ICSA's collection or its sharing
> >>policies. ICSA's collection of viruses is complete and is thoroughly
> >>maintained by our labs. The collection is available in total to NO
> >>ONE. ICSA Labs will exchange or share viruses with known and reputable
> >>virus researchers that we have built a trust relationship with. I do
> >>understand that the collection may have been shared in total in the
> >>past, but not since the Fall of 1996 I can assure you.
> >

> >I don't think the point was that you should share your virus collection
> >with anybody. If you did that, your certification would be totally
> >useless.
>
> I think they should. Isn't the ultimate goal to eliminate viruses?

Yes, it is.

Norman, however, seems a little confused about the ICSA's
testing methodology *and* a critical issue of testing
procedure.

The ICSA (and others) tests scanners against virus samples.
You see, by definition, viruses replicate and most of them
do so in a parasitic way. This means that independent of
issues such as encrypted and polymorphic code, most viruses
"look different" in each possible sample because they have
attached themselves to different hosts.

The ICSA, like all testing bodies, does not really test a
product's ability to detect a given virus -- it tests the
product's ability to detect a given virus replicated under
certain (perhaps unknown) conditions onto certain host
files. Hopefully the range of host files and conditions
used in the generation of the samples included in its
test-sets represents a suitably broad range of variables
that affect each virus in the test set. To the extent
that the sample generation conditions vary to match the
variables affecting how a virus "looks" in "real world"
infection scenarios, the samples should make a good,
(dare I say "representative") test-set for that virus.

Norman seems to be saying that the ICSA should share its
samples with the developer community. This raises a more
than slight problem with testing fairness.

Imagine the ICSA (or anyone else) were testing JamScan and
SpamScan for certification. To simplify things, let's also
imagine that there are just two viruses in the test-set --
VirusX and VirusY. Further, there is only one sample of
each virus. (These are all simplifications to keep the
example easily trackable in your head -- true they do
exaggerate the effects of a single sample miss.)

On first testing, the ICSA gets this result (if you are not
viewing with a monospaced font, tough!):

            Samples detected
            VirusX   VirusY
JamScan        1        0
SpamScan       0        1

Clearly each product has a 50% hit rate.
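
For those who like the arithmetic spelled out, here is a minimal
sketch -- the product names, viruses and counts are purely the
hypothetical ones from the example above:

```python
# Hypothetical results matrix from the example above:
# 1 = sample detected, 0 = sample missed.
results = {
    "JamScan":  {"VirusX": 1, "VirusY": 0},
    "SpamScan": {"VirusX": 0, "VirusY": 1},
}
total_samples = 2  # one sample of each virus in this toy test-set

for product, hits in results.items():
    rate = 100.0 * sum(hits.values()) / total_samples
    print(f"{product}: {rate:.0f}% of samples detected")
```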

What should ICSA do now?

Well, resubmission or update submission are allowable, so
ICSA contacts the developers with the appropriate news about
which virus(es) their product missed. Unless things have
changed since I last checked, the ICSA also sends a sample
of the missed virus. This is where a major "problem" can
arise -- does the ICSA send *the* missed sample, or just
"some" sample of the same virus from when the ICSA lab staff
replicated it?

I am not entirely clear of the ICSA's protocols here. I
believe that in the (distant?) past the ICSA sent the actual
missed sample. Therein lies one potential problem -- what
should be done about that sample when it comes to re-testing
the product when the update is received? A devious
developer (and aren't they all?? 8-) ) could add a quick-
and-dirty check for the specific file that is the "problem"
sample, assuring their product of full detection when the
update is re-tested...
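
A toy sketch of what such a quick-and-dirty "fix" amounts to, versus a
proper detection algorithm (the file contents, the hash check and the
trivial "VIRUS" marker are all invented for illustration -- no real
scanner or virus is anywhere near this simple):

```python
import hashlib

# The one specific file the product missed in the test (hypothetical).
PROBLEM_SAMPLE = b"host-code..." + b"VIRUS"
PROBLEM_HASH = hashlib.sha256(PROBLEM_SAMPLE).hexdigest()

def detects_virus(data: bytes) -> bool:
    # A "proper" (toy) detection algorithm: matches every replicant.
    return b"VIRUS" in data

def detects_problem_file(data: bytes) -> bool:
    # The devious quick-and-dirty fix: matches only the exact test file.
    return hashlib.sha256(data).hexdigest() == PROBLEM_HASH

# The hash check "passes" the re-test on the original sample...
assert detects_problem_file(PROBLEM_SAMPLE)

# ...but misses every other replicant of the same virus, which the
# algorithmic fix would have caught.
other_replicant = b"different-host..." + b"VIRUS"
assert not detects_problem_file(other_replicant)
assert detects_virus(other_replicant)
```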

One school of thought has it that the sample in the test-set
should be replaced with a new sample, preferably generated
under similar conditions and infecting a similar host (if
these are thought/known to affect the way the sample
"looks"). In the case above, all two samples would be
replaced before re-testing both products. But what if the
initial results had been:

            Samples detected
            VirusX   VirusY
JamScan        1        0
SpamScan       1        1

SpamScan would not be re-tested as it would have had its
certification renewed already. Is that a problem, though?

Of course! Say JamScan does re-test with 100% detection
against a test-set that has had the VirusY sample replaced
because the original sample was sent to JamScan's developers.
The certification of the two products is not comparable.
Assume this second scenario occurred and that the ICSA lead
product tester decided (at random -- let's assume s/he works
completely independently of the testers who do product
certifications) to spot-test those two products the day after
JamScan was re-tested and re-certified. It is possible the
lead tester could get these results:

            Samples detected
            VirusX   VirusY
JamScan        1        1
SpamScan       1        0

That doesn't seem "right" to me. I mean, it is good that the
lead tester found this, but the fact that it would only be
found by "chance" and only in some cases is perturbing. Of
course, some would say that SpamScan will be caught out when
its mandatory 90 day re-testing is done. That is true
(assuming the VirusY sample is not replaced again because of
some other product's failure to detect the "new" sample of it)
and it then becomes an open question as to what will happen.

But, imagine that VirusY has just become very common and
widespread in the wild. 90 days is a very long time for users
of SpamScan to only have partial protection from this virus...

So, what should happen instead of the developers being given
copies of the actual missed samples? They should be given
samples of the same virus replicated in a similar manner (and
suitably tested for recursive replication, just as all samples
in a virus test-set should). That sample acts as a "proof of
identity", as we all know that saying "JamScan missed samples
of VirusY" may be meaningless to its developers if "VirusY" is
the name that SpamScan gives the virus...

This has the advantage of allowing the samples in the test-
sets to remain static. *Between* tests, samples of VirusY can
be changed, but for all tests at a given date the consumer of
the results knows they are comparing apples on the same
"ripeness" scale. The main disadvantage of this approach is
that the testing organization staff has to be sufficiently
"respected" by the AV developers that they can "trust" the
non-tested sample they receive really is a sample of the same
virus their product is said to have missed a sample of.
Samples need not be changed between tests, but they can be --
the important thing is that they remain the same "within" a
test (and to me, that should include re-tests).

My view: Should ICSA give the actual missed test samples to
the developers -- no (at least, not until that round of
(re-)testing is over, at which point they may, if they intend
to replace the samples for the next round). It is documented in
the past that products have been modified to detect specific
files because the product could not be readily modified in
time to "properly" detect all samples of a virus, *but* the
developer had access to the actual test samples. If you
believe that cannot happen again, I have this really nice
block of land on Mars...

My view: Should ICSA give *some* sample of missed virus(es)
to the developers of products it certifies -- yes (but not an
actual test-set sample given the caveats above). The reasons
are manifold, but it allows the developers of the deemed
"faulty" scanners to confirm that they and the ICSA are
talking about the same virus. Further, at least over time,
it can allow the developers to affirm the "quality" of the
lab work at ICSA and it does allow any suspected problems in
these areas to be aired between the developer and the lab.
My suspicion is that this is probably the current
practice for certification tests (at least for "in the wild"
viruses).

So, to go back to the beginning of this message, Norman's
claim the ICSA does not "share" its test-set is, I believe,
wrong so long as you understand the significant difference
between "a virus" and "a sample of a virus". If Norman
actually meant that ICSA should share its test-set samples
with the developers of the products ICSA certifies, I
strongly disagree and several detailed discussions of good
testing practice argue significantly against this, as does
some rather shameful testing history.

> ICSA's is simply to test whether viruses are eliminated and make money
> doing it? Imagine if some organization held a bunch of human viruses
> and tested whether anti-biotics stopped them or not while at the same
> time not allowing these companies to have samples of the viruses to be
> sure they and others could eliminate them. Is there something wrong
> with that analogy? I think not. ...

Sorry, but it is a very poor analogy (and note here that
Norman does seem to be advocating the sharing of actual test
samples). Bio-virs "mutate" in the deeply meaningful sense
of the word, whereas comp-virs only mutate in the weak sense
of what is allowed by their internal algorithm(s) driving the
mutation. The latter can be programmatically detected and it
is the role of the AV researchers and developers to ensure
their understanding of the full extent of such "mutation" is
detected by their product for each implementation they claim
to detect. Thus, AV developers worth their keep should be
able to produce reliable detection of polymorphic viruses
from code (and other) analysis derived from a single sample,
whereas researchers developing antiviral agents to protect a
human (or other biological) community generally need samples
of the known strains. (Here the analogy between "different
bio-virs" and "different comp-virs" tends to be at the variant
level, not the sample level anyway...)

When I was at VB, a very well-respected scanner persistently
missed one sample of an old polymorphic virus. In its day,
that virus had been quite challenging, but its day was long
past (though another product missed several samples of the
same virus and its developers did not seem at all interested
in "fixing" that). One of the developers of the product in
question Emailed and rang me, saying that they had re-
analysed the virus ad nauseum and had just generated a few
tens of thousands of replicants, all of which their scanner
detected except for a few corrupted replicants (the virus
had a bug such that it corrupted a trifling percentage of
COM files it infected). They were sure from their re-
checking of everything that their scanner should detect all
replicable samples of this virus. I went into the lab,
replicated the missed sample and sent some of its replicants
(to several generations) to the developer -- they detected
them all and all replicants they made from them (except for
the very occasional corrupted COM infection)... I was
pressured to send them the test-set sample. This went on
for a few days until I received a phone call late one
afternoon. They had generated over 500,000 samples of this
virus, and after dismissing all the corrupted COMs their
scanner "missed" (for those still reading, such "misses" are
quite "acceptable" as the virus cannot spread from such a
file so it cannot be considered a "sample" of that virus)
they had two files left, which happily replicated into
samples they did detect. In tracking why this happened,
they discovered "a silly bug" in their code emulator which
meant under peculiar combinations of conditions (which were
present in their definition of this virus) some part of the
emulator would run for one loop fewer than the definition
said it should. In a very, very tiny proportion of samples
of this virus, this bug meant the virus code was not fully
decrypted when the emulator stopped and thus such samples
were missed. It's still a bit of a mystery how VB got one
such sample in the 500 samples of that virus in its test-set
and why it took the developer concerned half a million
samples to generate two such examples...
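
The shape of that bug is easy to sketch in miniature (the single-byte
XOR "virus", the signature and the loop counts here are all invented;
the real emulator and virus were vastly more complex):

```python
# Toy reconstruction of an "emulator runs one loop fewer" bug.
SIGNATURE = b"VIRUS_BODY"  # plaintext body the scanner looks for
KEY = 0x5A

def make_sample(body: bytes) -> bytes:
    # A "polymorphic" sample: body stored XOR-encrypted.
    return bytes(b ^ KEY for b in body)

def emulate(sample: bytes, loops: int) -> bytes:
    # Decrypt `loops` bytes, as a scanner's code emulator would.
    out = bytearray(sample)
    for i in range(loops):
        out[i] ^= KEY
    return bytes(out)

sample = make_sample(SIGNATURE)

# Correct emulator: full decryption loop, signature found.
assert SIGNATURE in emulate(sample, len(sample))

# Buggy emulator: one loop fewer, so the last byte stays encrypted and
# the signature scan misses this otherwise-detectable sample.
assert SIGNATURE not in emulate(sample, len(sample) - 1)
```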


--
Nick FitzGerald

Ian Whalley

Apr 8, 2000
[Larry replied to me:]

>>Larry, Larry. There's nothing to be ashamed about. Charging for
>>testing is a reasonable thing to do :-).
>Not ashamed, just like to get the last word!

In which case, this could go on for a while.


>>inw [trying to find Larry's phone number]
>I'll e-mail it to you.

Probably best. I will place it once more into the gaping maw of my
filing system (it's a bit like the Sarlacc in 'Return of the Jedi'),
from which it will never again surface. Just like the last N times
someone gave me your phone number...

Best;

inw

--
Ian Whalley

Norman Hirsch

Apr 9, 2000
On Sat, 08 Apr 2000 11:32:52 GMT, "Nick FitzGerald"
<ni...@virus-l.demon.co.uk> wrote:

>Norman Hirsch <NOSPAM...@nha.com> wrote:
>
>> I think they should. Isn't the ultimate goal to eliminate viruses?
>
>Yes, it is.
>

>Norman seems to be saying that the ICSA should share its
>samples with the developer community. This raises a more
>than slight problem with testing fairness.

I am and "testing fairness" should be secondary to the goal of
eliminating viruses.

>Imagine the ICSA (or anyone else) were testing JamScan and
>SpamScan for certification. To simplify things, let's also
>imagine that there are just two viruses in the test-set --
>VirusX and VirusY. Further, there is only one sample of
>each virus. (These are all simplifications to keep the
>example easily trackable in your head -- true they do
>exaggerate the effects of a single sample miss.)
>
>On first testing, the ICSA gets this result (if you are not
>viewing with a monospaced font, tough!):
>
>             Samples detected
>             VirusX   VirusY
> JamScan        1        0
> SpamScan       0        1
>
>Clearly each product has a 50% hit rate.
>
>What should ICSA do now?

Provide both samples to all bonafide A/V companies

>Well, resubmission or update submission are allowable, so
>ICSA contacts the developers with the appropriate news about
>which virus(es) their product missed. Unless things have
>changed since I last checked, the ICSA also sends a sample
>of the missed virus. This is where a major "problem" can
>arise -- does the ICSA send *the* missed sample, or just
>"some" sample of the same virus from when the ICSA lab staff
>replicated it?
>
>I am not entirely clear of the ICSA's protocols here. I
>believe that in the (distant?) past the ICSA sent the actual
>missed sample. Therein lies one potential problem -- what
>should be done about that sample when it comes to re-testing
>the product when the update is received? A devious
>developer (and aren't they all?? 8-) ) could add a quick-
>and-dirty check for the specific file that is the "problem"
>sample, assuring their product of full detection when the
>update is re-tested...

I am unclear on ICSA's protocols on sending viruses that were missed
also but I believe they should make them all available, perhaps in
their own labs, after testing each time if they are to exist at all as
a testing organization.

>So, to go back to the beginning of this message, Norman's
>claim the ICSA does not "share" its test-set is, I believe,
>wrong so long as you understand the significant difference
>between "a virus" and "a sample of a virus". If Norman
>actually meant that ICSA should share its test-set samples
>with the developers of the products ICSA certifies, I
>strongly disagree and several detailed discussions of good
>testing practice argue significantly against this, as does
>some rather shameful testing history.

Here again, you are putting the "testing" ahead of the goal of
enabling A/V companies to eliminate as many viruses as possible and
ASAP.

>> ICSA's is simply to test whether viruses are eliminated and make money
>> doing it? Imagine if some organization held a bunch of human viruses
>> and tested whether anti-biotics stopped them or not while at the same
>> time not allowing these companies to have samples of the viruses to be
>> sure they and others could eliminate them. Is there something wrong
>> with that analogy? I think not. ...
>
>Sorry, but it is a very poor analogy (and note here that
>Norman does seem to be advocating the sharing of actual test
>samples). Bio-virs "mutate" in the deeply meaningful sense
>of the word, whereas comp-virs only mutate in the weak sense
>of what is allowed by their internal algorithm(s) driving the
>mutation. The latter can be programmatically detected and it
>is the role of the AV researchers and developers to ensure
>their understanding of the full extent of such "mutation" is
>detected by their product for each implementation they claim
>to detect. Thus, AV developers worth their keep should be
>able to produce reliable detection of polymorphic viruses
>from code (and other) analysis derived from a single sample,
>whereas researchers developing antiviral agents to protect a
>human (or other biological) community generally need samples
>of the known strains. (Here the analogy between "different
>bio-virs" and "different comp-virs" tends to be at the variant
>level, not the sample level anyway...)

I think your attempt at a distinction between the bio and computer
viruses is irrelevant and in fact wrong. Multiple Macro viruses can
create new viruses with new algorithms and bio-viruses mutate also.
In my opinion, the best and simplest solution to both is to provide
samples of all viruses to the A/V developers. Forums such as this
newsgroup will eventually turn up problems much quicker than the
limited tests done by organizations such as ICSA/Secure Computing,
etc. The internet has made such limited testing unnecessary and in
time this will be more true. The world community focusing on
viruses, reporting viruses, problems, etc is much more important than
the specific collection of viruses held by these organizations for
testing. What is needed is an INDEPENDENT world organization that
will ensure that ALL viruses are available to all bonafide A/V
companies for testing and enabling their products to detect and clean
same.

and yes, I am advocating meanwhile the sharing of the actual test
samples and any other samples which the A/V company may wish to test
and improve their products.

Note: I read with interest Chris Scally's article: "Is the WildList
Too Tame" in this month's VB. It deals in part with some of the
problems detecting variants vs generic name of Macro Viruses. This is
more reason why ALL virus samples should be available to A/V
companies.

Nick FitzGerald

Apr 10, 2000
Norman Hirsch <NOSPAM...@nha.com> wrote:

> >Norman seems to be saying that the ICSA should share its
> >samples with the developer community. This raises a more
> >than slight problem with testing fairness.
>
> I am and "testing fairness" should be secondary to the goal of
> eliminating viruses.

You are sorely confused about my emphasis on "testing
fairness". A certification body cannot begin to do a
reasonable job if they do not have fair testing
procedures. Some fair testing procedures work to
further promote better detection among the tested
products. Some are neutral in regard to this criterion
of "test usefulness". I cannot think of any that work
contrary to that goal.

But as it seems you believe that much of the rest of
the world is missing something here, please work through
the following discussion and point out the detailed
reasons why my beliefs (which are more or less
universally shared by the developers of the
acknowledged "best" virus scanners and by the
developers of many of the "still striving" scanners too)
are wrong. I look forward to being corrected and
tracking down the sources of the errors in my thinking
so I can be sure to not repeat them in future.

The important points, as I see them:

1. In the process of replicating, different samples of
a virus can have different "forms" of the virus code.
This is not the issue of what is strictly, technically
known as polymorphism -- for example, some macro viruses
name their sub-routines differently when they infect the
normal template from when they infect documents or user
templates. Some binary infectors have different forms
in COM vs EXE files or in "straight EXEs" vs "EXE
drivers", and so on.

2. Everything in 1. can be further exacerbated by poly-
morphism, whereby a virus deliberately alters its form
between every sample, generation or at some other
(possibly random) event during replication.

3. For a scanner to claim it detects a given virus, it
must reliably detect every form that the virus can
"naturally" take.

Seems very simple to me, so far. Do you disagree with
any of those ideals Norman?

The "problems" arise when someone tries to "certify"
that a scanner can detect a virus. Detection, as with
the virus' replication, is an algorithmic process and the
algorithm(s) the developer chooses for detecting any
particular virus will either detect it reliably or not.
Detection errors of two kinds are possible -- claiming
to find the virus when it is not present and failing to
detect it when it is.

It is an impossible task for the tester (or the
developer) to create every possible sample of any one
virus, let alone to do so for the hundreds or thousands
of viruses typically used in a test (or for which a
product claims detection). Thus, the tester produces a
test-set with "representative" samples of the viruses
under test, including various different kinds of host
files and replication conditions that vary in the ways
known to interact with the infection mechanisms of the
viruses under test.

If a product misses a particular sample of a given
virus, should the developer be given that sample?

Testing methodology says no -- the test then becomes
biased (for reasons discussed in my previous post and
mainly not needing repetition here). Giving the
developer the specific missed samples (at least while
their product is subject to re-test) means the developer
can cheat and add detection of the specific sample. If
that happens, the certification is flawed, if it is
presented on the strength of this subsequent, *bogus*
"detection".

Norman seems to think that in saying this I am being too
concerned about testing fairness.

Well, duh! What is a test?

Anyway, the point of a detection failure (or of a false
positive -- but we have not been talking about that kind
of detection error) is that the developer has picked a
"bad" *detection algorithm*. This most likely shows a
weakness in the developer's research into the working of
the virus in question. The fact they missed a valid
sample is thus sufficient information for that developer
to set about improving their product. Admittedly, given
the complexity of all the sub-systems in some modern
scanners *and* the ugliness of some polymorphic viruses,
finding the "problem" may be a time-intensive effort that
is greatly eased by giving the developer the actual
missed sample, but then we get back to the testing
fairness issue.

Norman is welcome to his opinion that I am too concerned
about that, but in saying so he is overlooking a rather
obvious fact... We only know how good products are at
detection by their performance in tests, yet Norman says
he is most concerned about getting all products to detect
as many viruses as possible.

That latter aim is enhanced much more by ensuring that
the developers *cannot* cheat in the tests. If a
developer cannot cheat by detecting a specific file,
rather than by improving the detection algorithm they use
for a "problematic" virus, then the overall effect is
that poor products *will be improved* or will fail tests
and thus lose market share, thereby increasing virus
detection as former customers switch to better products.

Like it or not, that's what competition is about...

So, Norman's professed objective is actually better served
by my (in his view) over-emphasis on fair testing. Would
products be as good today if Consumer Reports had taken
the manufacturer's word for it that some dodgy feature
would be fixed in the next design revision and thereby CR
gave the product a recommendation rather than a deserved
basting?

Nope, and that's a good analogy if testers give missed
samples from current test-sets to developers (if those
samples are to be used in re-testing, either in the
current or in future tests).

Note: Personally, most developers I have worked with are
*not* of the ilk that would add detection for a particular
file rather than properly fix their products and it pained
me to refuse them missed samples. It's the old story
though, to be fair you have to treat everyone the same, which
means many good developers ended up being treated as if
they also slithered on their bellies...

<<snip>>


> >What should ICSA do now?
>
> Provide both samples to all bonafide A/V companies

We already knew you thought that Norman. I was looking
for the rationale to your belief.

> I am unclear on ICSA's protocols on sending viruses that were missed
> also but I believe they should make them all available, perhaps in
> their own labs, after testing each time if they are to exist at all as
> a testing organization.

In theory, the locale of access is irrelevant -- such
access is unfair and "spoils" the tests. In practice,
however, the locale of access can make the test even more
unfair. If the ICSA were to make access to its test
samples available, but only in its test labs, then those
developers who do not have a strong technical team in the
US and near ICSA's labs would be at a distinct financial
disadvantage if they wished to avail themselves of the
"unfair advantage" of direct access to the test samples.

> >So, to go back to the beginning of this message, Norman's
> >claim the ICSA does not "share" its test-set is, I believe,
> >wrong so long as you understand the significant difference
> >between "a virus" and "a sample of a virus". If Norman
> >actually meant that ICSA should share its test-set samples
> >with the developers of the products ICSA certifies, I
> >strongly disagree and several detailed discussions of good
>testing practice argue significantly against this, as does
> >some rather shameful testing history.
>
> Here again, you are putting the "testing" ahead of the goal of
> enabling A/V companies to eliminate as many viruses as possible and
> ASAP.

I don't think I am over-emphasizing fair testing so much
as you are conflating the concepts "detects this specific
sample of the virus" with "can detect the virus reliably".

The *latter* is the only worthwhile goal unless you are too
focussed on obtaining the certification. I thought Norman
was against the ICSA tests as "goals unto themselves", so
my position of "make the products better by improving their
detection algorithms through better research" seems more in
line with Norman's goals than suggesting that specific
missed samples should be given to the developers.

<<snip earlier discussion of biological vs computer viruses>>


> I think your attempt at a distinction between the bio and computer
> viruses is irrelevant and in fact wrong. Multiple Macro viruses can
> create new viruses with new algorithms and bio-viruses mutate also.

True, but that does not discount anything I said.
Mechanisms for classifying "hybrid" macro viruses as "just
a combination of known viruses" (and therefore not "new" or
"different" variants) or as "other than a straight
combination" (and therefore new variants) are generally
agreed within the AV research community. Whether such
"hybrids" are tested or not depends on what is reported to
the WildList and/or on what a given testing organization
sees from its customers, etc.

> In my opinion, the best and simplest solution to both is to provide
> samples of all viruses to the A/V developers. Forums such as this

Why is that "best"?

I've given some *reasons* why that approach is a very poor
idea. You have just told us, and now repeated, your
opinion. While you are welcome to your opinion, it's
customary in a debate to put forward some form of
justification or argued rationale for why someone should
hold or support an opinion you are trying to gain support
for...

> newsgroup will eventually turn up problems much quicker than the
> limited tests done by organizations such as ICSA/Secure Computing,
> etc. The internet has made such limited testing unnecessary and in
> time this will be more true. The world community focusing on

In general, this is a nice idea. In reality, many of the
"problems" reported will be non-issues caused by ignorance
of the phenomena under discussion. To sort the wheat from
the chaff will always require some element of specialist
knowledge and expertise... I don't see that this necessarily
rules out a role for one or more certification bodies.

> viruses, reporting viruses, problems, etc is much more important than
> the specific collection of viruses held by these organizations for
> testing. What is needed is an INDEPENDENT world organization that
> will ensure that ALL viruses are available to all bonafide A/V
> companies for testing and enabling their products to detect and clean
> same.

Norman -- you've come full circle... How do you tell the
"bonafide" AV companies apart from the others?

One of your biggest gripes seems to be that some developers
are prevented from getting some samples of some viruses because
of arbitrary, inter-personal subjectivities. Whose
subjective criteria for "bonafide AV company" will this new
world body use?

> and yes, I am advocating meanwhile the sharing of the actual test
> samples and any other samples which the A/V company may wish to test
> and improve their products.

As the tests are the only measures of "quality" most
(potential) customers have to go by, the quality of the tests
must be assured if they are to be used to measure product
quality and its improvement. Therefore you cannot ignore
issues of testing quality and fairness. We know your opinion
on this -- give us the arguments that support it. *Convince*
us of why we are wrong-headed in our current beliefs...

> Note: I read with interest Chris Scally's article: "Is the WildList
> Too Tame" in this month's VB. It deals in part with some of the
> problems detecting variants vs generic name of Macro Viruses. This is

Ahhhh -- I agree with some of the issues raised there, and
have made similar comments in the past. This is, however, a
separate issue. Here you are talking about the adequacy of
reporting and recording mechanisms outside the victim
organization. It will be a loooong time before this is
improved...

> more reason why ALL virus samples should be available to A/V
> companies.

All virus samples will never be available to all developers.
Ignoring the point that it is logistically impossible, let's
just consider macro viruses. Often samples of these are sent
from a customer to their AV vendor under NDA because a
confidential document or spreadsheet is infected, but that
file must be shared with other users within the AV customer's
organization, or even with outsiders (their lawyers,
accountants, auditors, etc). In fact, often getting samples
of "problematic" new variants of such viruses has been
complicated, and thus the delivery of a suitable update
slowed, precisely because of the AV customer's own policies
regarding release of internally created documents to
"outsiders"...

In short, Norman's is a nice, but hopeless, dream!

Faced with such vagaries as the real world seems bent on
throwing in the way of Norman's view of perfect testing, it
seems we will have to keep making do with our current,
somewhat imperfect, procedures. Removing some of the most
egregious imperfections may seem like an over-emphasis on
detail to Norman, but it seems like an acknowledgement of
cold, hard reality to me.


--
Nick FitzGerald

Randy Abrams

Apr 10, 2000, 3:00:00 AM
to
Norman Hirsch <NOSPAM...@nha.com> wrote in message
news:XXLwOGCNeRTbkj...@4ax.com...

> On Sat, 08 Apr 2000 11:32:52 GMT, "Nick FitzGerald"
> <ni...@virus-l.demon.co.uk> wrote:
>
> >Norman Hirsch <NOSPAM...@nha.com> wrote:
> >
<snip>

>
> >Norman seems to be saying that the ICSA should share its
> >samples with the developer community. This raises a more
> >than slight problem with testing fairness.
>
> I am and "testing fairness" should be secondary to the goal of
> eliminating viruses.

No it shouldn't... To be clearer, the first goal of a certification is to
ensure that the certified products meet the criteria. The individuals at the
ICSA want to eliminate viruses, but the people going to the ICSA, or looking
at an ICSA certified logo, want to know that the product they are considering
is of sufficient quality to meet the certification criteria. Obviously, the
job of the ICSA here is to provide a fair, unbiased and meaningful test.
Virus Bulletin also tests and certifies products. It's great that they want
to help get rid of viruses, but I don't look at their tests to get rid of
viruses; they are first and foremost a reference with information. The ICSA
certification is a reference. Testing fairness must be a primary concern for
the ICSA in their certification.

<snip>


> >So, to go back to the beginning of this message, Norman's
> >claim the ICSA does not "share" its test-set is, I believe,
> >wrong so long as you understand the significant difference
> >between "a virus" and "a sample of a virus". If Norman
> >actually meant that ICSA should share its test-set samples
> >with the developers of the products ICSA certifies, I
> >strongly disagree and several detailed discussions of good
> >testing practice argue signiifcantly against this, as does
> >some rather shameful testing history.
>
> Here again, you are putting the "testing" ahead of the goal of
> enabling A/V companies to eliminate as many viruses as possible and
> ASAP.

The test is for us consumers. The ICSA is not a central anti-virus research
and development lab. If the anti-virus company does not know how to
replicate samples, then they should fail the certification. The first and
foremost purpose of certification is to demonstrate a level of
accomplishment and not to provide a developers' network for AV companies.

<snip>


> I think your attempt at a distinction between the bio and computer
> viruses is irrelevant and in fact wrong. Multiple Macro viruses can
> create new viruses with new algorithms and bio-viruses mutate also.

You are confusing cross breeding with mutation.

> In my opinion, the best and simplest solution to both is to provide
> samples of all viruses to the A/V developers.

And how does this tell me which products perform at a known standard?

> Forums such as this
> newsgroup will eventually turn up problems much quicker than the
> limited tests done by organizations such as ICSA/Secure Computing,
> etc. The internet has made such limited testing unnecessary and in
> time this will be more true.

The internet has also become the fastest and largest distribution mechanism
for misinformation. I even saw a wobbler alert here on this very board.
We've seen many inaccurate posts up here. The internet has in no way at
all done away with the need for reliable, competent product testing and
reporting.

<snip>


> testing. What is needed is an INDEPENDENT world organization that
> will ensure that ALL viruses are available to all bonafide A/V
> companies for testing and enabling their products to detect and clean
> same.

That may be, but it doesn't mean that we don't need anything else. It
doesn't mean that all products would be equal. It doesn't mean that there is
or would be no need for certification/testing. It doesn't mean that the ICSA
should be the organization to do this either.

<snip>


> Note: I read with interest Chris Scally's article: "Is the WildList
> Too Tame" in this month's VB. It deals in part with some of the
> problems detecting variants vs generic name of Macro Viruses. This is
> more reason why ALL virus samples should be available to A/V
> companies.

This is true, but has nothing to do with the fact that the ICSA performs a
fair and equitable certification of anti-virus products. Just because there
are other needs it does not mean that the ICSA should stop certifying
products.

Larry Bridwell

Apr 11, 2000, 3:00:00 AM
to
Norman Hirsch wrote:
>
> On Fri, 07 Apr 2000 16:08:52 -0400, Larry Bridwell
> <lbri...@mindspring.com> wrote:
>
<snip extraneous>

> >... I am spending the weekend with two of
> >the most beautiful, intelligent and well mannered young men to be born.
> >My two wonderful grandsons, Ryan and Gavin. Now that description may
> >be just a tad prejudiced, but it is a fact none the less :-).
> >
> Have fun with them, Larry. Nothing is better than that!

I knew we could find grounds for agreement :-)!

Spent the weekend with them and enjoying the rest of the week as well.

Larry
>
> Best regards,
>
> Norman Hirsch

Larry Bridwell

Apr 11, 2000, 3:00:00 AM
to
Nick FitzGerald wrote:
>
> Norman Hirsch <NOSPAM...@nha.com> wrote:
>
<snip>

>

> Well, resubmission or update submission are allowable, so
> ICSA contacts the developers with the appropriate news about
> which virus(es) their product missed. Unless things have
> changed since I last checked, the ICSA also sends a sample
> of the missed virus.

Only if the developer requests a sample.

> This is where a major "problem" can
> arise -- does the ICSA send *the* missed sample, or just
> "some" sample of the same virus from when the ICSA lab staff
> replicated it?

ICSA Labs will send a "replicant" not the actual test sample, for the
obvious reasons you so eloquently explained.

>
> I am not entirely clear of the ICSA's protocols here. I
> believe that in the (distant?) past the ICSA sent the actual
> missed sample.

I think you could be accurate here, but this has not been the case since
at least October of 1996 and I believe before that. I believe the
current replicant policy was put in place when Richard Ford was Director
of AV research.

> Therein lies one potential problem -- what
> should be done about that sample when it comes to re-testing
> the product when the update is received? A devious
> developer (and aren't they all?? 8-) )

:-)
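A toy sketch of that worry, in Python (the sample bytes and function names here are invented purely for illustration, not drawn from any real product): a scanner patched to recognize the exact returned file by its hash passes the re-test without detecting the virus in general.

```python
import hashlib

# Hypothetical: the exact sample bytes the tester sent back after the miss.
MISSED_SAMPLE = b"fake-virus-body-as-replicated-by-the-lab"

# The "devious" fix: special-case that one file instead of improving the
# detection algorithm.
KNOWN_BAD_HASHES = {hashlib.sha256(MISSED_SAMPLE).hexdigest()}

def devious_scan(data: bytes) -> bool:
    """Flags only byte-identical copies of the returned sample."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_HASHES

# The re-test against the same bytes now "passes"...
assert devious_scan(MISSED_SAMPLE)

# ...but any other replication of the same virus, differing by even one
# byte, still goes undetected.
other_replicant = MISSED_SAMPLE + b"-different-goat-file"
assert not devious_scan(other_replicant)
```

Which is exactly why re-testing against the very sample that was handed back proves nothing about the detection algorithm.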


<snip examples>

>
> So, what should happen instead of the developers being given
> copies of the actual missed samples? They should be given
> samples of the same virus replicated in a similar manner (and
> suitably tested for recursive replication, just as all samples
> in a virus test-set should).

Ours are :-).

> This has the advantage of allowing the samples in the test-
> sets to remain static. *Between* tests, samples of VirusY can
> be changed, but for all tests at a given date the consumer of
> the results knows they are comparing apples on the same
> "ripeness" scale. The main disadvantage of this approach is
> that the testing organization staff has to be sufficiently
> "respected" by the AV developers that they can "trust" the
> non-tested sample they receive really is a sample of the same
> virus their product is said to have missed a sample of.
> Samples need not be changed between tests, but they can be --
> the important thing is that they remain the same "within" a
> test (and to me, that should include re-tests).
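A minimal Python sketch of the policy Nick describes (the class and function names are my own invention, not anything a real lab runs): the test-set is frozen by content hash for the duration of a test, re-tests included, and a miss is answered with an independently replicated sample rather than the test sample itself.

```python
import hashlib

def sample_id(data: bytes) -> str:
    # Identify a sample by its exact bytes, not by virus name.
    return hashlib.sha256(data).hexdigest()

class CertificationRun:
    def __init__(self, test_samples, replicants):
        # Both dicts map virus name -> bytes; the replicants were
        # replicated separately, so each differs from the test sample.
        self.test_samples = dict(test_samples)
        self.replicants = dict(replicants)
        # Freeze the set: re-tests must see these exact bytes.
        self.frozen = {n: sample_id(b) for n, b in self.test_samples.items()}

    def run(self, scanner):
        """scanner(bytes) -> True if detected; returns missed virus names."""
        missed = []
        for name, sample in self.test_samples.items():
            assert sample_id(sample) == self.frozen[name], "set changed mid-test"
            if not scanner(sample):
                missed.append(name)
        return missed

    def sample_to_send(self, virus_name):
        # On a miss, the developer gets a replicant, never the test sample.
        replicant = self.replicants[virus_name]
        assert sample_id(replicant) != self.frozen[virus_name]
        return replicant

run = CertificationRun(
    test_samples={"VirusY": b"VirusY as replicated for the test-set"},
    replicants={"VirusY": b"VirusY as replicated again, independently"},
)
missed = run.run(lambda data: False)   # a scanner that detects nothing
print(missed)                          # ['VirusY']
print(run.sample_to_send("VirusY"))    # the replicant, not the test bytes
```

Between runs the lab may swap either dict's contents; within a run the frozen hashes make any mid-test substitution fail loudly.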

Good points Nick.

I am not sure how many testers and developers really understand the
importance of the Trust factor.

>
> My view: Should ICSA give the actual missed test samples to
> the developers -- no

And we do not, ever!

<snip>

>
> My view: Should ICSA give *some* sample of missed virus(es)
> to the developers of products it certifies -- yes (but not an
> actual test-set sample given the caveats above). The reasons
> are manifold, but it allows the developers of the deemed
> "faulty" scanners to confirm that they and the ICSA are
> talking about the same virus.

And in our case it gives the developer the chance to challenge our
findings. This was an important item in allowing ICSA to post the lab
results. Before 1998 our testing contracts did not allow us to do this,
but by developing a trust relationship over time, making reports
(privately) to the consortium over time, we were able to overcome that
obstacle and give end users some needed information and still maintain
the trust of the developers.


> Further, at least over time,
> it can allow the developers to affirm the "quality" of the
> lab work at ICSA and it does allow any suspected problems in
> these areas to be aired between the developer and the lab.
> My suspicion is that this is probably the current
> practice for certification tests (at least for "in the wild"
> viruses).

It is.

>
> So, to go back to the beginning of this message, Norman's
> claim the ICSA does not "share" its test-set is, I believe,
> wrong so long as you understand the significant difference
> between "a virus" and "a sample of a virus". If Norman
> actually meant that ICSA should share its test-set samples
> with the developers of the products ICSA certifies, I
> strongly disagree and several detailed discussions of good
> testing practice argue significantly against this, as does
> some rather shameful testing history.

This would be the approximate ICSA Labs position.

>
> > ICSA's is simply to test whether viruses are eliminated and make money
> > doing it? Imagine if some organization held a bunch of human viruses
> > and tested whether anti-biotics stopped them or not while at the same
> > time not allowing these companies to have samples of the viruses to be
> > sure they and others could eliminate them. Is there something wrong
> > with that analogy? I think not. ...

Yes... but Nick covers it at least as well as I could and he did so
sooner. (Enjoying the grandchildren!)

Another reason users should consider more than JUST VB tests or ICSA
tests or anyone's tests. While any one of the major tests
(certification or comparative) may give some idea of a product's
efficacy, one should do good research before making the final plunge
(especially corporate users).

Good post Nick.

Larry Bridwell
ICSA Labs


>
> --
> Nick FitzGerald

Larry Bridwell

Apr 11, 2000, 3:00:00 AM
to
Norman Hirsch wrote:
>
> On Sat, 08 Apr 2000 11:32:52 GMT, "Nick FitzGerald"
> <ni...@virus-l.demon.co.uk> wrote:
>
> >Norman Hirsch <NOSPAM...@nha.com> wrote:
> >

<snip>

> >Norman seems to be saying that the ICSA should share its
> >samples with the developer community. This raises a more
> >than slight problem with testing fairness.
>
> I am and "testing fairness" should be secondary to the goal of
> eliminating viruses.

Actually I would put the wording a little differently. In a previous
life as an educator, we would have called the elimination of viruses the
terminal (ultimate) objective; the "testing" would be an enabling
objective (as it helps third parties develop detection and elimination
products), and "testing fairness" would be a qualifier as to how
effective the enabling objective (testing) was in reaching the terminal
objective.

However, to reach the "terminal objective" there would be many more
enabling objectives. Some may include new protection technology,
customer education, better and more secure OSes, and, most importantly,
the cessation of virus writing.

Your opinion. I tend to agree with Nick here.

> Multiple Macro viruses can
> create new viruses with new algorithms

Examples please.

> and bio-viruses mutate also.
> In my opinion, the best and simplest solution to both is to provide
> samples of all viruses to the A/V developers.

Can't see how that would change/stop/curb new viruses.

>Forums such as this
> newsgroup will eventually turn up problems much quicker than the
> limited tests done by organizations such as ICSA/Secure Computing,
> etc.

While I am sure this group and those like it have and will continue to
play an important role in consumer education and provide emergency aid
when needed, the virus problem seems to be at least as bad as ever and
the proliferation of viruses seems to be increasing. Testing agencies,
meanwhile, have at least helped AV products raise their level of
effectiveness and provided consumers with products better able to protect
them from known viruses (at least). These agencies also help the
consumer by helping with QA.

> The internet has made such limited testing unnecessary and in
> time this will be more true.

In what way? Seems to me the internet has only given us more vectors of
infection, more dangerous payload potential, and a more rapid avenue of
spread. The only positives I can find are the potential of faster
upgrades/updates and consumer education (which can and often is offset
by faulty information from pseudo-experts).

> The world community focusing on
> viruses, reporting viruses, problems, etc is much more important than
> the specific collection of viruses held by these organizations for
> testing.

And exactly how would a "world collection" differ? Where do you think the
"world community" collection would come from?

> What is needed is an INDEPENDENT

Define Independent. Everyone uses the word, but I doubt any of us
really mean the same thing.

> world organization that
> will ensure that ALL viruses are available to all bonafide A/V
> companies for testing and enabling their products to detect and clean
> same.

Again, it is a great idea, maybe even an ideal, but how practical do you
think it is when we can't even get a world organization to provide food,
clothes, homes, and social justice without the threat or actual use of
military force? My opinion: Wishful thinking at best, naïveté at
worst. Of course, I would be more than willing to help see it happen.
I have often been accused of being a little on the pie-in-the-sky side
:-).

>
> and yes, I am advocating meanwhile the sharing of the actual test
> samples and any other samples which the A/V company may wish to test
> and improve their products.


> Note: I read with interest Chris Scally's article: "Is the WildList
> Too Tame" in this month's VB. It deals in part with some of the
> problems detecting variants vs generic name of Macro Viruses. This is
> more reason why ALL virus samples should be available to A/V
> companies.

I'll read it again. Did not quite come away with that conclusion.
