Posted by: David Harley | May 15, 2010

EICAR 2010 Testing Papers

I was at the EICAR conference in Paris last week, and the iAWACS workshop that preceded it. The iAWACS sessions, especially the PWN2KILL challenge, need more consideration than I can give them at this moment, with piles of work to do before the upcoming CARO and AMTSO workshops. In fact, there was a lot more of interest at the EICAR conference than I can summarize here right now, but I’m guessing that readers of this blog will be particularly interested in the testing-related talks and papers. Abstracts of presentations are available from

The keynote speaker, Christopher Devine, discussed his open source AV test suite project, AVerify. I don’t think there’s a paper for this, but EICAR intend to put slides up eventually at

Igor Muttik, an AMTSO Director, presented on “A single metric for evaluating a security product”, in which he argued that “‘detection rate’ which does not incorporate the timing element is not a valid metric” and offered a vendor-neutral analysis of the factors that contribute to the probability of successful protection, the mathematical approach to calculating this probability, and approaches to implementation in practice.

Lysa Myers and Matt Garrad of West Coast Labs (which is an AMTSO member, by the way) offered “A view into new testing techniques”, in which they discussed WCL’s approaches to the challenges that testing poses in the current threatscape.

Ján Vrabec and I presented on “Real Performance”, a consideration of some of the issues around performance testing (as opposed to detection testing) and whole-product testing; both are topics for papers that we hope will be approved at the forthcoming AMTSO workshop.

Perhaps the liveliest session was the presentation by Jonathan Dechau, Romain Grivaux, Kenza Jaafar and Jean-Paul Fizaine of ESIEA Laval, who discussed “New Trends in Malware Sample-Independent AV Evaluation Techniques with Respect to Document Malware”. This paper used one of the attacks from the PWN2KILL challenge as a starting point for presenting “a reproducible, open testing method to evaluate anti-virus product”. While some of the conclusions of the paper were brought into question by uses of the EICAR test file that went against the formal specification of the file (see an article by Eddy Willems in the Virus Bulletin issue for June 2003), points were made (with some passion!) that deserve a more thorough consideration than I can offer here and now.

The Vrabec/Harley paper is available here, and the slide deck is available here. We’ll post links to other papers and slide decks here if and when they become publicly available.
