Posted by: David Harley | March 14, 2011

Comparative Testing: Hits and Myths

Helmuth Freericks is an old hand at the malware game, having got to Commtouch by way of Authentium and Command Software.

In an article for SC Magazine on “Anti-virus myths and facts”, he explains how misinterpreting and misunderstanding AV test results can cost money. In particular, he considers the myths that:

  • AV is purely reactive (a peculiarly stubborn myth, that one: while we all wish AV products had a higher success rate in detection, the problem is certainly not that the industry is restricted to static signatures – there’s an illustrative sketch of the difference after this list)
  • Not all detections are true positives – well, I’m not sure anyone is unaware of the false positive problem nowadays, but I can’t deny that it’s a real one
  • “Testing an anti-virus solution should be done by throwing as many viruses at it as possible”: in fact, while he doesn’t really go into the static versus dynamic testing debate as such, he does make a number of points worth mentioning, including the undesirability of mixing detection testing and performance testing.
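
To put that first point in concrete terms, here is a deliberately tiny sketch in Python of the difference between a static byte-signature match and a crude heuristic score. Everything in it – the signature, the markers, the threshold – is invented for illustration; no real engine is remotely this simple, and modern products layer emulation, behavioural monitoring and reputation data on top.

    # Toy illustration only: invented signature and markers, not any vendor's engine.
    SIGNATURES = {b"\xde\xad\xbe\xef": "Example.TestSig"}  # hypothetical known-bad pattern

    # Hypothetical indicators a heuristic might weigh in an unknown file.
    SUSPICIOUS_MARKERS = [b"CreateRemoteThread", b"VirtualAllocEx", b"keylog"]

    def static_scan(data):
        """Reactive: detects only byte patterns already in the signature set."""
        for sig, name in SIGNATURES.items():
            if sig in data:
                return name
        return None

    def heuristic_scan(data, threshold=2):
        """Proactive: flags unknown files that accumulate enough suspicion."""
        score = sum(1 for marker in SUSPICIOUS_MARKERS if marker in data)
        return score >= threshold

    sample = b"...VirtualAllocEx...CreateRemoteThread..."
    print(static_scan(sample))     # None – no known signature matches
    print(heuristic_scan(sample))  # True – flagged without any prior signature

The second function can fire on a sample the first has never seen, which is all “not purely reactive” means here.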

David Harley 


Responses

  1. The survey results suggest that vendors should fix performance without defining what that means. I work for Symantec and I am deeply involved in our performance testing efforts. Defining “performance” is a lot harder than it looks. Are we talking scan speed? Memory use during scans? Memory use when idle? Impact on boot-up times? Amount of network traffic generated? Impact on opening Office files? Impact on downloading web pages, starting applications…? There are many metrics of performance. We trust Passmark to run our tests because they look at roughly 14 metrics of performance rather than just cherry-picking one favorite number. You can see their latest reports here:
    http://www.passmark.com/benchmark-reports/index.htm

    • That’s why we have performance testing guidelines. 🙂 Yes, I agree. AV testing in general is much harder than most people think (to do properly, anyway), and the idea that performance testing is somehow easier is misconceived. I’m not here to endorse particular testers, but a lot of AV companies use Passmark. Of course, where a report is sponsored by a company, the reader needs to take that into account.
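
      As a back-of-the-envelope illustration of the multi-metric point, here is a minimal standard-library Python sketch that times just two of the metrics listed above – elapsed scan time and peak memory – against a hypothetical scan_directory() stand-in. A professional harness such as Passmark’s measures far more metrics under far more controlled conditions; this is only a sketch of the shape of the problem.

          import time
          import tracemalloc
          from pathlib import Path

          def scan_directory(root):
              """Hypothetical stand-in for an on-demand scan: just reads every file."""
              count = 0
              for path in Path(root).rglob("*"):
                  if path.is_file():
                      try:
                          path.read_bytes()
                          count += 1
                      except OSError:
                          pass  # unreadable file: skip it, as a real scanner would log and move on
              return count

          def benchmark(root):
              """Collect two of many possible metrics (tracemalloc sees only Python-level allocations)."""
              tracemalloc.start()
              start = time.perf_counter()
              files = scan_directory(root)
              elapsed = time.perf_counter() - start
              _, peak = tracemalloc.get_traced_memory()
              tracemalloc.stop()
              return {"files_scanned": files,
                      "scan_seconds": round(elapsed, 3),
                      "peak_memory_bytes": peak}

          print(benchmark("."))

      Even this toy shows why a single headline number misleads: a product could win on scan time and lose badly on memory use, or vice versa.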

