Posted by: David Harley | March 27, 2015

Choosing an antivirus program

Not so long ago, Heimdal Security’s Aurelian Neagu put together a blog post to which a number of security people – including me – contributed tips: 50+ Internet Security Tips & Tricks from Top Experts.

So, given my continuing interest in testing and related issues, I was interested to see that Aurelian had put up a security guide: What Is The Best Antivirus For My PC? A Step-By-Step Research Guide. After all, a lot of my writing (including the work I did with AMTSO) has been concerned with helping people make informed judgements about the security products they should invest in. Not from the point of view of recommending specific products – since much of my income comes from working with a company that is a major player in the anti-malware market, it would be hard to avoid conflicts of interest – but in terms of helping them make the best possible decision. However, I’ve tended to focus on product testing, and in particular on approaches to evaluating comparative tests.

The Heimdal guide takes a slightly broader approach, exemplified by a Venn diagram in which three circles representing Expert Reviews, Independent Testing, and User Opinions overlap to form what is labelled a Complete Antivirus Assessment.

I’m not altogether on board with this approach:

  • Due to decades of reading forum discussions – from the chaos of alt.comp.virus in the 1990s, where marketroids, researchers, virus writers and confused computer users all rubbed shoulders, through to various LinkedIn groups where most of the posts are by vendor marketing managers – I’m far from convinced that crowd-sourced information is reliable. It’s one of those areas where if you know enough to distinguish between good and bad advice, maybe you don’t need advice. There’s an article demanding to be written here on what snippets of advice should raise red flags, but Heimdal hasn’t written it.
  • The trouble with expert reviews is that so many of them are not written by experts (and they’re not always independent). It’s one of those areas where if you know enough to distinguish between good and bad advice, maybe you don’t need advice. (Is there an echo in here?)
  • Not all independent tests are competent. And some that look independent aren’t.

If past experience is anything to go by, I stand a good chance of inviting accusations of being at least negative and possibly elitist with these comments. But I don’t think it’s enough to direct people towards a forum where all shades of opinion and expertise may be represented. How do you decide whose advice to take (especially when it’s based on one-size-fits-all criteria like price – how many tests, reviews and commentaries assume that free AV is best)?

There are, in fact, some useful pointers here to sources of information, such as several of the more competent testers. But it’s downright bizarre that there’s no mention of AMTSO. Admittedly, one of the reasons I no longer have formal ties with AMTSO is that I always felt that the organization could have done more to engage with the everyday user, rather than focusing on testers. And it’s a pity that the AMTSO site seems to have dropped its links to articles other than its own guidelines documents, most of which are focused on testing methodologies rather than the evaluation of tests by non-experts. (However, the AMTSO Fundamental Principles of Testing is still a must-read for anyone who wants to understand more about testing.)

Heimdal are to be applauded for trying to provide clarity where there is none – or very little – but I’m disappointed.

David Harley

Posted by: David Harley | March 2, 2014

AMTSO Feature Settings Checks Expanded

With a very muted fanfare, AMTSO has adjusted and expanded its web page for anti-malware feature settings by splitting it into two pages: the main page now links to “Feature Settings Check for Desktop Solutions” and “Feature Settings Check for Android based Solutions“.

The Desktop Solutions page still links to the following tests:

  1. Test if my protection against the manual download of malware (EICAR.COM) is enabled
  2. Test if my protection against a drive-by download (EICAR.COM) is enabled
  3. Test if my protection against the download of a Potentially Unwanted Application (PUA) is enabled
  4. Test if protection against accessing a Phishing Page is enabled
  5. Test if my cloud protection is enabled

The Android links are as follows:

  1. Test if my protection against the manual download of malware is enabled
  2. Test if my protection against a drive-by download is enabled
  3. Test if my protection against the download of a Potentially Unwanted Application (PUA) is enabled
  4. Test if protection against accessing a Phishing Page is enabled

I haven’t looked at the new links, as I don’t have an Android device to test them with.

Feature testing is about checking whether your security product has specific features available and activated, and isn’t really related to the comparative testing that AMTSO mostly focuses on. Still, a lot of people seem to find tools like the EICAR ‘test’ file useful and reassuring.
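For readers curious about what that EICAR ‘test’ file actually is: it’s simply a 68-byte ASCII string, published by EICAR, that anti-malware products agree to detect as if it were malicious, so you can safely check that on-access scanning is switched on. A minimal sketch (in Python, my choice of language here – the EICAR spec doesn’t mandate any) that builds the test content; the string is assembled from two halves so the script itself is less likely to be flagged while being copied around:

```python
# The standard EICAR test string, split in two so this script is less
# likely to trip a scanner in transit. Writing the joined bytes to a
# file (e.g. eicar.com) should trigger any on-access scanner.
EICAR = (
    r"X5O!P%@AP[4\PZX54(P^)7CC)7}$"
    r"EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*"
)

def eicar_bytes() -> bytes:
    """Return the EICAR test content as ASCII bytes.

    Per the EICAR specification, the file is exactly this 68-character
    string (optionally followed by whitespace, up to 128 bytes total).
    """
    data = EICAR.encode("ascii")
    assert len(data) == 68  # sanity check against the published spec
    return data
```

Note that this checks only whether one feature (on-access detection of a known, harmless marker file) is enabled – it tells you nothing about how well a product detects real malware, which is exactly the distinction between a feature check and a comparative test.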

David Harley
Small Blue-Green World

Posted by: David Harley | February 26, 2014

AMTSO Guidelines on Mobile Testing

Testing security software for mobile platforms isn’t the easiest testing area, but given that so many people use mobile devices for much or even all of their personal computing, it’s a very important one.

And it’s exactly the area addressed by a paper recently published on the Anti-Malware Testing Standards Organization web site. Major topics it addresses include similarities and dissimilarities to testing for PC platforms, real world anti-malware testing (of course), and battery drain measurement.

The document AMTSO Guidelines on Mobile Testing was approved by the AMTSO membership on the 20th February 2014.

David Harley
Small Blue-Green World

Posted by: David Harley | November 11, 2013

Testing the WATeR

As it happens, I wasn’t at the most recent AMTSO meeting, or at the first Workshop on Anti-Malware Testing Research (WATeR), held the same week and at the same venue in Montreal.

However, Virus Bulletin’s testing guru John Hawes evidently was, and wrote a typically informative summary for the Sophos blog. He paid particular attention (quite rightly) to a presentation by AMTSO president Richard Ford (a former editor of Virus Bulletin, as it happens*, and Harris Professor of Assured Information at the Florida Institute of Technology) on “Do we measure resilience?” Rightly, because the distinction between robustness and resilience in a security product isn’t considered often enough in testing – unsurprisingly, given that testing of cleaning and disinfection is even more time- and resource-intensive than accurate detection testing.

Richard also presented at this year’s Virus Bulletin on A meta-analysis of recent malware tests (with Liam Mayron, also of FIT) and was co-author (with ESET’s Righard Zwienenberg and Avira’s Thomas Wegele) of The Real Time Threat List, also presented at Virus Bulletin 2013.

Hat tip to Andrew Hayter for drawing John Hawes’ blog to my attention, and to Ina Nestinova for reminding me that I hadn’t done anything with the information.

David Harley
Small Blue-Green World 

*Well, not entirely coincidentally. Virus Bulletin has been a major player in AV product testing for a long time, so apart from his academic stature and skills in other areas of security, his knowledge of testing at VB and elsewhere probably had a significant bearing on his being invited to take on the role of AMTSO President.

Posted by: David Harley | October 12, 2013

Testing and OS X: the VB paper

(Reblogged from Mac Virus)

The paper Lysa Myers and I presented at Virus Bulletin last week on OS X security product testing is now available from Geek Peninsula – New Conference Paper: Virus Bulletin 2013 (including the abstract by way of description) – or from the ESET Threat Center Conference Papers page – Mac Hacking: the Way to Better Testing?

There’s also an article for Infosecurity Magazine that summarizes the paper at some length: Mac Product Testing: After the (Flash) Flood.

David Harley
Small Blue-Green World
ESET Senior Research Fellow
