Posted by: David Harley | July 10, 2017

SE Labs on Effective AV

In my previous (very brief) post I was mildly critical of AV-Test for publishing a comparative review of OS X Sierra products without much in the way of information about methodology. Of course, there are lots of tests that don’t describe their methodology well, if at all, but I tend to expect better of testers who are also AMTSO members and who, in principle, subscribe to its principles of testing. The same complaint can rarely, if ever, be made about Simon Edwards, who is not only an excellent tester, but frequently writes informed commentary on testing and other security topics.

For example, the SE Labs blog article Can anti-malware be 100 per cent effective? goes some way towards answering the question ‘How can testing be any use when testers only use a tiny percentage of all the malware that’s out there?’ 🙂 Maybe not all the way there.

But there are links to some very thorough reports.

David Harley 

Posted by: David Harley | July 10, 2017

AV-Test – OS X comparative test

Here’s a comparative rarity (pun intended): a comparative test of OS X products.

This one is from AV-Test: 10 Antivirus Suites for MacOS Sierra Put to the Test

Not much detail on methodology, though.

David Harley

Posted by: David Harley | July 4, 2017

AV-Test Malware Statistics

AV-Test offers an interesting aggregation of 2016/2017 malware statistics in its Security Report here. Its observations may be of particular interest to readers of this blog (how are you both?) since they’re based on AV-Test’s sample collection.

The report points out that:

There is no indication based on proliferation statistics that 2016 was also the “year of ransomware”. Comprising not even 1% of the overall share of malware for Windows, the blackmail Trojans appear to be more of a marginal phenomenon.

But as John Leyden remarks for The Register:

The mode of action and damage created by file-encrypting trojans makes them a much greater threat than implied by a consideration of the numbers…

Looking at the growth in malware for specific platforms, AV-Test notes a decrease in numbers for malware attacking Windows users. (Security vendors needn’t worry: there’s still plenty to go round…)

On the other hand, the report says of macOS malware that ‘With an increase rate of over 370% compared to the previous year, it is no exaggeration to speak of explosive growth.’ Of Android, it says that ‘the number of new threats … has doubled compared to the previous year.’

Of course, there’s much more in this 24-page report. To give you some idea of what it covers, here’s the ToC:

  • The AV-TEST Security Report (p. 2)
  • WINDOWS Security Status (p. 5)
  • macOS Security Status (p. 10)
  • ANDROID Security Status (p. 13)
  • INTERNET THREATS Security Status (p. 16)
  • IoT Security Status (p. 19)
  • Test Statistics (p. 22)

David Harley

Posted by: David Harley | June 8, 2017

Testing paper for VB, article for ESET

And…

It seems that I’ve drifted back into writing about product testing yet again. In October I’m due to present a recently completed paper as part of a Small Talk at Virus Bulletin in Madrid. More about that later (since VB hasn’t published any information about the Small Talks yet).

In the meantime, here’s an article just published by ESET: Testing, marketing, and rummaging in the FUD banks

There’s likely to be another article on another testing issue that bugs me in the near future…

David Harley

Posted by: David Harley | March 1, 2017

AV-Comparatives Android Test

This looks like a reasonably comprehensive and informative test of Android security products. The test, from AV-Comparatives, used ‘the top 1,000 most common Android malware threats of 2016’.

Android Security Test 2017 – 100+ Apps tested

Lots of mainstream products scored 100% (as they should in a test like this), and others scored close to that, so this isn’t going to tell you which of those products you should be using. (Which is fine by me: I’m not sure that the ‘editor’s pick’ snapshot choice between several products that are approximately level in performance, though beloved of magazine reviews, is generally very helpful.) What it does show is a number of products whose scores seem unacceptably low.

David Harley
