Posted by: David Harley | July 4, 2017

AV-Test Malware Statistics

AV-Test offers an interesting aggregation of 2016/2017 malware statistics in its Security Report here. Its observations may be of particular interest to readers of this blog (how are you both?) since they’re based on AV-Test’s sample collection.

The report points out that:

There is no indication based on proliferation statistics that 2016 was also the "year of ransomware". Comprising not even 1% of the overall share of malware for Windows, the blackmail Trojans appear to be more of a marginal phenomenon.

But as John Leyden remarks for The Register:

The mode of action and damage created by file-encrypting trojans makes them a much greater threat than implied by a consideration of the numbers…

Looking at the growth in malware for specific platforms, AV-Test notes a decrease in numbers for malware attacking Windows users. (Security vendors needn’t worry: there’s still plenty to go round…)

On the other hand, the report says of macOS malware that ‘With an increase rate of over 370% compared to the previous year, it is no exaggeration to speak of explosive growth.’ Of Android, it says that ‘the number of new threats … has doubled compared to the previous year.’

Of course, there’s much more in this 24-page report. To give you some idea of what it covers, here’s the ToC:

  • The AV-TEST Security Report 2
  • WINDOWS Security Status 5
  • macOS Security Status 10
  • ANDROID Security Status 13
  • INTERNET THREATS Security Status 16
  • IoT Security Status 19
  • Test Statistics 22

David Harley

Posted by: David Harley | June 8, 2017

Testing paper for VB, article for ESET


It seems that I’ve kind of drifted back into writing about product testing, yet again. In October I’m due to present a recently completed paper as part of a Small Talk at Virus Bulletin in Madrid. More about that later (since VB hasn’t published any info about the Small Talks yet).

In the meantime, here’s an article just published by ESET: Testing, marketing, and rummaging in the FUD banks

There’s likely to be another article on another testing issue that bugs me in the near future…

David Harley

Posted by: David Harley | March 1, 2017

AV-Comparatives Android Test

This looks like a reasonably comprehensive and informative test of Android security products. The test, from AV-Comparatives, used ‘the top 1,000 most common Android malware threats of 2016’.

Android Security Test 2017 – 100+ Apps tested

Lots of mainstream products scored 100% (as they should in a test like this), and others scored near to that. So this isn’t going to tell you which of those products you should be using. (Which is fine by me: I’m not sure that the ‘editor’s pick’ snapshot choice between several products that are approximately level in performance, though beloved of magazine reviews, is generally very helpful.) What it does show is a lot of products whose scores seem to be unacceptably low.

David Harley

Posted by: David Harley | February 14, 2017

Myths, Marketing, and Testing: the Antimalware Generation Gap

ESET has just published an article that was originally my contribution to this report for ESET: TRENDS 2017: SECURITY HELD RANSOM. The new(-ish) article is about so-called next-gen’s uneasy relationship with more established players in the anti-malware industry, rather than focusing exclusively on testing. Nevertheless, it does look at testing issues, such as ‘next-gen’ companies’ reluctance to participate in testing while pushing ‘an already open door even wider by their own attempts to compare the effectiveness of their own products and those of ‘first-gen’ vendors.’ The blog article is here: Next-gen security software: Myths and marketing.

What it doesn’t really address – well, it was written a while ago – is the recent resurgence of the bad old idea that testing is so easy that anyone can do it, with guidance and off-the-peg samples from (next-gen) vendors or their associates. Remind me to tell you about Grottyscan and Wonderscan sometime, from an article on how to bias testing.

Recently, next-gen companies have stepped up their war with some testers, ironically treading much the same ground that longer-established vendors have stumbled on in the past. But I intend to come back to that.

Meanwhile, here are some relevant links.

Crowdstrike versus NSS:

Steve Ragan on Cylance versus AV-Comparatives and MRG Effitas: Cylance accuses AV-Comparatives and MRG Effitas of fraud and software piracy – Is it time for a new testing and certification model in the industry? Interestingly, Cylance seems to have backtracked somewhat on its opposition to Pay-to-Play, and has commissioned a test from AV-Test using methodology Cylance ‘co-created’.

I got a certain amount of sour amusement from Cylance’s own assertion that this is ‘The first time that a 3rd party testing organization created their own malware in order to conduct testing.’ If only that were true… And if only it were a good idea. Apparently it’s also the first time that a tester ‘developed new testing methods specifically designed to target next-generation vendors.’ I wonder if SE Labs has any thoughts about that?

As far as I can tell from Ragan’s article, the ‘new’ methodology consists of the following:

  • The first test, the ‘Holiday Test’, seems to be an old-school ‘frozen update’ test.
  • The second test used ‘malware’ created by AV-Test ‘to simulate certain types of attack’.
  • The third test involved disabling URL filtering.
  • The fourth was a small-scale false positive (FP) test.

AMTSO’s latest press release doesn’t mention specific vendors or tests, but does observe with regard to ‘recent tests’ that

We reject turning off product capabilities while comparing the capabilities of products in real-world use, as we believe that this introduces bias in the results.

Hard to argue against that, though such cherry-picking of functionality has long bedevilled testing: it has a lot to do with AMTSO’s emphasis on ‘whole product testing’. The press release doesn’t mention malware creation or the use of simulated malware, though these are other areas that have been discussed at some length at AMTSO meetings and in documentation (and many other places).

These are interesting times for testers and vendors, and indeed for AMTSO, which counts nearly all these players among its members. Not Cylance, though: Cylance reseller Cognition is a member, however, and also seems to have a close relationship with TestMyAV, which advocates DIY testing rather than using independent testers, or, as they put it, trusting ‘the experts’.

I’m glad I’m not caught up in that particular cat-herding exercise any more.

On the other hand, I think I may feel a paper coming on.

David Harley

Posted by: David Harley | January 15, 2017

Testing the Internet of Things

The security community has long been concerned about the potential for compromise of the so-called Internet of Things (IoT). Recently, it has become commonplace to add Internet connectivity to objects that in the past functioned quite happily without it. Plenty of people evidently see advantages in being able to control all sorts of things, from light bulbs to televisions to heating to kettles, though I sometimes wonder whether in some instances those people are mostly manufacturers rather than consumers, who may not be desperate to control everything in the house through a smart app. However, it has been apparent time and time again that manufacturers in this market segment do not always give security the attention it requires.

Well, I won’t bore you all (or both…) with another Luddite rant, in case I start to sound too much like a Jeffrey Deaver killer. However, I applaud AV-Test‘s initiative in setting up a site to record its testing of IoT device security. Right now the articles there seem to be focused on IP cameras, an area that AV-Test has also addressed more generally on its mothership site. It’s not an area I’m sufficiently conversant with to comment on AV-Test’s reviews, but given the organization’s experience and reputation in the field of anti-malware testing, I imagine the reviews will be up to its usual high standards.

David Harley
