Posted by: David Harley | March 1, 2017

AV-Comparatives Android Test

This looks like a reasonably comprehensive and informative test of Android security products. The test, from AV-Comparatives, used ‘the top 1,000 most common Android malware threats of 2016’.

Android Security Test 2017 – 100+ Apps tested

Lots of mainstream products scored 100% (as they should in a test like this), and others scored near to that. So this isn’t going to tell you which of those products you should be using. (Which is fine by me: I’m not sure that the ‘editor’s pick’ snapshot choice between several products that are approximately level in performance, though beloved of magazine reviews, is generally very helpful.) What it does show is a lot of products whose scores seem to be unacceptably low.

David Harley

Posted by: David Harley | February 14, 2017

Myths, Marketing, and Testing: the Antimalware Generation Gap

ESET has just published an article that was originally my contribution to this report for ESET: TRENDS 2017: SECURITY HELD RANSOM. The new(-ish) article is about so-called next-gen’s uneasy relationship with more established players in the anti-malware industry, rather than focusing exclusively on testing, but does nevertheless look at testing issues such as ‘next-gen’ companies’ reluctance to participate in testing, yet pushing ‘an already open door even wider by their own attempts to compare the effectiveness of their own products and those of ‘first-gen’ vendors.’ The blog article is here: Next-gen security software: Myths and marketing.

What it doesn’t really address – well, it was written a while ago – is the recent resurgence of the bad old idea that testing is so easy that anyone can do it, with guidance and off-the-peg samples from (next-gen) vendors or their associates. Remind me to tell you about Grottyscan and Wonderscan sometime, from an article on how to bias testing.

Recently, next-gen companies have stepped up their war with some testers, ironically treading much the same ground on which longer-established vendors have stumbled in the past. But I intend to come back to that.

Meanwhile, here are some relevant links.

Crowdstrike versus NSS:

Steve Ragan on Cylance versus AV-Comparatives and MRG Effitas: Cylance accuses AV-Comparatives and MRG Effitas of fraud and software piracy – Is it time for a new testing and certification model in the industry? Interestingly, Cylance seems to have backtracked somewhat on its opposition to Pay-to-Play, and has commissioned a test from AV-Test using methodology Cylance ‘co-created’.

I got a certain amount of sour amusement from Cylance’s own assertion that this is ‘The first time that a 3rd party testing organization created their own malware in order to conduct testing.’ If only that were true… And if only it were a good idea. Apparently it’s also the first time that a tester ‘developed new testing methods specifically designed to target next-generation vendors.’ I wonder if SE Labs has any thoughts about that? As far as I can tell from Ragan’s article, the ‘new’ methodology consists of the following:

  • The first test, the ‘Holiday Test’, seems to be an old-school ‘frozen update test’.
  • The second test used ‘malware’ created by AV-Test ‘to simulate certain types of attack’.
  • The third test involves disabling URL filtering.
  • The fourth is a small-scale FP test.

AMTSO’s latest press release doesn’t mention specific vendors or tests, but does observe with regard to ‘recent tests’ that

We reject turning off product capabilities while comparing the capabilities of products in real-world use, as we believe that this introduces bias in the results.

Hard to argue against that, though such cherrypicking of functionality has long bedevilled testing: that has a lot to do with AMTSO’s emphasis on ‘whole product testing’. The press release doesn’t mention malware creation or the use of simulated malware, though these are other areas that have been discussed at some length at AMTSO meetings and in documentation (and many other places).

These are interesting times for testers and vendors, and indeed for AMTSO, which counts nearly all these players among its members. Cylance is not a member, though its reseller Cognition is, and Cognition also seems to have a close relationship with TestMyAV, which advocates DIY testing rather than using independent testers, or as they put it, trusting ‘the experts’.

I’m glad I’m not caught up in that particular cat-herding exercise any more.

On the other hand, I think I may feel a paper coming on.

David Harley

Posted by: David Harley | January 15, 2017

Testing the Internet of Things

The security community has long been concerned about the potential for compromise of the so-called Internet of Things (IoT). Recently, it has become commonplace to add Internet connectivity to objects that in the past functioned quite happily without it. It seems that there are plenty of people who see advantages to being able to control all sorts of things, from light bulbs to televisions to heating to kettles, though I sometimes wonder whether in some instances those people are mostly manufacturers rather than consumers, who may not be desperate to control everything in the house through a smart app. However, it has been apparent time and time again that manufacturers in this market segment are not always giving security the attention it requires.

Well, I won’t bore you all (or both…) with another Luddite rant, in case I start to sound too much like a Jeffrey Deaver killer. However, I applaud AV-Test’s initiative in setting up a site to record its testing of IoT device security. Right now the articles there seem to be focused on IP cameras, an area that AV-Test has also addressed more generally on its mothership site. It’s not an area I’m sufficiently conversant with to comment on AV-Test’s reviews, but given the organization’s recent experience with IoT devices and its reputation in anti-malware testing, I imagine they’ll be up to the usual high standards.

David Harley


Posted by: David Harley | December 10, 2016

Babel fish all round, please

I referred earlier to an AV-Comparatives ‘next-gen’ test, and intended to follow up by pointing to this article by Nikita Shvetsov for Kaspersky: Lost in Translation, or the Peculiarities of Cybersecurity Tests.

Nice. So-called next-gen vendors may be less keen. 🙂

David Harley

Posted by: David Harley | December 10, 2016

When DIY testing isn’t DIY

For several weeks now I’ve been meaning to write about Carl Gottlieb’s site TestMyAV. Well, not about that site so much as the pros and cons of companies setting up their own test labs.

It may surprise you to know that I actually started my career in security working for a medical research organization, and part of the job was, in fact, evaluating anti-malware products, though in those days malware was nearly all viral. But long before I crossed the Great Divide and started working with security companies rather than just using their products, I was scaling back on actual virus testing: as the problem gradually escalated, I couldn’t give it the attention it required. Nowadays, since a sizeable proportion of my income comes from providing security companies with consultancy, I couldn’t ethically set myself up as an independent tester, even if I could find the time.

TestMyAV worries me (a lot). It suggests that testing is simple enough that anyone can do it with the help of the resources that TestMyAV provides, including some high-level advice and documentation on setting up a lab, but also offering samples. And it seems to me that if newbie testers are reliant on samples from a site that doesn’t disclose its sources, they have at least two problems. They have to assume that the samples are valid, in the absence of a documented validation process. And they don’t know whether the samples are sourced from one of the companies they plan to test, which is a methodological disaster. As Simon Edwards, one of the most scrupulous testers I know, observed on Twitter:

‘Testing anti-malware with malware provided by tested vendors (or related companies) is about as biased as testing can be. Don’t do it!’

Clearly, he’s referring to the fact that Carl Gottlieb is CTO of Cognition, which is a major Cylance reseller.

Well, I still intend to get back to this topic at greater length, though I’m making no promises about when or where. But in the meantime, it seems that Kevin Townsend has been worried about the site, too. In Anti-malware testing issues he lays stress on links between TestMyAV and Cognition. He emphasizes the number of pages there that offer an antivirus product recommendation. He summarizes the ongoing war of words between the mainstream used-to-be-antivirus industry and those companies that call themselves ‘next generation’. And he suggests a less contentious way of testing products.

I don’t agree with every word Kevin says: I think it’s pretty harsh to suggest baldly that independent testers aren’t independent, for instance, even though I’m not the testing industry’s biggest fan. The symbiotic relationship between testers and the mainstream security industry is complex and in some senses problematical, but both industries have – sometimes, at least – fought hard (in AMTSO and elsewhere) to strike the best possible balance in the interests of fair testing and the best outcome for the consumer. Nevertheless, he makes some very important points.

Carl Gottlieb evidently disagrees vehemently, but has said that he won’t ‘address the points publicly’.

David Harley

(Not speaking for any company or organization.)
