Posted by: David Harley | May 26, 2016

ICSA Labs: Testing the Internet of Things

ICSA Labs, nowadays a division of Verizon, has a long history in the world of testing and certification (and is a longstanding member of AMTSO). The company has come up with rather a good idea: security certification for the disparate range of devices and sensors that make up the Internet of Things. There’s no doubt that anything that might help in raising IoT security standards is worth a cheer or three.

Unfortunately, at the time of writing I can’t seem to access the ICSA Labs web site, but there’s an article by Richard Chirgwin for The Register that goes into a little more detail – ICSA Labs wants IoT industry to seek security certification – though he’s sceptical as to how much interest there’ll be. The article links to an announcement here, and a white paper describing the programme here, and I’ll be taking a look at those as soon as I can.

Chirgwin also mentions a somewhat similar programme announced by Underwriters Laboratories (UL): UL Launches Cybersecurity Assurance Program. The announcement claims that:

New UL 2900 Series of Standards Offer Testable Cybersecurity Criteria for Network-Connectable Products & Systems

Clearly also worth a look.

I may come back to this topic in the near(-ish) future.

David Harley


Posted by: David Harley | May 12, 2016

SE Labs: what Simon did next…

…Simon Edwards, that is. Simon has had considerable influence on the testing scene in recent years both as a tester with Dennis Technology Labs, where he was Technical Director, and as one of the leading lights of AMTSO, where he was formerly chairman of the Board of Directors.

The web site for his new venture, SE Labs, is now up and running (though it has a couple of rough edges at the time of writing), and already includes a report on Home Anti-Malware Protection that compares a number of products. Registration (which is painless and not over-intrusive) is required to access enterprise and business reports.

As you’d expect from Simon, the site is more informative about methodology than most testers’ sites. There’s also a blog page, which I will follow with interest. :)

Hark! What is that rumbling? I think it might be the trembling of some of those ‘next-gen’ and APT-detection vendors who claim that their technology is too magically advanced to be tested. The site’s About page claims:

Constantly innovating, SE Labs has developed next-generation testing to prove the abilities of ‘next-generation’ security products using a comprehensive, full-stack approach to security assessment powered by true and detailed threat intelligence.

And given Simon’s exhaustive work in that area, I fully expect that he’ll make good on that promise.

Interestingly, the recent cat-among-the-pigeons announcement by VirusTotal about Maintaining a healthy community – discarding subscribers who take data from VT but don’t share their own data or integrate their service into VT’s API – includes this observation:

Additionally, new scanners joining the community will need to prove a certification and/or independent reviews from security testers according to best practices of Anti-Malware Testing Standards Organization (AMTSO).

This will presumably not affect all those vendors who insist that they don’t use signatures and that losing access to VirusTotal’s data won’t affect them in the slightest. But if you need a good testing service, guys, I think there might be one that meets VT’s requirements over here. 😉

This is probably not the last you’ll hear from me on this…

David Harley

Posted by: David Harley | April 22, 2016

EICAR Call for Papers

I haven’t had much to do with the EICAR conference in recent years – well, it’s nearly 18 months since I went to any conference, and even longer since I went to Virus Bulletin, rather to my own surprise – but I note that the Call for Papers is now on the web site. I feel obliged to note these things when I’m on the Review Team. 😉 The announcement says that:

The 24th Annual EICAR Conference will be held on October 17th and 18th with a pre-conference program on the EICAR Minimum Standard in Nuremberg, Fairground, at the IT-SA conference facilities.

The conference theme is ‘Trustworthiness in IT security products’. Which no doubt has something to do with the EICAR Trustworthiness Strategy, which is a topic I may well come back to in due course (hence its inclusion on this site).

More information at http://www.eicar.org/17-0-General-Info.html.

David Harley


Posted by: David Harley | April 22, 2016

AV-Comparatives test security product support

One aspect of security software that isn’t often tested is the quality of its support. AV-Comparatives, however, has grasped that particular nettle, evaluating the support desks of security vendors in the UK and in Germany. The reports are available here, and I commented at more length for Infosecurity Magazine: Testing Anti-Malware Support.

I like the idea, but would like to see AMTSO consider generating some guidelines.

David Harley

Posted by: David Harley | April 16, 2016

AV-Comparatives Testing with AMTSO List

The Austrian security product testing lab AV-Comparatives has announced what it describes as ‘the first public industry test using the AMTSO Real Time Threat List (RTTL).’

The Anti-Malware Testing Standards Organization describes the RTTL as follows:

The Real-Time Threat List (RTTL) is a repository of malware samples collected by experts from around the world. The repository is managed, maintained and secured by the Anti-Malware Testing Standards Organization (AMTSO).

Ah, I hear you say, isn’t that what the WildList Organization does? Well, yes, but RTTL is supposed to do it better, according to the abstract for a 2013 presentation for Virus Bulletin by Righard Zwienenberg, Richard Ford, and Thomas Wegele: The Real Time Threat List.

The abstract states:

In particular, the change in the nature of online threats has left the WildList trailing the ‘real-time’ threat, making it unsuitable for effective ‘in-the-wild’ testing.

And that is undeniably true, as I and others have said many times. For instance here:

It may be that until we see methodologically sound ‘testing of testers’ we will have to accept that there is still a place for ‘best endeavours’ testing based on sample sets that lack freshness but are known to have been validated. Clearly, though, certification based entirely on detection of WildCore samples can no longer be regarded as a sufficient guarantee of a product’s effectiveness. It’s fortunate, then, that the reputable organizations making use of WildList testing are already using or moving towards methodologies that hold onto what is good about the WildList while adding functionality and value by bolstering it with fresher samples and more dynamic technologies.

The VB presentation on the RTTL here goes just a little further than the 2013 abstract:

Something new, fast and “accurate” had to be created, which eventually resulted in the conception of the Real Time Threat List… The idea of the Real Time Threat List is to share new threats with additional meta-data incorporated into the system.

AMTSO’s terse FAQ doesn’t really add much in the way of information. There has been plenty of discussion at AMTSO meetings, of course, but AMTSO seems to be clinging to its reluctance to share information outside those meetings, even with AMTSO members, though I keep hearing rumours of an occasional chink of light breaking through at least one of those walls. We shall see.

AV-Comparatives is, as you’d hope and expect, reasonably forthcoming about its methodology in the test report. It took the ‘Top500’ samples from the RTTL as of 9th March 2016, and filtered out the 247 samples that didn’t display malicious behaviour in its sandboxes, in the expectation of removing false positives and ‘possibly unwanted’ samples. It then used its own ‘Real World Protection Framework’ to execute each remaining sample simultaneously against security programs configured with default settings and cloud access. All the vendors tested were, in this instance, AMTSO members. I don’t object to any of this in principle, since the description should tell any reasonably well-informed reader enough to evaluate the validity of the test. But what does it tell us about the validity of the RTTL? Well, it tells us that the Top500 isn’t necessarily exhaustively validated or categorized.
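For the sake of concreteness, here’s a minimal sketch in Python of that sample-selection step as I read it from the report. To be clear, this is purely illustrative: the Sample type and the pre-computed sandbox verdicts are stand-ins of my own, not AV-Comparatives’ actual (non-public) framework.

```python
# Illustrative sketch of the sample-selection step described in the report.
# The RTTL feed and the sandbox verdicts are hypothetical stand-ins, not
# AV-Comparatives' real tooling.

from dataclasses import dataclass


@dataclass
class Sample:
    sha256: str
    malicious_in_sandbox: bool  # verdict from dynamic (sandbox) analysis


def select_test_set(top500: list[Sample]) -> list[Sample]:
    """Keep only samples that displayed malicious behaviour when executed,
    weeding out probable false positives and 'possibly unwanted' items."""
    kept = [s for s in top500 if s.malicious_in_sandbox]
    print(f"filtered out {len(top500) - len(kept)} of {len(top500)} samples")
    return kept


if __name__ == "__main__":
    # Toy data mirroring the March 2016 test: 247 of the 500 candidates
    # were filtered out, leaving 253 to be executed against each product
    # under default settings with cloud access enabled.
    top500 = [Sample(f"hash{i:03d}", malicious_in_sandbox=(i < 253))
              for i in range(500)]
    test_set = select_test_set(top500)
    assert len(test_set) == 253
```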

And that isn’t necessarily a problem. An advantage of WildCore is that it is a notionally validated sample set: as the WildList Organization FAQ put it, ‘a set of replicated virus samples that represents the real threat to computer users.’ The corresponding disadvantage is that by definition the samples aren’t ‘fresh’, because they’ve been around long enough to go through some sort of validation process. Which was fine back in the 90s, when the pace of sample release and discovery was far slower. I should also mention that back then malware was almost entirely viral, whereas most malware samples nowadays aren’t viruses: that is, they don’t self-replicate. So maybe requiring the testing organization to do its own validation is an acceptable trade-off, as long as RTTL access doesn’t extend to all those amateur (or amateurish) testers who are unfamiliar with the concept of sample validation.

But the methodology AV-Comparatives describes tells us nothing further about the RTTL, and AMTSO remains uncommunicative. It seems to me that by not expanding on the public details of the RTTL, AMTSO is asking us simply to take its word for it that its collection is a Good Thing. And in fact it has a lot going for it, potentially at least. But there are plenty of testers who are still prepared to say that their test sets are better than anyone else’s without feeling the need to supply evidence.

AMTSO, of course, is no more a testing organization than it is an anti-malware vendor. But if it’s providing samples to testers without any information on the conditions under which the collection is distributed, it might as well be one – and it would certainly be nice to know more about the underlying collection methodology.

Don’t get me wrong: while my formal involvement with AMTSO is at this point minimal, I remain sympathetic to its agenda and intentions, and I do believe that testing is in much better shape now than it would be if the organization had not existed. But there are plenty of other people who are less sympathetic – not to say actively hostile. Even if the information above were easily available to outsiders, they would remain suspicious of the apparently close – not to say symbiotic – relationship between vendors and testers within the organization. Sadly, that relationship is in my experience often pretty strained, inside and outside AMTSO. However, also in my experience, members on both sides of the vendor/tester divide are generally agreed on the need to raise testing standards and accountability for the benefit of the users of security products, and are committed to working towards them.

AMTSO’s caution about sharing information is understandable, given the mauling it has received on occasion from the media (and from vendors and testers who may find too much accountability uncomfortable). But in my opinion, it was too ready to give up on transparency.

David Harley
