Getting to the bottom of independent AV comparative testing – who to believe and how to make the right decision.
We respect the right of any user to choose a reliable antivirus solution. That is why Kaspersky Lab participates in almost all industry recognized independent tests: after all, when users want to compare security solutions, they are going to refer to the results of these tests. But it is important to understand what is behind each “approved” and “certified” logo.
Of course, every vendor will say their product is the best. Therefore, when choosing between several reputable, popular security solutions, a sensible user will not only look at the arguments and claims of each developer, but also at independent analysis. There are numerous companies that spend days on end doing exactly that – testing various antivirus solutions, firewalls, proactive protection systems and all the other security components that are now integrated in the comprehensive protection suites used by individuals and companies. Each test laboratory has its own evaluation system and methodology, so it’s important to understand what their “awards” are worth and what you can expect from a certified solution. We will briefly describe a few of the most renowned independent labs: AV-Comparatives, AV-TEST, Dennis Technology Labs and Virus Bulletin.
The main test
The most important thing the test labs evaluate – and something that is difficult to check at home – is the quality of anti-malware protection. Two main approaches are used for this, “real-world” and “classical”, and sometimes a combination of the two.
Today, many experts believe that real world testing is more important for the user. It simulates situations where malicious or infected websites and email attachments are opened on a protected computer and calculates the percentage of threats that are detected and blocked before they penetrate the system. The advantage of this method is its ability to check all the protection technologies integrated in a product (there are now a great many of them) using the very latest threats.
There are several variations of the classical approach, but testing generally proceeds as follows. A virus scanner is pitted against a folder of several thousand files that includes hundreds or even thousands of currently active malicious programs of various types. Ideally, the antivirus solution should detect 100% of the malicious files. Another important parameter, which can be evaluated at the same time, is the percentage of false positives: the “clean” files in the folder must be left alone, so ideally this figure is 0%.
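The arithmetic behind a classical test can be sketched in a few lines. The function and the sample counts below are purely illustrative, not any lab’s actual scoring code:

```python
def classical_test_scores(detected_malware, total_malware,
                          flagged_clean, total_clean):
    """Compute the two headline metrics of a classical on-demand test:
    the detection rate and the false-positive rate, both as percentages."""
    detection_rate = 100.0 * detected_malware / total_malware
    false_positive_rate = 100.0 * flagged_clean / total_clean
    return detection_rate, false_positive_rate

# Hypothetical run: a 2,000-sample malware set and 10,000 clean files.
detection, false_positives = classical_test_scores(1994, 2000, 3, 10000)
print(f"Detection: {detection:.2f}%")              # 99.70%
print(f"False positives: {false_positives:.2f}%")  # 0.03%
```

A perfect run would score exactly 100% on the first metric and 0% on the second – which is precisely the bar some certifications (such as VB100, discussed below) set.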
Even the most reliable antivirus solution cannot guarantee 100% protection. Simply being good at detecting viruses is not enough – an AV solution must also be able to disinfect a computer if, for example, an infection has already penetrated the machine. This matters because, according to Kaspersky Lab statistics, 5% of antivirus solutions are installed on already-infected PCs. And then there are user errors, which can also result in malware finding its way onto the computer.
Ideally, an antivirus solution shouldn’t slow the system down, and it shouldn’t pester the user with frequent requests and warnings about program activity. Constant pop-ups annoy people and ultimately undermine the protection itself, because users end up ignoring them.
AV-Comparatives
The AV-Comparatives test lab carries out separate tests to evaluate the most important functions: protection in real-world scenarios, detection of infected files (classical testing), performance and more. Once a year, based on all of its tests, AV-Comparatives announces its top “Product of the Year” award, as well as Gold, Silver and Bronze awards for each test type. But if you don’t want to wait until the end of the year, you can get a pretty good idea of how good a product is by looking at the results of two to five different tests. The most interesting are the Real-World test and the Performance evaluation; false positives reduce the score in the former. The best solutions are awarded an Advanced+ certificate, those that are not quite perfect get an Advanced certificate, while the rest end up with Standard or Tested badges.
Every month, AV-Comparatives publishes the interim results of its Real-World test, showing how many current threats each popular antivirus solution was able to block. The baseline protection level, provided by the Windows-integrated Windows Defender or Microsoft Security Essentials, is taken as 90% of threats blocked. The market leaders usually fight it out over fractions of a percentage point in the 95-100% range. Kaspersky Internet Security blocked 100% of the threats in the latest Real-World test.
AV-TEST
Until recently, AV-TEST certification took place once every two months and covered three categories: Protection, Repair and Usability. Each category was scored from 1 to 6, and a solution that scored a total of 11 points or more received a certificate.
However, this year the tests were altered, and it is now much easier to earn a certificate than before. Previously, factors such as the detection rate, the repair rate, the number of false positives and the degree of system slowdown were all taken into account. This year the Repair category has been withdrawn from testing, and the minimum score for obtaining a certificate has been lowered to 10 points. That means an antivirus solution certified in 2013 may be unable to repair an infected system. Kaspersky Anti-Virus received its annual AV-TEST awards and certificates before the changes. Now it will be even easier for mediocre solutions, on a par with student coursework, to earn the test lab’s certification. You can find out more about the reasons for the changes to the certification rules in Eugene Kaspersky’s blog.
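The effect of the rule change can be illustrated with a small sketch. The category names follow the article; the assumption that only the two remaining categories are scored after Repair was dropped is ours – the article implies it but does not state it outright:

```python
# Sketch of the AV-TEST certification thresholds described above.
# Each category is scored from 1 to 6.

def certified_2012(protection, repair, usability):
    """Old rules: three categories, at least 11 of 18 points required."""
    return protection + repair + usability >= 11

def certified_2013(protection, usability):
    """New rules (assumed): Repair dropped, at least 10 of 12 points required."""
    return protection + usability >= 10

# A product weak at repairing infections could fail under the old bar...
print(certified_2012(5, 1, 4))  # False: 10 points is below 11
# ...yet pass easily once Repair no longer counts at all.
print(certified_2013(5, 5))     # True: 10 points meets the new minimum
```

This is exactly the scenario the article warns about: a 2013 certificate says nothing about whether the product can clean up an already-infected system.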
Dennis Technology Labs
This British test lab, headed by Simon Edwards, has long specialized in real-world antivirus testing. Dennis Labs produces quarterly reports for different product categories: protection for home users, SOHO and large companies.
The certification methodology deserves special mention, as it was created in accordance with the recommendations of AMTSO, the international anti-malware testing standards organization. Many factors affect the final rating: the ability to detect malware, the ability to disinfect an infected system, and the frequency of false positives. Each factor has its own weight, and the final result quite accurately reflects the quality of an antivirus product in real-world conditions.
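A weighted rating of this kind can be sketched as follows. The weights and factor names here are hypothetical, chosen only to illustrate the idea – Dennis Technology Labs does not publish these exact values:

```python
# Hypothetical factor weights; they must sum to 1.0 so the rating stays
# on the same 0-100 scale as the per-factor scores.
WEIGHTS = {"detection": 0.5, "disinfection": 0.3, "false_positives": 0.2}

def weighted_rating(scores):
    """Combine per-factor scores (0-100, higher is better) into one rating."""
    return sum(WEIGHTS[factor] * score for factor, score in scores.items())

rating = weighted_rating({"detection": 98.0,
                          "disinfection": 90.0,
                          "false_positives": 99.0})
print(round(rating, 1))  # 95.8
```

The design choice is the interesting part: because detection carries the largest weight, a product that detects well but disinfects poorly still loses noticeably more rating than one with a few extra false positives.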
Dennis Labs has also broadened its scope by adding new tests: at the end of 2012 it began testing the application control functionality of corporate antivirus products.
Virus Bulletin
One of the oldest and most respected resources in the antivirus field, published as a magazine since 1989, Virus Bulletin performs its own antivirus testing and grants its own VB100 certificates. Only solutions that detect 100% of the infected samples and produce 0% false positives receive the certificate. Virus Bulletin conducts a “classic” rather than “real-world” test: products scan a collection of files that includes a small number of current viruses, and a good-quality product is quite capable of achieving a 100% result. For example, the April results show that 29 of 67 security solutions passed the test. Unfortunately, VB100 certification takes no account of performance or repair quality, which makes a full evaluation of a product impossible. Consumers should also bear in mind another feature of the VB100 award: the logo can appear in a vendor’s marketing materials even after a single basic-level certification. That is why we recommend using the handy results table, which provides a quick overview of the last few tests.
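The VB100 pass bar is strictly binary, which makes it easy to express. The function and sample counts below are an illustration of the criteria described above, not Virus Bulletin’s actual tooling:

```python
def passes_vb100(detected, total_samples, false_positives):
    """VB100 criteria as described above: every infected sample in the
    collection must be detected, and no clean file may be flagged."""
    return detected == total_samples and false_positives == 0

# Hypothetical collection of 2,500 current samples:
print(passes_vb100(2500, 2500, 0))  # True
print(passes_vb100(2499, 2500, 0))  # False: a single miss is enough to fail
print(passes_vb100(2500, 2500, 1))  # False: one false positive also fails
```

Note what the function does not take as input: performance, repair quality, or protection against brand-new threats – which is exactly the limitation the article points out.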
We have only mentioned four popular laboratories that test antivirus software, but there are many others that carry out similar work, not to mention the tests performed by magazines and other sources of comparative information. These tests are very different in quality and completeness – as we’ve already mentioned, even the leading labs do not always aspire to the highest quality standards. Therefore, if you are interested in a particular test and its results we recommend you find out how it was conducted. Did all the leading antivirus vendors participate in it? How big was the test collection of clean and infected files? Were all the key quality factors of an antivirus solution taken into account? And finally, what sort of result earns the highest award? If more than a dozen solutions received the top award, it was obviously given out too easily.