Is Kaspersky good for Windows 10?

Antivirus software for Windows 10 tested

Frank Ziemann

The AV-Test Institute has tested 21 antivirus programs for private users under Windows 10. Eleven products achieved the full score; none failed.

Antivirus for Windows 10 in the test

The Magdeburg-based AV-Test Institute examined 21 antivirus programs that their manufacturers had submitted for certification. The detailed tests took place in January and February under Windows 10 Pro (64-bit). Compared to the previous test two months earlier, the test field has remained the same apart from a few version changes. The test report on protection solutions for companies can be found here.

The best AV programs in a nutshell:

Of the 21 products tested, these eleven candidates achieved the full score of 18 points:

How the testing is done

As usual, testing covers three categories: protection, speed and usability. The protection programs have to detect and fend off almost 10,000 malware samples that are no more than four weeks old. In addition, they are confronted with 258 brand-new threats (0-day malware) in the so-called real-world test. The testers check to what extent the antivirus programs slow down common everyday tasks such as loading websites, downloading and copying files, or installing and using legitimate software. The usability score is based on the false alarms that occur during such activities. The software is tested with its default settings and may use every feature it offers, including cloud functions.

There is a maximum of six points in each category, so a maximum of 18 in total. Products that achieve a total of at least ten points and at least one point in each category receive a certificate. In addition, AV-Test awards the "Top Product" rating to solutions that score extremely well in all test criteria and reach a total of 17.5 points or more.
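For readers who want to apply these thresholds themselves, here is a minimal sketch in Python. The function rate_product and its structure are our own illustration of the rules described above, not anything published by AV-Test.

def rate_product(protection: float, speed: float, usability: float) -> str:
    """Classify a result: up to 6 points per category (18 total), a
    certificate at >= 10 points overall with at least 1 point in each
    category, and "Top Product" at >= 17.5 points overall."""
    scores = (protection, speed, usability)
    if any(s < 0 or s > 6 for s in scores):
        raise ValueError("each category is scored from 0 to 6 points")
    total = sum(scores)
    if total >= 17.5:
        return "Top Product"
    if total >= 10 and all(s >= 1 for s in scores):
        return "Certified"
    return "Not certified"

# Example: 6 + 5.5 + 6 = 17.5 points is enough for "Top Product".
print(rate_product(6, 5.5, 6))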

Test results: antivirus solutions for Windows 10

The test results

All candidates easily met the minimum requirements for an AV-Test certificate. This time the full 18 points were achieved by 11 products, more than half of the field. Among them is Microsoft Defender Antivirus (formerly Windows Defender), which is included as standard in Windows 10. Four other products follow just behind, losing only half a point. These 15 protection solutions received the AV-Test rating of "Top Product", a good two thirds of the test field. So it is still quite crowded at the top. Only at the very end of the results table does performance drop off significantly.


In this test, 14 out of 21 antivirus solutions achieved the full six points for protection. The vast majority of the products offer good to very good malware detection; only Vipre, Microworld and PC Matic cannot keep up. Northguard ships with Kaspersky software inside, only one version number older, and it performs on par with the original.

Russian-Danish twins: Kaspersky and Northguard

Braking effect: does the PC slow down?

AV-Test measured the braking effect of the protection programs on a standard PC and a high-end PC. Differences between the two computers are usually measurable but hardly noticeable in practice. With TotalAV, as well as Kaspersky and Northguard, the slowdown is noticeable when visiting popular websites. When launching common software, Bitdefender and Vipre in particular apply the brakes. When copying, tested with more than 10,000 files, all products remain fairly unobtrusive, as they do with downloads.

False alarms: how often they occur

For a long time, PC Matic (formerly PC Pitstop) stood out for frequent false alarms, but it had improved in the meantime. This time, however, it again produced 19 misdiagnoses, most of them during the installation and use of common software. It is followed by Malwarebytes with four and Trend Micro with three incorrect results. Avira, ESET, F-Secure, G Data, K7 Security, Norton, Protected.net and Vipre show that all tests can be completed without triggering a single false alarm.

While not a single program generated false positives while surfing the web, many candidates raised a false alarm at least once during system scans (a complete check of a clean system). The error rate of most products stays within tolerable limits or below, measured against the fact that the system scan checked more than 1.3 million virus-free program files.

Conclusion

There is no clear winner in this test either; the leading group is too broad for that. But there are still losers: in the recent past, Vipre mostly scored with significantly better malware detection, but this time it let too many malware samples through in the real-world test. Trend Micro stumbled in the real-world test in February, missing four malware threats, and also triggered three false positives over the course of the test. Ahnlab fared very similarly. Both therefore end up in the bottom quarter of the table, despite trailing the top by only one point. With Avast