1. This test shows how an antivirus behaves with certain threats, in a specific environment and under certain conditions.
    We encourage you to compare these results with others and make informed decisions on which security products to use.
    Before buying an antivirus you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.


Level 36
Content Creator
In the Malware Protection Test, malicious files are executed on the system. While in the Real-World Protection Test the vector is the web, in the Malware Protection Test the vectors can be e.g. network drives or USB, or scenarios where the malware is already on the disk.
Great results for G Data, and very good results for F-Secure and NortonLifeLock, though with more false positives.
Microsoft did not perform so well in this test, showing, like others, its dependence on an active internet connection.


Level 11
Detection vs Protection is potentially a misleading metric too, as detection favors static scanning and most AV suites have a ton of features aimed at runtime protection, whether that’s a behavior blocker or just AMSI and network protection. It’s interesting to break down what component of the AV suite did the job but it’s also a bit misleading. For most of us at the end of the day, it’s whether or not you were protected that matters — the overall protection rate.

Unless, of course, your primary requirement is to depend on static scanning results, in which case that's the part that matters.
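To make the detection-vs-protection distinction concrete, here is a minimal sketch with made-up numbers (the component names and counts are illustrative, not from the report): the static detection rate only counts samples caught by scanning, while the overall protection rate counts a sample as a win no matter which layer stopped it.

```python
# Hypothetical per-sample outcomes from an AV test run: each sample records
# which component (if any) stopped it. All figures here are invented.
outcomes = (
    ["static_scan"] * 9800        # caught by static/on-access scanning
    + ["behavior_blocker"] * 380  # caught at runtime by behavioral protection
    + ["none"] * 69               # not blocked at all (system compromised)
)

# Detection rate: only static-scan hits count.
detection_rate = outcomes.count("static_scan") / len(outcomes)

# Protection rate: anything other than "none" counts as protected.
protection_rate = sum(o != "none" for o in outcomes) / len(outcomes)

print(f"static detection rate:   {detection_rate:.2%}")   # 95.62%
print(f"overall protection rate: {protection_rate:.2%}")  # 99.33%
```

The gap between the two numbers is exactly the work done by the runtime layers, which is why judging a suite by its static detection rate alone undersells it.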


Level 22
Still seeing, on forums like the Microsoft Community, volunteer moderators and other supposed authorities stating that Defender (assumed out-of-box) is "all you need." Maybe, but this assumes all users share the same blah online behaviors. Either that, or they talk out of both sides of their mouths.

Microsoft got a one-star rating this time. I sense a small gold rush toward the numero unos -- ESET, G Data, etc. -- as usual.

I use comparatives for entertainment purposes, but they consistently show MS is all over the place from one quarter to the next. I mitigate the fluctuations with first- and third-party tools. It's like taking a daily vitamin tablet.


Level 30
Malware Tester
Great results from ESET and G Data: least dependency on the cloud, great protection, fewer false positives.
But the offline detection rate is less useful here in a sense, because nowadays almost everybody is connected to the internet while using their PC, so the online protection rate is the most important section. Offline detection gives you an idea of how quickly an AV creates signatures for new, prevalent samples. The false positive chart is important too. If you look at the online protection rate, there's not much difference among them: all products are over 99% against a collection of 10,249 malware samples. So I agree with what blackice said: "just pick whatever fits your computer and performs best and just keep swimming."
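For a sense of scale on "all products are over 99%": with the 10,249-sample collection cited in this test, here is the quick arithmetic for how many samples a given protection rate leaves missed (the chosen rates are illustrative):

```python
# How many of the 10,249 test samples slip through at a given protection rate.
TOTAL = 10_249

for rate in (0.990, 0.995, 0.999):
    missed = round(TOTAL * (1 - rate))
    print(f"{rate:.1%} protection -> ~{missed} samples missed")
# 99.0% protection -> ~102 samples missed
# 99.5% protection -> ~51 samples missed
# 99.9% protection -> ~10 samples missed
```

So even at the bottom of the "over 99%" band, the absolute differences between products amount to a few dozen samples out of more than ten thousand.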


Level 11
False alarm testing is interesting too, detailed breakdown at: False Alarm Test March 2020 | AV-Comparatives

Personally, I think almost every product here did well against false alarms. It is not stated how many total samples there were, so we don't know what the rate is. But that aside, here are some things I noticed:

  • Almost no false alarms are on digitally signed packages (highlighted in orange in their table), and honestly most legitimate software in circulation these days is digitally signed. Windows already makes you jump through so many hoops when installing digitally signed software
  • Some of the false positives are on other AV engines' installation packages -- this seems more excusable, since seeing another AV engine or signature DB might plausibly trigger a false positive
  • Some of the false positive samples are controversial at best -- for example, WildTangent has frequently been considered adware/PUA over its lifespan. Maybe it's better now, but I honestly would not be mad if my AV flagged WildTangent... Also, NortonLifeLock flagged TeamViewer as a trojan. TeamViewer does seem to be used as a RAT backdoor quite often, so perhaps this is an intentional choice on the vendor's side
  • The vast majority of the detections are on unsigned packages/installers, which tend to exhibit suspicious behaviors -- they unpack files, they install things to various places on the system, etc.
  • Some of the prevalences seem questionable at best. For example, "Microsoft Encarta Package" had the 2nd-highest prevalence, which means "probably several tens of thousands of users". The last version of Microsoft Encarta was released in 2008. Maybe tens of thousands of users is correct, but I find it hard to believe it's common.
Overall, I dislike this test. When the bulk of the hits are unsigned installer packages, this isn't the kind of false alarm that keeps me up at night. My worst nightmare is that one day my AV pulls its hourly signature update and suddenly decides a Windows system file or a Microsoft Office library is malware, automatically removes it, and renders my machine unbootable. Is this test saying "that won't happen"? Or simply that they didn't include that in their testing library?

I'd encourage going through that whole list of false positives and seeing how many of those pieces of software you've heard of, or know someone who uses. It's not a ton....