This test shows how an antivirus behaves with certain threats, in a specific environment and under certain conditions.
We encourage you to compare these results with others and make informed decisions about which security products to use.
Before buying an antivirus you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.
> F-Secure?
Did not participate in the AV-Comparatives and SE Labs tests.

> Did not participate in the AV-Comparatives and SE Labs tests.
Do you believe that it would be among the first if it did?

> Do you believe that it would be among the first if it did?
Yes. It was in the previous comparison:

> I've read in some articles that Microsoft Defender protects better if you use Edge, and that it's not the same with other browsers, even if you use the browser extension. Is that true?
Yes, if you use Edge with SmartScreen + PUA enabled. If I recall correctly, the extension (Windows Defender Browser Protection) does not include PUA protection.
Real-World tests include fresh web-originated samples.
Malware Protection tests include older samples (several days old) usually delivered via USB drives or network drives.
Real-World Triathlon 2021-2022: SE Labs, AV-Comparatives, AV-Test (7548 samples in 24 tests)
-------------------Missed samples
Norton 360..................12..... =
Avast...........................13..... +
Kaspersky....................18..... =
Microsoft...................*27.5.. =
McAfee ........................37.... +
Avira ............................43.... =
Comparison with the period 2019-2021:
- no significant changes in protection ( = )
- significant improvement in protection ( + )
Real-World Biathlon 2021-2022: SE Labs, AV-Comparatives, AV-Test (6748 samples in 16 tests)
-------------------Missed samples
Norton......................6
Bitdefender .............9
Avast........................10
Kaspersky................13
TrendMicro..............13
Microsoft...............*22.5
McAfee....................23
Malwarebytes.........26
Avira........................30
Malware Protection Biathlon 2021-2022: AV-Test, AV-Comparatives (270634 samples in 16 tests)
-------------------Missed samples
Norton 360.............1........ =
McAfee...................3....... +
Bitdefender.............5
Avast....................15........ +
Kaspersky.............28....... =
Microsoft............. 30....... =
Avira ....................45....... =
Malwarebytes.....173
TrendMicro ........623....... -
Comparison with the period 2019-2021:
- no significant changes in detection ( = )
- significant improvement in detection ( + )
- significant decrease of detection ( - )
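For readers who want to see how such totals are put together, here is a minimal sketch. It assumes (as the Triathlon/Biathlon naming suggests) that each figure is simply the sum of the samples a product missed in every test of the period, and that the label depends on how many labs tested the product; the products and per-test numbers are invented for illustration only.
```python
# Minimal sketch (my own reconstruction, not the exact method of this thread):
# each total is the sum of missed samples over all tests of all labs that
# tested the product in the period. All numbers below are invented.
missed = {
    "Product A": {"AV-Test": [1, 0, 2], "AV-Comparatives": [3, 1], "SE Labs": [0, 2]},
    "Product B": {"AV-Test": [4, 1, 0], "AV-Comparatives": [2, 2]},
}

def total_missed(per_lab):
    """Sum the missed samples over all tests of all labs."""
    return sum(sum(tests) for tests in per_lab.values())

for product, per_lab in missed.items():
    kind = {3: "Triathlon", 2: "Biathlon"}.get(len(per_lab), f"{len(per_lab)} labs")
    print(f"{product}: {total_missed(per_lab)} missed samples ({kind})")
```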
Why a two-year period?
The results of any single test made by SE Labs, AV-Test, or AV-Comparatives are useless for comparing AVs. In most cases, the statistical errors only allow saying that the first 10 AVs can be awarded as a group. This follows from the too-small number of tested samples. I noticed that even a period of one year is not sufficient.
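To illustrate the statistical-error argument, here is a rough sketch. It is my own simplification (treating each missed-sample count as a Poisson variable and using a normal approximation), not the calculation from the referenced post: it checks whether an approximate 95% confidence interval for the difference between two products excludes zero. The single-test counts are invented; the two-year counts are taken from the Triathlon table above.
```python
import math

def diff_ci(miss_a, miss_b, z=1.96):
    """Approximate 95% interval for the difference in missed samples, treating
    each count as a Poisson variable (a simplification that is reasonable when
    misses are a tiny fraction of the tested samples)."""
    diff = miss_a - miss_b
    half_width = z * math.sqrt(miss_a + miss_b)
    return diff - half_width, diff + half_width

def verdict(name_a, miss_a, name_b, miss_b):
    lo, hi = diff_ci(miss_a, miss_b)
    outcome = "distinguishable" if (lo > 0 or hi < 0) else "NOT distinguishable"
    print(f"{name_a} ({miss_a}) vs {name_b} ({miss_b}): "
          f"difference within [{lo:.1f}, {hi:.1f}] -> {outcome}")

# A single test with a few hundred samples (invented counts): the interval
# straddles zero, so the two products cannot be separated.
verdict("AV 1", 2, "AV 2", 6)

# Two-year Triathlon totals from the table above (7548 samples in 24 tests):
verdict("Norton 360", 12, "McAfee", 37)   # clearly separated
verdict("Norton 360", 12, "Avast", 13)    # still not separable
```
The same arithmetic is why only a whole group of top products can be awarded after one test: most pairwise differences fall inside the error bars.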
Some thoughts about these results:
https://malwaretips.com/threads/the-best-home-av-protection-2019-2020.106485/post-927440
- The entries with the same color cannot be differentiated due to statistical errors.
- The differences between several AVs are very small and they can hardly be noticed by the home user.
- In some cases, the differences can disappear by tweaking the AV settings.
https://selabs.uk/consumer/
https://www.av-comparatives.org/news-archive/
https://www.av-test.org/en/antivirus/home-users/
> When you intend this post to be serious, I have some serious questions (the last being the most statistically relevant one :)

> - What do the color codes mean?
It is explained in the OP. The same color means that the statistical error is probably greater than the differences in missed samples. So, on the basis of the data, one cannot say which AV is better.

> - What does the asterisk for Microsoft mean?
It comes from:

> - What is the difference between Real-World Triathlon and Biathlon in your categories?
The number of AV labs.

> - If Biathlon means those AVs took part in only 2 out of 3, on what criteria did you exclude the missed samples of the AVs mentioned in the Triathlon?

> - On what basis did you decide that the sample test set of two years of testing is big enough to make it relevant to the total malware population in those two years?
> At some point, why bother using a PC at all...
I interpreted this as sacrificing safety for the sake of convenience and the fun of living with computers.

> So is an antivirus really necessary?
It is necessary, but overconfidence is not advisable. In my opinion, it is too overprotective to the extent that it interferes with the use of online software.
> On what basis did you decide that two years is enough? The sample set has to be relevant to the amount of malware in the wild in that specific period. Since nobody knows how much malware lives in the wild in a given month (you can't count what you don't detect), I am still in the dark about why two years is sufficient, and why not four or eight.
It is explained in the results for the period 2019-2020:
https://malwaretips.com/threads/the-best-home-av-protection-2019-2020.106485/post-927440
> About Norton Security. It detects some good files as WS.Reputation. And for some malicious files, it shows a good file.
WhatsApp malicious?

It's not that often. It's probably forgivable. Everyone decides for himself whether to forgive or not.
Sent the file to their lab.

> WhatsApp malicious?
FakeWhatsApp / Win32.Stealer.
> On what basis did you decide that two years is enough? The sample set has to be relevant to the amount of malware in the wild in that specific period. Since nobody knows how much malware lives in the wild in a given month (you can't count what you don't detect), I am still in the dark about why two years is sufficient, and why not four or eight.
Very well said. As posted earlier: it makes me smile how passionately MT members debate the qualities and advantages of specific AVs while you are saying the differences are irrelevant due to too-small test sample sizes.

> Yes, you explained, but you did not answer my question.
The results from one year can be very different in the next year. This difference is significantly smaller when comparing two-year periods. It is possible that a three-year period would also be OK.
Here is an example from the reference link:
Real-World: SE Labs, AV-Comparatives, AV-Test
-------------------------------2019----2020------
Avast Free...............26..............11
Microsoft................12..............25
The one-year results are totally different, but the two-year totals are the same: 26 + 11 = 37 and 12 + 25 = 37.
On the other hand, AV protection changes over time, so it would not be reasonable to extend the period beyond two years.
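A toy simulation can illustrate why two-year totals are more stable than one-year results. All numbers here are invented (a per-sample miss rate for each of two hypothetical products and a yearly sample count roughly matching the Triathlon volume): when one product is only slightly better than another, a single year of results ranks them the wrong way round more often than a two-year total does.
```python
import random

# Toy simulation (all numbers invented): product A misses slightly fewer
# samples than product B on average. How often does a single year of testing
# rank them the wrong way round, compared with a two-year total?
random.seed(1)
rate_a, rate_b, n = 0.004, 0.006, 3700   # roughly 15 vs 22 expected misses per year
trials = 1000

def misses(rate, years):
    """Count missed samples over `years` years of `n` tested samples each."""
    return sum(random.random() < rate for _ in range(n * years))

wrong_1y = sum(misses(rate_a, 1) > misses(rate_b, 1) for _ in range(trials)) / trials
wrong_2y = sum(misses(rate_a, 2) > misses(rate_b, 2) for _ in range(trials)) / trials
print(f"wrong ranking after 1 year : {wrong_1y:.1%}")
print(f"wrong ranking after 2 years: {wrong_2y:.1%}")
```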
> Yes, you explained, but you did not answer my question.
I am just pulling your leg. I agree with your observation that the differences in missed malware are only relevant when the set of "real world malware" samples is statistically representative of all malware in the wild for the reported month. Using the same logic, I am challenging the two-year period you used.
The fact is, you don't know the population size (of active malware in a given month), and you also don't know how much variation in results the AV labs are willing to report. Therefore we don't know the statistical relevance and representativeness of the sample sizes used by the AV labs (for real-world scenario testing, for instance). The same applies to your two-year period.