AV-TEST 20 Protection Packages Included in the Endurance Test

Disclaimer
  1. This test shows how an antivirus behaves with certain threats, in a specific environment and under certain conditions.
    We encourage you to compare these results with others and make informed decisions on which security products to use.
    Before buying an antivirus, you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.

Gandalf_The_Grey

Level 50
Verified
Trusted
Content Creator
Apr 24, 2016
3,919
Especially in times of Corona, consumers are being subjected to a vast onslaught of dangerous spam and malware. The 6-month endurance test from AV-TEST reveals which protection packages are able to protect Windows 10 over a long period.

Spam with infected attachments and dangerous downloads are just some of the risks facing the PCs of private users. Especially in times of Corona, attackers see many opportunities to prey on private data or to extort users with ransomware. Malware statistics at the end of 2019 showed a total of 1.001 billion registered malware samples. By the end of 2020, that number had reached 1.139 billion samples – an increase of nearly 14 percent. In the endurance test from July to December 2020, nearly 40,000 actively circulating and especially dangerous malware samples were applied – per product.
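The growth figure quoted above can be verified with a quick back-of-the-envelope calculation (the sample counts are the ones stated in the article):

```python
# Registered malware samples as reported above
end_2019 = 1.001e9  # ~1.001 billion samples at the end of 2019
end_2020 = 1.139e9  # ~1.139 billion samples at the end of 2020

# Year-over-year growth as a percentage
growth_pct = (end_2020 - end_2019) / end_2019 * 100
print(f"Increase: {growth_pct:.1f}%")  # ~13.8%, i.e. "nearly 14 percent"
```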

14 protection packages scored 17.5 to 18 points​

The result shows that the endurance test includes many protection packages that can stand up to an enormous number of malware samples without bogging down the Windows system or triggering false alarms. In the 6-month endurance test, each package had the opportunity to score up to 18 points: 6 points each in the three areas of protection, performance, and usability. A total of 4 packages achieved this outstanding feat: those from F-Secure, McAfee, Microsoft and NortonLifeLock. Also scoring an excellent 17.8 points were the products from Avira, BullGuard, G Data, Kaspersky, Trend Micro and VIPRE Security. AhnLab and ESET followed with 17.7 points, and Avast and AVG with 17.5 points. By comparison: in a classic monthly test, products achieving a score of 17.5 to 18 points even receive special recognition.
Coming in next were Bitdefender and Protected.net with 17.3 points, followed by K7 Computing with 17 points. The next echelon of packages from Microworld and Malwarebytes reached only 15.9 and 15.8 points. At the end of the ranking is PC Matic, scoring a mere 14.3 points.
[Images: Protection packages in an endurance test; Malware detection rates]
 

Andy Ful

Level 72
Verified
Trusted
Content Creator
Dec 23, 2014
6,122
Hi, @Gandalf_The_Grey, why do you think Bitdefender is not at the top of the ranks anymore? :confused:
Look carefully at the first table (Usability = false positives).:)(y)
A few million legitimate samples were tested during a system scan. Bitdefender got 28 false positives (the industry average was 25), while Norton had only 1, Kaspersky 2, Microsoft Defender 2, ESET 4, etc.
These false positives are related to offline signatures (not to dynamic detections). That is why there are differences between the AV-TEST and AV-Comparatives false-positive tests. But in the case of Bitdefender, the number of false positives (offline signatures) in the AV-TEST results is significantly higher than in AV-Comparatives (offline detections selected).
 

Lenny_Fox

Level 22
Verified
Oct 1, 2019
1,125
These false positives are related to offline signatures (not to dynamic detections). That is why there are differences between the AV-TEST and AV-Comparatives false-positive tests. But in the case of Bitdefender, the number of false positives (offline signatures) in the AV-TEST results is significantly higher than in AV-Comparatives (offline detections selected).

You first mention offline and dynamic (online) detections. Then you highlight Bitdefender by comparing OFFline AV-TEST with OFFline AV-Comparatives. Is it an irregularity (a big difference under similar conditions, both being OFFline) or a typo (meaning that AV-TEST is OFFline while AV-Comparatives is ONline)?
 

Andy Ful

Level 72
Verified
Trusted
Content Creator
Dec 23, 2014
6,122
You first mention offline and dynamic (online) detections. Then you highlight Bitdefender by comparing OFFline AV-TEST with OFFline AV-Comparatives. Is it an irregularity (a big difference under similar conditions, both being OFFline) or a typo (meaning that AV-TEST is OFFline while AV-Comparatives is ONline)?
My note: "But in the case of Bitdefender, the number of false positives (offline signatures) in the AV-TEST is significantly higher than in AV-Comparatives (offline detections selected)." clearly refers to offline detections.
For example, in the AV-Comparatives False Alarm Test the offline false positives were:
Bitdefender 2, Eset 2, Kaspersky 3, Microsoft 3.
The total false positives were as follows:
Bitdefender 6, Eset 2, Kaspersky 3, Microsoft Defender 21.
When comparing these results with the results of AV-TEST (offline system scan: Bitdefender 28, Kaspersky 2, Microsoft Defender 2, ESET 4), we can see the anomaly for Bitdefender.
 

Andy Ful

Level 72
Verified
Trusted
Content Creator
Dec 23, 2014
6,122
One can also notice strange results for Norton.
In the AV-TEST, the system scan of a few million files produced only 1 false positive. By contrast, in the AV-Comparatives test, Norton produced 41 false positives (out of about 1,000 total samples). All of these false positives occurred on access, without the files being executed. But this difference is related to the way Norton manages files downloaded from the internet: such files are handled by the Download Insight feature even if they are not executed. So in the AV-Comparatives False Alarm Test, Norton's offline detection is not tested, and the results cannot be compared with AV-TEST.
 

Andy Ful

Level 72
Verified
Trusted
Content Creator
Dec 23, 2014
6,122
I wasn't sure whether Norton is good these days, but I bought it today because of this test result, and the performance is incredible. Protection-wise, I believe the testers :D
This test cannot prove a difference in protection among the top 11 AVs (the differences are too small compared to the statistical errors). Also, the differences in performance among most of these 11 AVs are very small.
Anyway, you have chosen an AV that is a top performer in many tests by AV-TEST, AV-Comparatives, and SE Labs. :)
Norton's protection is based on file reputation in the cloud, so you can expect a higher number of false positives.
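The point about statistical error can be illustrated with a rough sketch. The miss counts and detection rates below are hypothetical (not from the test report): with ~40,000 samples per product and near-perfect detection, the number of missed samples is so small that ordinary sampling noise is comparable to the gap between products.

```python
import math

def miss_interval(misses: int) -> tuple[float, float]:
    """Approximate 95% interval for a count of missed samples,
    using the normal approximation to Poisson counting noise."""
    half = 1.96 * math.sqrt(misses)
    return (misses - half, misses + half)

n = 40_000  # samples per product in the endurance test

# Hypothetical products: 99.95% vs 99.90% detection -> 20 vs 40 misses
for misses in (20, 40):
    lo, hi = miss_interval(misses)
    rate = 100 * (1 - misses / n)
    print(f"{misses} misses ({rate:.2f}% detection): 95% interval ~ {lo:.0f}-{hi:.0f}")
```

The two intervals overlap, so even a 0.05-point gap in detection rate need not be statistically meaningful under these assumptions, quite apart from how the sample set was selected.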
 

mlnevese

Level 24
Verified
May 3, 2015
1,375
The protection difference between the top brands is insignificant. If you are going to choose any of them, make sure to test them for the full trial period and then use your computer normally. Pay attention to things like performance, problems connecting to some sites or hardware, and boot time, if that is something you consider important.

Always remember that something like RAM usage is not a criterion for labeling software heavy. System performance impact is.

Then choose whichever worked better for you, has a better price, and has an interface you find intuitive.
 

FireHammer

Level 10
Aug 27, 2020
451
Look carefully at the first table (Usability = false positives).:)(y)
A few million legitimate samples were tested during a system scan. Bitdefender got 28 false positives (the industry average was 25), while Norton had only 1, Kaspersky 2, Microsoft Defender 2, ESET 4, etc.
These false positives are related to offline signatures (not to dynamic detections). That is why there are differences between the AV-TEST and AV-Comparatives false-positive tests. But in the case of Bitdefender, the number of false positives (offline signatures) in the AV-TEST results is significantly higher than in AV-Comparatives (offline detections selected).
Thanks a lot @Andy Ful for that explanation.:) (y)
 