AV-Comparatives Consumer Real-World Protection Test July-October 2020

Disclaimer
  1. This test shows how an antivirus behaves with certain threats, in a specific environment and under certain conditions.
    We encourage you to compare these results with others and make informed decisions on what security products to use.
    Before buying an antivirus you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.

Gandalf_The_Grey

Level 51
Verified
Trusted
Content Creator
Apr 24, 2016
4,006
The second half-year results of the ongoing Consumer Real-World Protection Test are now available!

Real-World Protection Test July-October 2020 - AV-Comparatives

Highest and lowest protection rates revealed for 17 popular Windows antivirus programs

Independent, ISO-certified security testing lab AV-Comparatives has released the results of its 2020 H2 Consumer Real-World Protection Test. Popular anti-malware programs for Microsoft Windows from 17 different vendors were put through their paces. These were (alphabetically): Avast, AVG, Avira, Bitdefender, ESET, F-Secure, G Data, K7, Kaspersky, McAfee, Microsoft, NortonLifeLock, Panda, Total AV, Total Defense, Trend Micro and Vipre.

The tested products had to defend against over 750 recent and prevalent malicious programs, downloaded from the Internet. They were assigned one of four possible award categories, according to how well they performed in the test. From lowest to highest, the categories are: Tested, Standard, Advanced, and Advanced+.

The Consumer Real-World Protection Test checks each security product’s ability to protect a PC against threats originating on the Internet. The tested products can use all of their protection mechanisms, such as URL blockers, reputation services, cloud signatures and behavioural detection.

To ensure that the anti-malware programs do not plague the user with false alarms, a false-positives test is included in the Real-World Protection Test. This involves downloading and executing harmless program files, and browsing to harmless websites, to check if any of these are incorrectly identified as malicious. Products with above-average levels of false alarms have their award level downgraded.

The Consumer Real-World Protection Test is part of AV-Comparatives’ Main Test Series for consumer products. The Test Series also includes the Malware Protection Test and the Performance Test. The former checks the ability of antivirus products to protect against malicious files already on the system, or coming from an external drive or over the local area network. The Performance Test checks what effect each product has on the speed of the PC. Together with the Real-World Protection Test, these tests provide an all-round view of a security product’s technical capabilities.

Like all AV-Comparatives’ public reports, the report of the 2020 Consumer Real-World Protection Test can be downloaded free of charge from the institute’s website, www.av-comparatives.org.  AV-Comparatives is an independent testing lab based in Innsbruck, Austria, and has been publicly testing computer security software since 2004. It is ISO 9001:2015 certified for the scope “Independent Tests of Anti-Virus Software”. It also holds the EICAR certification as a “Trusted IT-Security Testing Lab”.
 

Back3

Level 9
Apr 14, 2019
419
In real life, most MalwareTips users have security extensions, and that's where they will get their false positives. Malwarebytes Browser Guard is very aggressive on Chrome when its malware, scam, and PUP protections are on. I get my false positives there, not at the antivirus level. And many people also have a VPN that includes malware protection...
 

Freud2004

Level 9
Jun 26, 2020
427
:censored: F-Secure: 44 false positives!!!

:oops: Kaspersky: 2 false positives

In summary, F-Secure blocks everything that "moves"...

And the removal rate? Very good, my AV blocks a lot, but what about the removal rate? My house is on fire, but the firemen have no water to put it out.
For me, an AV is a balance between detection and disinfection.


This is my opinion.
 

JoyousBudweiser

Level 12
Verified
Aug 22, 2013
586
Freud2004 said: "F-Secure 44 false positives... For me, an AV is a balance between detection and disinfection."
If you are seriously infected, better to reinstall the OS; you can't 100% trust any AV with malware removal.
 

ForgottenSeer 89360

Freud2004 said: "F-Secure 44 false positives... For me, an AV is a balance between detection and disinfection."
When testing these products, the lab has software installed that looks for IoCs (Indicators of Compromise). Everything that was removed properly is counted as "Blocked"; everything else is counted as "Compromised".
This is not a test where they scan a severely infected system, so removal is not that relevant. It is a test where, on a clean machine, they start downloading and executing malware.
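The blocked/compromised decision described above can be sketched as a tiny post-run check. This is a hypothetical illustration of the idea, not AV-Comparatives' actual harness; the function name, paths and IoC examples are invented:

```python
# Hypothetical sketch: after letting the product react, scan the machine
# for known indicators of compromise (IoCs) and count the case as
# "Blocked" only if none of them survived.

def classify_case(remaining_artifacts, ioc_list):
    """Return 'Blocked' if no known IoC survived, else 'Compromised'.

    remaining_artifacts -- set of file paths / registry keys / processes
                           observed on the machine after the test run
    ioc_list            -- set of artifacts known to indicate infection
    """
    hits = remaining_artifacts & ioc_list
    return "Compromised" if hits else "Blocked"

# Invented example: the dropped payload was removed, so the case counts
# as blocked; if it persists, the case counts as compromised.
iocs = {
    r"C:\Users\test\AppData\payload.exe",
    r"HKCU\Software\Microsoft\Windows\CurrentVersion\Run\payload",
}
print(classify_case({r"C:\Windows\notepad.exe"}, iocs))               # Blocked
print(classify_case({r"C:\Users\test\AppData\payload.exe"}, iocs))   # Compromised
```

On this model, a product that quarantines the payload late (e.g. via a delayed cloud verdict) still scores "Blocked", as long as the IoC scan runs after the cleanup.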
 

Andy Ful

Level 72
Verified
Trusted
Content Creator
Dec 23, 2014
6,166
AV-Comparatives Real-World July 2018 - October 2020
https://www.av-comparatives.org/tests/real-world-protection-test-july-november-2018/
https://www.av-comparatives.org/tests/real-world-protection-test-february-may-2019/
https://www.av-comparatives.org/tests/real-world-protection-test-july-october-2019/
https://www.av-comparatives.org/tests/real-world-protection-test-february-may-2020/
https://www.av-comparatives.org/tests/real-world-protection-test-july-october-2020/


--------------------------------Compromised-------------------------Clusters
TrendMicro.................1+3+1+0+0.............=5..........1+1+1+1+1
F-Secure......................1+0+5+0+0.............=6..........1+1+1+1+1
Symantec (Norton)..(7)+1(2)+0+0+1(4)...=8.5.......1+1+1+1+1
Avira............................2+0+0+3+5.............=10........1+1+1+1+1
Bitdefender.................1+1+2+5+3.............=12........1+1+1+1+1
Kaspersky Lab............5+0+6+1+2.............=14........1+1+1+1+1
Microsoft.............1(8)+(3)+(5)+2+4..........=15........1+1+1+1+1
Panda..........................8+4+3+1+0.............=16........1+1+1+1+1
Avast/AVG..................4+6+5+2+2..............=19.......1+2+1+1+1
Vipre............................5+2+2+7+3.............=19........1+1+1+2+1
K7..........................6(2)+4(3)+2(3)+4+3.....=23........1+2+1+1+1
McAfee.......................6+6+4+8+11............=35........1+2+1+2+2
ESET.........................11+12+11+4+5..........=43........2+3+2+1+1

Here is what AV-Comparatives say about the importance of clusters:
"Our tests use much more test cases (samples) per product and month than any similar test performed by other testing labs. Because of the higher statistical significance this achieves, we consider all the products in each results cluster to be equally effective, assuming that they have a false-positives rate below the industry average."
Real-World Protection Test Methodology - AV-Comparatives (av-comparatives.org)
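The cluster idea quoted above - treating products with statistically indistinguishable results as equal - can be illustrated with a toy grouping rule. The 0.5-point margin and the scores below are invented for the sketch; the lab's real clustering rests on proper significance testing, not a fixed threshold:

```python
# Toy illustration of result clusters: products whose protection rates
# sit within a chosen margin of their cluster's leader share a cluster.
# The margin (0.5 points) and the demo scores are invented.

def cluster(scores, margin=0.5):
    """Group {name: rate} into clusters, ordered best-first."""
    ordered = sorted(scores.items(), key=lambda kv: -kv[1])
    clusters, leader_rate = [], None
    for name, rate in ordered:
        if leader_rate is None or leader_rate - rate > margin:
            clusters.append([])          # start a new cluster
            leader_rate = rate           # this product leads it
        clusters[-1].append(name)
    return clusters

demo = {"A": 99.9, "B": 99.7, "C": 99.0, "D": 98.9}
print(cluster(demo))  # [['A', 'B'], ['C', 'D']]
```

Within each cluster, small differences in the raw protection rate carry no ranking information - which is the point the quotation makes.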

Together with other cumulative results, the above scores form something close to a random pattern.
https://malwaretips.com/threads/microsoft-defender-pros-and-cons-november-2020.105069/post-913427
https://malwaretips.com/threads/microsoft-defender-pros-and-cons-november-2020.105069/post-914023

Avast/AVG can be at the top and at the bottom. Trend Micro can be at the top and at the bottom (by a wide margin). ESET can be average and last. Kaspersky can be at the top and average, etc. Only Symantec is consistently better than Microsoft?
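This shuffling can be probed with a quick rank-correlation check on the numbers in the table above. A rough sketch only: it uses eight of the products whose rows carry no parenthesised extras, ignores the cluster information, and hand-rolls Spearman's correlation; a value near 1 would mean a stable product ordering across periods, while a low value supports the near-randomness view:

```python
# Consistency check of the per-period "compromised" counts quoted above.
# Spearman rank correlation between the first (2018 H2) and last
# (2020 H2) test period, for eight products with unambiguous rows.

def ranks(values):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1            # average of tied positions
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# TrendMicro, F-Secure, Avira, Bitdefender, Panda, ESET, McAfee, Vipre
first = [1, 1, 2, 1, 8, 11, 6, 5]    # compromised, 2018 H2
last  = [0, 0, 5, 3, 0, 5, 11, 3]    # compromised, 2020 H2
print(round(spearman(first, last), 2))   # well below 1
```

The correlation comes out far from 1, i.e. the ordering of products by misses changes considerably between periods, which is consistent with the point above.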

Conclusion.
These tests (AV-Comparatives Real-World, AV-Comparatives Malware Protection, AV-Test, SE Labs) cannot precisely measure anti-malware protection in the wild for the home environment. The differences among most AVs are probably too small.

Post edited (small correction for Kaspersky).
 

Andy Ful

Level 72
Verified
Trusted
Content Creator
Dec 23, 2014
6,166
Considering some of them might react in 15 minutes, like Avast with its CyberCapture as I've shown here: User Feedback - Avast Premium Security 20.9
Not sure how their automated testing kit reacts to that.
The CyberCapture feature works only for EXE files with MOTW. It is intended to protect the installers downloaded by the user from the Internet. This feature will not block payloads downloaded by malware.
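For context, the MOTW mentioned here is the NTFS "Zone.Identifier" alternate data stream that Windows attaches to downloaded files. A minimal sketch of recovering the zone from that stream's text (the parsing helper is hypothetical; on Windows the stream itself would be opened as path + ":Zone.Identifier"):

```python
# The Mark of the Web (MOTW) is a small INI-style "Zone.Identifier"
# alternate data stream. ZoneId=3 ("Internet") is what marks a file as
# downloaded from the web. This sketch parses the stream's text only.

ZONES = {
    "0": "Local machine", "1": "Local intranet", "2": "Trusted sites",
    "3": "Internet", "4": "Restricted sites",
}

def parse_motw(stream_text):
    """Return the zone name from Zone.Identifier text, or None."""
    for line in stream_text.splitlines():
        key, _, value = line.partition("=")
        if key.strip() == "ZoneId":
            return ZONES.get(value.strip())
    return None

sample = "[ZoneTransfer]\r\nZoneId=3\r\nHostUrl=https://example.com/a.exe\r\n"
print(parse_motw(sample))  # Internet
```

A file whose stream parses to "Internet" is the case the post is about: only then would a MOTW-gated feature like CyberCapture consider it.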
 

ForgottenSeer 89360

The CyberCapture feature works only for EXE files with MOTW.
It covers scripts, screensaver files and dynamic link libraries amongst others, but the MOTW is required. If the file was in an archive that was extracted through Explorer rather than a third-party archiver, the MOTW apparently gets preserved, as the file shown in my post was originally archived.
Considering that their testing kit downloads everything from the web, as stated in their methodology, all the samples fulfil the CyberCapture requirements.
It's possible that they count something as compromised before the CyberCapture verdict is available. That goes for Microsoft too.

These results, apart from showing randomness and minimal difference, might not even be accurate.

This feature will not block payloads downloaded by malware.
There will be no payload to download while the original executable is held for automatic analysis, and after the analysis the payload would already be known to Avast. The presence of the downloaded file might be counted as a "miss" by a toolkit that isn't smart enough to recognise the CyberCapture alert. The way to bypass CyberCapture is with weaponised documents, but I don't believe their bot can open an MS Office document, then click "Enable Editing" at the top and then "Allow Content". It's most likely tested only against EXE, BAT and PS1 files. This is where all these brilliant results come from.
 

Andy Ful

Level 72
Verified
Trusted
Content Creator
Dec 23, 2014
6,166
It covers scripts, screensaver files and dynamic link libraries amongst others,
Are you sure about scripts and DLLs? In my tests (several months ago), only EXE (and maybe COM and SCR - I don't remember precisely) were checked by CyberCapture. I tested an EXE which triggered CyberCapture. Next, I changed a byte (usually a letter in some text) and the file extension. After adding the MOTW, I tried to execute the modified file. For DLLs, I used rundll32.exe and the DLL hijacking method.
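The single-byte change described above is enough to defeat any detection keyed to an exact file hash. A minimal sketch of why (the file contents are invented stand-ins for a real executable image):

```python
# Flipping one bit of one byte changes the cryptographic hash entirely,
# so any signature or cloud-reputation lookup keyed on the exact SHA-256
# no longer matches the modified sample.

import hashlib

original = bytes.fromhex("4d5a90000300") + b"hello payload"  # invented bytes
modified = bytearray(original)
modified[-1] ^= 0x01                      # flip one bit in the last byte

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(modified)).hexdigest()
print(h1 != h2)    # True: hash-based reputation now sees an unknown file
```

This is why the test described above isolates behavioural/heuristic detection: the modified file can no longer be recognised by its hash alone.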
 

ForgottenSeer 89360

Are you sure about scripts and DLLs? In my tests (several months ago) only EXE (and maybe COM, SCR - do not remember precisely) were checked by CyberCapture. I tested the EXE which triggered CyberCapture. Next, I changed a byte (usually the letter in some text) and the file extension. After adding the MOTW I tried to execute the modified file.
It did upload some *.scr, *.dll and *.bat files. I am not sure if this is something new. It also comes up with a verdict very quickly. I haven't seen it uploading a *.ps1 file, but PowerShell will mostly be invoked through CMD to bypass the execution policy, so I don't believe those can be an issue as long as *.bat files are checked.
I also don't believe the AV-Comparatives bot is using rundll32.exe, so *.dll files don't matter either.

In my opinion, the only thing they download is *.exe files, and that's why the detections are always so great, with minimal differences. I can support this assumption with the fact that 100% detection is achieved only by three heavily reputation-based antiviruses; vendors normally don't collect reputation data for scripts. This might explain the outstanding Norton results, as Symantec developed this reputation method a long time ago, and everything questionable just goes away immediately. This has caused countless complaints from software developers. I can also support the assumption by looking at Total Defense - it is a full rebrand of Bitdefender with all their detection modules; the only thing they don't have access to is the Bitdefender cloud. We can see they have a notably lower score than Bitdefender, so the difference comes from reputation analysis in the cloud. Bitdefender only performs this analysis on executables.

If I am right about the *.exe files, then Defender, as well as Avast, might block them in 10-15 minutes, but by that time the bot has already counted them as missed.
It might also be trained to look just for IoCs; in that case, as the file never executed, the results might be accurate.
There is also malware targeting only specific regions; this might be counted as "blocked" in the test but missed in the wild.

That's from AV-Comparatives: Real-World Protection Test Methodology - AV-Comparatives

We aim to use visible and relevant malicious websites/malware that are currently out there and present a risk to ordinary users. We try to include as many in-the-wild exploits as possible; these are usually well covered by almost all major security products, which may be one reason why the scores look relatively high, beside the fact that on a fully patched Windows system there are not many different exploits online to test against. The rest are URLs that point directly to malware executables; this causes the malware file to be downloaded, thus replicating a scenario in which the user is tricked by social engineering into following links in spam mails or websites, or installing a Trojan or other malicious software.

We use our own crawling system to search continuously for malicious sites and extract malicious URLs (including spammed malicious links). We also search manually for malicious URLs. In the rare event that our in-house methods do not find enough valid malicious URLs on one day, we have contracted some external researchers to provide additional malicious URLs (initially for the exclusive use of AV-Comparatives) and look for additional (re)sources.

It would make no sense to download anything other than executables, as social engineering normally relies on them. Other file types might be used in more sophisticated, targeted attacks, but they say they look for prevalent malware.
 