AVLab.pl - Advanced In The Wild Malware Test

Disclaimer
  1. This test shows how an antivirus behaves with certain threats, in a specific environment and under certain conditions.
    We encourage you to compare these results with others and make informed decisions about which security products to use.
    Before buying an antivirus you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.

Andy Ful

AVLab tested 1456 malware samples gathered from honeypots around the world and from some publicly available malware resources (data for one month, July 2020).
The tested products:
  • Avast Free Antivirus
  • Bitdefender Total Security
  • Comodo Advanced Endpoint Protection
  • Comodo Internet Security
  • Emsisoft Business Security
  • G Data Total Security
  • Kaspersky Total Security
  • SecureAPlus Pro
  • Avira Antivirus Pro (one sample missed; see Edit2)
  • Webroot Antivirus (one sample missed; see Edit2)
Almost all products (see the Edit2 note) blocked all threats, but with different methods, as visible on the chart:

[Chart: AVLab072020.png - blocking levels (P1/P2/P3) per tested product]


P1 - sample was stopped by the web browser, or automatically right after it was downloaded to disk.
P2 - sample was not detected on download, but its execution was blocked.
P3 - sample was blocked after being executed.
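Expressed as a rough decision rule (my reading of the three levels, not AVLab's actual test harness):

```python
# Rough decision rule for the P1/P2/P3 levels described above
# (an interpretation for illustration, not AVLab's actual harness).
def blocking_level(blocked_on_download: bool,
                   blocked_before_execution: bool,
                   blocked_after_execution: bool) -> str:
    if blocked_on_download:
        return "P1"   # stopped by the browser or right after reaching disk
    if blocked_before_execution:
        return "P2"   # reached disk undetected, but its launch was blocked
    if blocked_after_execution:
        return "P3"   # it ran; behavior-based blocking stepped in afterwards
    return "missed"   # it ran to completion: the system was compromised

print(blocking_level(False, False, True))   # -> "P3"
```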

Products such as Comodo, Emsisoft, and SecureAPlus blocked most malware on execution (the P3 level). Other products blocked most samples at the P1 level. Only a very small number of samples were stopped at the P2 level (about 1% for Avast).

The tests were conducted on Windows 10 Pro x64. Due to the testing procedure, the malware samples were tested with some delay (one day or more), which was necessary to download the files from the honeypots and perform the initial analysis.
The AVLab testing procedure does not test URLs. Although Chrome was used, the samples were not downloaded from their original URLs, but from random URLs generated by a local DNS and HTTP/S server (roughly as sketched below). As the results show, this did not decrease the protection against the tested samples.
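The lab's server setup is not published, so purely as an illustration of the idea (all names and paths below are invented), a local HTTP server could expose each honeypot sample under a freshly generated random URL like this:

```python
# Hypothetical sketch of the "random local URL" idea; AVLab's actual
# lab infrastructure is not public, so names and paths are invented.
import http.server
import secrets
import shutil

SAMPLE = "sample.exe"             # a malware sample fetched from a honeypot
TOKEN = secrets.token_hex(8)      # random path component, e.g. "9f2c4a..."
URL_PATH = f"/{TOKEN}/{SAMPLE}"

class SampleHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == URL_PATH:
            self.send_response(200)
            self.send_header("Content-Type", "application/octet-stream")
            self.end_headers()
            with open(SAMPLE, "rb") as f:
                shutil.copyfileobj(f, self.wfile)   # stream the sample
        else:
            self.send_error(404)

server = http.server.HTTPServer(("127.0.0.1", 8080), SampleHandler)
print(f"Sample available at http://127.0.0.1:8080{URL_PATH}")
server.serve_forever()
# A local DNS server would then point a fake domain at this machine,
# so the browser "downloads" the sample over the network as usual.
```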

It is probable that the delay between a file landing in a honeypot and the moment of testing caused the excellent protection results of the tested products.


Edit1.
Added source links.
https://checklab.pl/en/recent-results
https://checklab.pl/en/publications...ing-checklab-website-dedicated-security-tests
https://checklab.pl/en/methodology

Edit2.
There is an inconsistency in the presented results. The Polish article shows that Avira and Webroot missed one sample. The charts (Polish and English versions) show that these AVs did not miss any malware.

Andy Ful

Why didn't they test Defender too? :(
This testing procedure does not preserve the Mark of the Web (MOTW) of files in the way it is preserved on real machines. Defender's detection depends on the real MOTW, which is added when the file is downloaded from a real URL. There is also controversy about which web browser should be used (Edge Chromium or Chrome).
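For context, on NTFS the MOTW is just a small "Zone.Identifier" alternate data stream attached to the downloaded file. A minimal sketch (Windows-only; the file name is hypothetical) of what a browser writes and how to inspect it:

```python
# Minimal MOTW sketch (Windows/NTFS only; the file name is hypothetical).
# Browsers store the Mark of the Web in a "Zone.Identifier" alternate
# data stream; ZoneId=3 means "Internet zone".
path = r"C:\Users\test\Downloads\sample.exe"

motw = (
    "[ZoneTransfer]\r\n"
    "ZoneId=3\r\n"
    "ReferrerUrl=https://example.com/\r\n"
    "HostUrl=https://example.com/sample.exe\r\n"
)

# Write the stream the way a browser would:
with open(path + ":Zone.Identifier", "w") as f:
    f.write(motw)

# Read it back to verify the file carries MOTW:
with open(path + ":Zone.Identifier") as f:
    print(f.read())
```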
 

Andy Ful

Here's the English version.
Thanks. I added the link to the OP, along with an English version of the testing methodology.
There is an inconsistency in the presented results. The Polish article shows that Avira and Webroot missed one sample. The charts (Polish and English versions) show that these AVs did not miss any malware. :unsure:

Adrian Ścibor

The Polish article shows that Avira and Webroot missed one sample. The charts (Polish and English versions) show that these AVs did not miss any malware. :unsure:

Hello guys. I know about it. Sorry for that. The charting technology we have used is limited, so we are working on changing the charts to a different results visualization. It is difficult to show 0.01% on such a small bar graph.
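For what it is worth, one generic workaround (just an illustration, not AVLab's charting stack) is to zoom the value axis instead of starting the bars at zero, so a near-100% score stays distinguishable from a perfect one:

```python
# Generic illustration (not AVLab's charting code): zooming the y-axis
# makes a fraction-of-a-percent difference visible on a bar chart.
import matplotlib.pyplot as plt

# Illustrative numbers only: 1 missed sample out of 1456 is ~99.93% blocked.
products = ["Avast", "Avira", "Webroot"]
blocked_pct = [100.0, 99.93, 99.93]

fig, ax = plt.subplots()
ax.bar(products, blocked_pct)
ax.set_ylim(99.5, 100.05)   # zoom in instead of starting the bars at 0
ax.set_ylabel("Blocked samples [%]")
ax.set_title("Zoomed axis keeps near-100% scores distinguishable")
plt.show()
```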

Andy Ful

Hi Adrian,
I think that the current procedure would also be fine for testing Microsoft Defender. If I understood the AVLab test technology correctly, the local DNS and HTTP/S server plus Chrome can make Defender think that the files are downloaded from the Internet. In this way, Microsoft Defender's Block At First Sight (BAFS) is triggered (the file is checked at the P1 level).
Although using random (not real) URLs can have some impact on detection, this impact will be common to most AVs that use behavior-based detections.
Also, the problem of choosing the web browser (Edge or Chrome) should not be important, because the URLs are not real ones. (y)

Adrian Ścibor

Hi Adrian,
I think that the current procedure would also be fine for testing Microsoft Defender. If I understood the AVLab test technology correctly, the local DNS and HTTP/S server plus Chrome can make Defender think that the files are downloaded from the Internet. In this way, Microsoft Defender's Block At First Sight (BAFS) is triggered (the file is checked at the P1 level).
Although using random (not real) URLs can have some impact on detection, this impact will be common to most AVs that use behavior-based detections.
Also, the problem of choosing the web browser (Edge or Chrome) should not be important, because the URLs are not real ones. (y)
Indeed. You can see how it works and how both procedures are built ("sandbox" [verifying samples for testing] and "testing" [AV testing]) in two videos: How we test antivirus? The making of CheckLab, a website dedicated to security tests
 

Bonorex

A nice test to see when each AV reacts to malware, but it shows no big differences in the overall blocking results (no compromised systems). What I hope to see is an updated test in 2020 on the online banking protection of consumer AVs, like AVLab did in 2019:


That was a really original and instructive series of tests carried out between different AVs, where very few products got 100% scores and each one showed its vulnerabilities. It was something different from the standard AV-Test and AV-Comparatives charts.
 

Andy Ful

Indeed. You can see how it works and how both procedures are built ("sandbox" [verifying samples for testing] and "testing" [AV testing]) in two videos: How we test antivirus? The making of CheckLab, a website dedicated to security tests
Did AVLab implement any method to skip polymorphic (or even metamorphic) variants? I think that different honeypots can often catch the same malware with different hashes.
Anyway, I see that this would be hard to include in the testing procedure because of limited computing resources. :unsure:
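AVLab's methodology page does not describe duplicate filtering, but a naive pipeline would deduplicate honeypot catches by cryptographic hash, which is exactly what polymorphic variants defeat, since every variant hashes differently. A hypothetical sketch:

```python
# Hypothetical dedup sketch; AVLab's real pipeline is not public.
# Exact hashing removes byte-identical duplicates from honeypot feeds,
# but polymorphic/metamorphic variants of one family all hash
# differently, so they would pass through as "unique" samples.
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

seen: set[str] = set()
unique = []
for sample in Path("honeypot_samples").glob("*"):
    if not sample.is_file():
        continue
    digest = sha256(sample)
    if digest not in seen:      # only catches byte-identical duplicates
        seen.add(digest)
        unique.append(sample)

print(f"{len(unique)} unique-by-hash samples kept")
# Grouping polymorphic variants would need similarity hashing instead
# (e.g. ssdeep or TLSH), plus a cutoff score to merge near-duplicates.
```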
 
