AVLab.pl - Advanced In The Wild Malware Test

Disclaimer
  1. This test shows how an antivirus behaves with certain threats, in a specific environment, and under certain conditions.
    We encourage you to compare these results with others and make informed decisions on which security products to use.
    Before buying an antivirus you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.

Andy Ful

From Hard_Configurator Tools
Thread author
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,040
A nice test for seeing when each AV reacts to malware, though it shows no big differences in overall blocking results (no compromised systems).
...
This test (and the previous tests) can show two things that cannot be concluded from standard AV tests:
  1. There are big differences between AVs in detecting/blocking malware at different levels (P1, P2, P3).
  2. It is hardly possible to infect a computer protected by a good AV when the malware is run with a slight delay.
Of course, no testing method is perfect. For example, other methods can be better at finding the differences between AVs in targeted attacks.
 
Last edited:

Adrian Ścibor

From AVLab.pl
Verified
Well-known
Apr 9, 2018
173
Did AVLab implement any method to skip polymorphic (or even metamorphic) variants? I think that different honeypots can often catch the same malware with different hashes.
Anyway, I can see that this would be hard to include in the testing procedure because of limited computing resources. :unsure:

Good point. But which methods could we possibly implement? For example, we could use Yara to scan every downloaded sample and then exclude samples from the same malware family. But then we would have a conflict between known samples and fresh samples - both would be excluded, e.g. Emotet malicious files (docx, xls, rtf...). Right now I do not know how we could limit samples from the same malware family.
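The family-level filtering being discussed could be sketched roughly as below. This is only an illustration of the dedup logic, not AVLab's procedure: the family labels stand in for whatever a Yara rule set would report, and the sample data is made up.

```python
def dedup_by_family(samples):
    """Keep only the first sample seen for each malware family.

    `samples` is a list of (sha256, family) pairs, where `family` is the
    label a Yara rule match would assign (e.g. "Emotet"). Samples with no
    matching rule (family is None) are always kept, since they may be
    genuinely fresh - this is exactly the known-vs-fresh conflict Adrian
    mentions: a strict family filter would also drop new Emotet variants.
    """
    seen_families = set()
    kept = []
    for sha256, family in samples:
        if family is None or family not in seen_families:
            kept.append(sha256)
            if family is not None:
                seen_families.add(family)
    return kept

# Hypothetical honeypot batch: two Emotet variants with different hashes.
samples = [
    ("aaa1", "Emotet"),
    ("bbb2", "Emotet"),      # same family -> excluded
    ("ccc3", "AgentTesla"),
    ("ddd4", None),          # no Yara match -> kept
]
print(dedup_by_family(samples))  # ['aaa1', 'ccc3', 'ddd4']
```

The sketch makes the trade-off concrete: collapsing a family to one representative removes polymorphic duplicates but also discards fresh variants of known families.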

Besides, all the malware SHAs are in fact unique, even though we have 50-100 Emotet samples. You can check this by verifying the table: Recent results

Any sample that has been tested once will never be tested again, because we check the database to see whether it has ever been tested before.
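The once-only policy described above amounts to hash-based deduplication against a history database. A minimal sketch, assuming a SQLite table of SHA-256 hashes (the schema is an illustration, not AVLab's actual setup):

```python
import hashlib
import sqlite3

def is_new_sample(db: sqlite3.Connection, data: bytes) -> bool:
    """Return True (and record the hash) only the first time a sample is seen."""
    sha256 = hashlib.sha256(data).hexdigest()
    row = db.execute("SELECT 1 FROM tested WHERE sha256 = ?", (sha256,)).fetchone()
    if row is not None:
        return False          # already tested once -> never test again
    db.execute("INSERT INTO tested (sha256) VALUES (?)", (sha256,))
    return True

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tested (sha256 TEXT PRIMARY KEY)")
print(is_new_sample(db, b"sample-bytes"))  # True  (first sighting)
print(is_new_sample(db, b"sample-bytes"))  # False (duplicate hash, skipped)
```

Note that this catches only byte-identical duplicates; polymorphic variants of the same family produce different hashes and pass through, which is the gap the Yara idea above is meant to address.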
 

Soulbound

Moderator
Verified
Staff Member
Well-known
Jan 14, 2015
1,761
I'm curious about the Webroot results. I am a Webroot user on one system myself, and I am even part of the beta testers. Webroot's strong points rely on reactive action (i.e., its famous rollback upon infection, on par with Kaspersky's rollback); however, Webroot is not known for handling relatively new infections/signatures. How old were the samples used in the test?
 

Andy Ful

From Hard_Configurator Tools
Thread author
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,040
Good point. But which methods could we possibly implement?
...
It is probably possible only by using a lot of resources and AI. It is possible that AV-Comparatives and MRG Effitas use some kind of filtering for polymorphic variants in their Real-World tests. They test only a few hundred samples per 3-4 months, so such malware would decrease the diversity of the sample set. AVLab uses many more samples, so malware polymorphism is probably not as important.
 
Last edited:

Digmor Crusher

Level 23
Verified
Top Poster
Well-known
Jan 27, 2018
1,237
I need to see a test of AV companion programs such as Malwarebytes, SpyShelter, AppGuard, VoodooShield, HitmanPro.Alert, Zemana, AppCheck, OSArmor, etc. We all know that most AVs provide close to the same level of protection, but many people want a second layer of protection; these programs provide that, and no one really knows how effective they are. This would be the most interesting test.
 

Andy Ful

From Hard_Configurator Tools
Thread author
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,040
Hi. You can check it yourself on the site: Recent results - verify the samples from the table on VirusTotal, for example. As for Webroot, if the rollback is activated we know it, and the result is captured in the system logs.
I assumed that the first sample was tested 1.07.2020 - it was submitted to VirusTotal on 30.06.2020. For sample no. 250, the first submission was also 30.06.2020.
For sample no. 269, the first submission was 4.07.2020.
For sample no. 450, the first submission was 7.07.2020.
For sample no. 750, the first submission was also 7.07.2020.
For sample no. 800, the first submission was 12.07.2020.
So roughly, the freshness of the samples varied from 0-day to a few days.
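The freshness estimate above is just the gap between the assumed test date and each sample's first VirusTotal submission. A trivial sketch using the dates quoted in the post (the per-sample test dates beyond 1.07.2020 are not stated, so only the first pair is from the post; the second call is a made-up same-day case):

```python
from datetime import date

def freshness_days(tested: date, first_submission: date) -> int:
    """Days between a sample's first VirusTotal submission and its test date."""
    return (tested - first_submission).days

# First sample: tested 1.07.2020, first submitted 30.06.2020 -> about 1 day old.
print(freshness_days(date(2020, 7, 1), date(2020, 6, 30)))   # 1
# A hypothetical same-day sample -> 0-day.
print(freshness_days(date(2020, 7, 12), date(2020, 7, 12)))  # 0
```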

It is probable that samples from similar honeypots are also gathered by AV analysts, so these samples are mostly detected by AVs within one day.
 
Last edited:
