AV-Comparatives Business Security Test March-April 2021 – Factsheet

Disclaimer
This test shows how an antivirus behaves with certain threats, in a specific environment and under certain conditions. We encourage you to compare these results with others and make informed decisions on which security products to use. Before buying an antivirus, you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.

Gandalf_The_Grey

Level 83
Thread author
Verified
Honorary Member
Top Poster
Content Creator
Well-known
Apr 24, 2016
7,249
Introduction
This is a short fact sheet for our Business Main-Test Series, containing the results of the Business Malware Protection Test (March) and Business Real-World Protection Test (March-April). The full report, including the Performance Test and product reviews, will be released in July.

To be certified in July 2021 as an “Approved Business Product” by AV-Comparatives, the tested products must score at least 90% in the Malware Protection Test with zero false alarms on common business software, and at least 90% in the overall Real-World Protection Test (i.e. over the course of four months), with less than one hundred false alarms on any clean software/websites (and with zero false alarms on common business software). Tested products must also avoid major performance issues (impact score must be below 40) and have fixed all reported bugs in order to gain certification.
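
The certification criteria above combine several independent thresholds. The following Python sketch is purely illustrative (the field names are assumptions, not AV-Comparatives terminology) and simply restates those thresholds as one check:

```python
# Minimal sketch of the "Approved Business Product" criteria described above.
# Field names are illustrative assumptions, not official AV-Comparatives terms.
from dataclasses import dataclass

@dataclass
class BusinessTestResult:
    malware_protection_rate: float      # e.g. 0.997 = 99.7%
    malware_fp_business: int            # false alarms on common business software
    real_world_protection_rate: float   # overall rate over the four-month period
    real_world_fp_clean: int            # false alarms on clean software/websites
    real_world_fp_business: int         # false alarms on common business software
    performance_impact_score: float     # lower is better
    open_bugs: int                      # reported bugs not yet fixed

def qualifies_for_approval(r: BusinessTestResult) -> bool:
    """Rough restatement of the certification thresholds quoted above."""
    return (
        r.malware_protection_rate >= 0.90
        and r.malware_fp_business == 0
        and r.real_world_protection_rate >= 0.90
        and r.real_world_fp_clean < 100
        and r.real_world_fp_business == 0
        and r.performance_impact_score < 40
        and r.open_bugs == 0
    )

# Example with hypothetical numbers:
print(qualifies_for_approval(BusinessTestResult(0.998, 0, 0.994, 3, 0, 12.5, 0)))  # True
```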

Please note that the results of the Business Main-Test Series cannot be compared with the results of the Consumer Main-Test Series, as the tests are done at different times, with different sets, different settings, etc.
In business environments, and with business products in general, it is usual for products to be configured by the system administrator, in accordance with the vendor's guidelines, so we invited all vendors to configure their respective products.

Only a few vendors provide their products with optimal default settings that are ready to use, and those therefore did not change any settings. Cloud and PUA detection have been activated in all products. We currently do not include any PUA in our malware tests.

Please keep in mind that the results reached in the Enterprise Main-Test Series were only achieved by applying the respective product configurations described here. Any setting listed here as enabled might be disabled in your environment, and vice versa; this influences the protection rates, false-alarm rates and system impact. The applied settings are used across all our Enterprise Tests over the year. That is to say, we do not allow a vendor to change settings depending on the test. Otherwise, vendors could e.g. configure their respective products for maximum protection in the protection tests (which would reduce performance and increase false alarms), and for maximum speed in the performance tests (thus reducing protection and false alarms). Please note that some enterprise products have all their protection features disabled by default, so the admin has to configure the product to get any protection.
 

Kongo

Level 36
Verified
Top Poster
Well-known
Feb 25, 2017
2,585
I still wonder how much Panda pays them to get such a score... 😅
 

Kongo

Level 36
Verified
Top Poster
Well-known
Feb 25, 2017
2,585
Further proof that you shouldn't believe anything these "tests" are saying, especially the fact that PUPs that use nothing but the Avira engine can have an 80%+ offline detection ratio.
Actually, it's no proof at all, but it just seems strange that those tests differ so much from other tests, like those here in the Hub or on YouTube. More details about the samples used in those tests would help a lot.
 

Kongo

Level 36
Verified
Top Poster
Well-known
Feb 25, 2017
2,585
I know what you mean, but it is Panda Endpoint Protection Plus, so that can differ substantially from their consumer offering.
Like AV-Comparatives said, those results are not comparable.
That's true, but how do you explain this result from the Malware Protection Test of March:


In all the tests I've seen and done myself, Panda's behavioural component was pretty much absent; in this test, however, it seems to have caught quite a significant number of samples with a proactive component. (Free version)
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,510
That's true, but how do you explain this result from the Malware Protection Test of March:

This test was for Consumer AV versions (not Business AV versions).
 

Kongo

Level 36
Verified
Top Poster
Well-known
Feb 25, 2017
2,585
This test was for Consumer AV versions (not Business AV versions).
I know, it was an answer to @Gandalf's statement here:
I know what you mean, but it is Panda Endpoint Protection Plus, so that can differ substantially from their consumer offering.
Like AV-Comparatives said, those results are not comparable.
Even the free version scores great in the consumer test. Even though their behavioural component appeared weak, it somehow managed to catch many samples dynamically.
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,510
The last 6 AV-Comparatives Business tests (including March-April 2021) for popular AVs:

Real-World (1207+732+844+767+801+373 samples), missed samples:
Bitdefender: 0+0+0+4+2+0 = 6
Kaspersky: 3+0+4+1+1+2 = 11
Panda: 5+4+0+0+1+1 = 11
Microsoft: (10)+(2)+(7)+2+2+0 = 13.5
VIPRE: 3+1+4+4+2+1 = 15
Avast: 4+4+12+1+1+(1) = 22.5
ESET: 13+8+4+3+2+3 = 33
Sophos: ?+2+4+4+14+(2)8 = *36.5 (first scoring averaged)


Malware Protection (1556+1311+1278+1192+1603+1008 samples), missed samples:
Avast: 0+1+0+2+0+0 = 3
Bitdefender: 0+1+3+1+2+0 = 7
Microsoft: 2+7+1+0+0+4 = 14
Panda: 2+1+1+1+13+1 = 19
VIPRE: 0+10+4+2+6+1 = 23
Sophos: ?+1+3+7+5+4 = *24 (first scoring averaged)
Kaspersky: 3+13+8+6+5+4 = 39
ESET: 2+20+9+1+2+7 = 41
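
To make the arithmetic explicit, here is a short Python sketch of how such totals can be reproduced. The counting rules are inferred from the figures above rather than taken from any official source: values in parentheses (user-dependent results) appear to be counted as half, and a product's missing first score ("?") is replaced by the average of its known scores.

```python
# Sketch for reproducing the cumulative "missed samples" totals above.
# Counting rules are inferred, not official: a value shown in parentheses
# (a user-dependent result) is counted as 0.5, and a missing score is
# replaced by the average of the product's known scores.

HALF = 0.5  # weight applied to user-dependent results (parenthesised values)

def test_score(entry):
    """entry = (fully_missed, user_dependent) for one test."""
    missed, user_dependent = entry
    return missed + HALF * user_dependent

def total_missed(per_test):
    """per_test: one entry per test; None marks a test the product did not take."""
    known = [test_score(e) for e in per_test if e is not None]
    average = sum(known) / len(known)  # substitute for any missing score
    return sum(average if e is None else test_score(e) for e in per_test)

# Rows taken from the tables above (Real-World and Malware Protection):
microsoft_rw = [(0, 10), (0, 2), (0, 7), (2, 0), (2, 0), (0, 0)]
avast_rw     = [(4, 0), (4, 0), (12, 0), (1, 0), (1, 0), (0, 1)]
sophos_mp    = [None, (1, 0), (3, 0), (7, 0), (5, 0), (4, 0)]

print(total_missed(microsoft_rw))  # 13.5
print(total_missed(avast_rw))      # 22.5
print(total_missed(sophos_mp))     # 24.0
```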

Tested products and versions:
Avast Business Antivirus Pro Plus 20.10 / 21.2
Bitdefender GravityZone Elite 6.6 / 6.6
ESET PROTECT Entry with ESET PROTECT Cloud 8.0 / 8.0
Kaspersky Endpoint Security for Business – Select, with KSC 11.5 / 11.5
Malwarebytes EDR 1.2 / 1.2
Microsoft Defender & Microsoft Endpoint Manager 4.18 / 4.18 (+ Windows Defender Browser Protection)
Panda Endpoint Protection Plus on Aether 8.0 / 8.0
Sophos Intercept X Advanced 10.8 / 10.8
Vipre Endpoint Cloud 12.0 / 12.0

Edit.
The Malware Protection tests are less important, because most of the attacks in the wild involving these samples are already prevented by AVs before the sample can be executed (network protection + web protection + email protection, etc.).
 
Last edited:

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,510
I know, it was an answer to @Gandalf's statement here:

Even the free version scores great in the consumer test. Even though their behavioural component appeared weak, it somehow managed to catch many samples dynamically.
Understood. Anyway, comparing two single tests is not especially informative, bearing in mind the considerable randomness of the results. :unsure:
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,510
Actually, it's no proof at all, but it just seems strange that those tests differ so much from other tests, like those here in the Hub or on YouTube. More details about the samples used in those tests would help a lot.
Yes, these tests cannot be compared with MH tests.
  1. The MH tests (not the URL-based ones) include more samples never seen by AVs (they are slightly fresher).
  2. In MH tests, many samples are payloads, but in the wild, in some cases, the AVs can prevent the attack before the payload can be downloaded and executed.
  3. In MH tests it is hard to compare AVs, because they are not tested at the same time; nowadays, detection can differ significantly after a few hours.
So, some MH samples are only a hypothetical danger for particular AVs. The MH tests are very good at finding possible weak points in AV protection, but it is hard to conclude how important these weak points would be in the wild during the tested period.
 
Last edited:

silversurfer

Super Moderator
Verified
Top Poster
Staff Member
Malware Hunter
Aug 17, 2014
11,101
So, some MH samples are only a hypothetical danger for particular AVs. The MH tests are very good at finding possible weak points in AV protection, but it is hard to conclude how important these weak points would be in the wild during the tested period.

How would anyone know whether certain samples from MH are really dangerous for certain AVs? It's impossible.

Speculation doesn't help even if it turns out to be true, but in forums a lot of people present personal opinions as facts ;)

As for the prevalence of samples in the wild, whether a sample really is prevalent or not, only real experts know for sure!
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,510
Due to the unavoidable randomness of the test results and the diversity of attack methods, the test scores cannot reflect the real AV protection in the wild (it is impossible). One can try to compile cumulative statistics to reduce the randomness (like I did), but only with limited success. Simply put, many tests are required for that, and there is no perfect testing method. MH, AV-Test, AV-Comparatives, SE Labs, AVLab, MRG Effitas, etc. use different testing methods, and probably any known method has some limitations in showing the real protection strength. (y)
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,510
As an example of the problem, we can take the perfect results of the test from the PC Security Channel:



The tested product was Defender on MAX settings (all ASR rules enabled). The test was performed on 1764 EXE samples, and all of them were blocked. Many people think that such tests can reflect real AV protection. But in fact, the differences between popular AVs in the detection of EXE files are very small, and this test skipped the most important security layers (web protection, email protection) and totally skipped scripting malware. So, after watching this video, one still has no idea of Defender's real protection in the wild.

That is why I do not take YouTube tests into account when discussing AV testing. Such tests are only a demonstration of an AV's capabilities.
 

SeriousHoax

Level 49
Verified
Top Poster
Well-known
Mar 16, 2019
3,862
In MH tests, many samples are payloads, but in the wild, in some cases, the AVs can prevent the attack before the payload can be downloaded and executed.
This is a very important point that everyone should keep in mind. Many samples, especially the scripts, are mostly payloads, and as you said, many products will stop many of these payloads from being downloaded and executed. To give an example, Norton has a feature called "Norton Script Control" which by default removes all embedded scripts from Office documents, email attachments, etc. Emsisoft's behavior blocker is also tuned to block Office applications from creating child processes, launching executables, etc., similar to some of Microsoft Defender's ASR rules. Many other AVs do this. This is just one example of how these types of features can prevent infection in a real-world scenario, something that may not be reproducible when solely executing the payloads.
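
As a rough illustration of what such a rule amounts to, here is a conceptual Python sketch in the spirit of the Emsisoft and ASR-style behaviour described above (not Norton's, Emsisoft's, or Microsoft's actual implementation; the process names and policy are illustrative assumptions): the check boils down to inspecting the parent/child process pair.

```python
# Conceptual sketch only: a simplified "Office must not spawn script hosts or
# shells" rule, similar in spirit to behaviour-blocker and ASR-style policies.
# Process names and the policy itself are illustrative assumptions.

OFFICE_APPS = {"winword.exe", "excel.exe", "powerpnt.exe", "outlook.exe"}
BLOCKED_CHILDREN = {"cmd.exe", "powershell.exe", "wscript.exe", "cscript.exe", "mshta.exe"}

def allow_child_process(parent_image: str, child_image: str) -> bool:
    """Return False when an Office process tries to launch a script host or shell."""
    parent = parent_image.lower().rsplit("\\", 1)[-1]
    child = child_image.lower().rsplit("\\", 1)[-1]
    if parent in OFFICE_APPS and child in BLOCKED_CHILDREN:
        return False  # the payload never gets a chance to run
    return True

# A macro in a Word document trying to launch PowerShell would be blocked:
print(allow_child_process(r"C:\Program Files\Microsoft Office\winword.exe",
                          r"C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe"))  # False
```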
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,510
Let's think about a test as being like a castle attacked by an enemy.
The Real-World tests reflect the scenario in which the defenders are armed and fight on the walls with the gates closed.
The Malware Protection tests (with many payloads) are probably similar to the scenario in which half of the defenders are drunk or poisoned and the gates are open (the defense is highly impaired in some way, or the enemy used a very unconventional attack method).

Both scenarios are possible, but the second is rare in the wild.
 
Last edited:

ForgottenSeer 72227

Let's think about a test as being like a castle attacked by an enemy.
The Real-World tests reflect the scenario in which the defenders are armed and fight on the walls with the gates closed.
The Malware Protection tests (with many payloads) are probably similar to the scenario in which half of the defenders are drunk or poisoned and the gates are open (the defense is highly exploited in some way).

Both scenarios are possible, but the second is rare in the wild.
I won't lie, I had a good chuckle :ROFLMAO:

However, it is indeed true. Any test in general should be taken with a grain of salt; they aren't the be-all and end-all. Like you said, what happens in the real world and what happens in tests are separate things, with the real-world scenario being the more representative one.

That's not to say tests don't offer some value, but let's not forget they are a snapshot in time. Even then, the testing methodology differs from one testing organization to another, so it's really hard to compare them accurately. The Hub, while great, isn't perfect either and shouldn't be considered "better" or more "accurate" than other tests. This isn't to say those doing tests in the Hub are doing it wrong; it's again a test and, like the others, should be taken with a grain of salt. One shouldn't be making AV decisions on tests alone; there are many more factors to consider than a malware test (i.e. performance on your own system, not what the performance was on a testing computer).
;)
 
