Disclaimer
  1. This test shows how an antivirus behaves with certain threats, in a specific environment and under certain conditions.
    We encourage you to compare these results with others and make informed decisions on which security products to use.
    Before buying an antivirus you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.

Andy Ful

MRG Effitas tests the business AV versions, which usually have some Advanced Threat Protection features activated. Some AVs used a custom configuration in these tests (Avast, Avira, CrowdStrike, Eset, Microsoft, Symantec).

Awarded AVs:

• Avast Business Antivirus
• Bitdefender Endpoint Security
• CrowdStrike Falcon Protect
• ESET Endpoint Security
• Microsoft Windows Defender
• Symantec Endpoint Protection

The above AVs were also consistently awarded in the last three tests for Q4 2019, Q1 2020, and Q2 2020.

Security Applications Tested in Q2 test:

• Avast Business Antivirus 20.5.2598
• Avira Antivirus Pro 1.0.23.8081
• Bitdefender Endpoint Security 6.6.20.287
• CrowdStrike Falcon Sensor 5.34.11604.0
• ESET Endpoint Security 7.3.2036.0
• F-Secure Computer Protection Premium 20.3
• McAfee Endpoint Security 10.7.0.1285
• Microsoft Windows Defender 4.18.1911.3 (ASR rules activated via ATP)
• Symantec Endpoint Protection Cloud 22.20.2.57
• Trend Micro Security 6.7.1185/14.2.1108

Kaspersky AV was not tested this time.

Full report:
 

Gandalf_The_Grey

Great that they listed all the non-default endpoint protection configurations.
For Microsoft Windows Defender it was:
Microsoft Defender ATP endpoint detection and response capabilities were turned on, including ASR rules.
We as end users don't have access to ATP, but ConfigureDefender helps with the ASR rules.
Too bad Kaspersky wasn't tested this time.
 

Andy Ful

These test awards are more reliable than usual, because all awarded products were also consistently awarded in the two previous tests. It is worth remembering that most AV lab tests are made on a very small subset of in-the-wild malware files, so the test scoring for a particular quarter of the year means close to nothing and is valuable mainly for AV vendors to improve the product or remove detection bugs. Anyway, consistently very good results across several tests can have some value for consumers too.
I am pretty sure that Kaspersky Small Office Security would have been awarded too, if it could have participated in this test - it was awarded in several previous tests.
 

Vitali Ortzi

It would be lovely if they could redo this test: https://www.mrg-effitas.com/wp-content/uploads/2018/05/MRG_Exploit_Protection.pdf

It detailed the exploit techniques used while having enough sample variation.
 

Andy Ful

What is wrong with this chart?

[Chart: AVC2020.png]

The chart suggests that in Q2 2020 the in-the-wild detection of Bitdefender was better than that of, for example, Avast, which can make Avast users unhappy. :(
In fact, this suggestion would be valid only if all the in-the-wild malware missed by Bitdefender had also been missed by Avast.
But the sets of missed in-the-wild malware for Avast and for Bitdefender are usually very different. So, by choosing 360 samples from tens of thousands of different malware variants, the test can easily catch more of the samples that Avast misses than of those Bitdefender misses (and vice versa). In this way, the result would depend not on the real AV detection, but simply on the choice of the tested samples. One has to apply a statistical model to see if the data shows any advantage for Bitdefender. Unfortunately, by doing this it is easy to show that none of the first 6 AVs has an advantage. They have to be treated as if they had the same in-the-wild detection. This model also shows that the first 6 AVs have a statistical advantage (in malware detection) over the last four. See also:
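The sampling argument above can be sketched numerically. The snippet below is a minimal illustration (the detection counts are hypothetical examples, not MRG Effitas data) using a pooled two-proportion z-test: even a gap of three missed samples out of 360 is far from statistically significant, so the two products cannot be ranked on that basis.

```python
# Hedged sketch: hypothetical counts, not real test results.
# Shows why a few missed samples out of 360 proves no advantage.
from math import sqrt, erf

def two_proportion_z(hits_a, hits_b, n):
    """Pooled two-proportion z-test for equal detection rates."""
    p_a, p_b = hits_a / n, hits_b / n
    p = (hits_a + hits_b) / (2 * n)      # pooled proportion under H0
    se = sqrt(2 * p * (1 - p) / n)       # standard error of the difference
    z = (p_a - p_b) / se
    # two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: product A detects 359/360, product B detects 356/360.
z, p = two_proportion_z(359, 356, 360)
print(f"z = {z:.2f}, p = {p:.2f}")   # p well above 0.05: no real advantage
```

With these illustrative numbers the p-value comes out around 0.18, so the apparent gap is indistinguishable from sampling noise, which is exactly the point about the chart.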
 

SeriousHoax

It's interesting to see that Kaspersky isn't included in this test. I wonder why! In the past, Symantec didn't take part in AV-Comparatives tests for a year or so because they had a disagreement with AV-C's testing method. Later it was resolved and they started taking part again. So it's possible that this time Kaspersky had some disagreement with MRG Effitas and decided not to take part, or the testing was already done and they asked not to publish their result over some disagreement.
Kaspersky did well in past tests, so it is unusual for them to be missing here. Anyway, that's just my guess based on the Symantec incident and a few other similar cases.
 