MRG Effitas 360 Assessment & Certification – Q1 2019

Disclaimer
  1. This test shows how an antivirus behaves with certain threats, in a specific environment, and under certain conditions.
    We encourage you to compare these results with others and make informed decisions about which security products to use.
    Before buying an antivirus, you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.

a1nn

MRG Effitas's Q1 2019 360 Assessment and Certification was released on May 20, 2019. Eleven products were tested against 339 in-the-wild samples on a Windows 10/7 x64 test bench with Chrome.

Products Tested:
  • Avast Business Antivirus 19.3.2554
  • Avira Antivirus Pro - Business Edition 15.0.45.1184
  • Bitdefender Endpoint Protection Elite 6.6.10.146
  • CrowdStrike Falcon Protect 4.26.8904.0
  • Microsoft Windows Defender (Engine 4.18.904.1)
  • ESET Endpoint Protection 7.0.2091.0
  • F-Secure Computer Protection Premium 19.2
  • Kaspersky Small Office Security 19.0.0.1088 (d)
  • McAfee Endpoint Security 10.6.0.542
  • Symantec Endpoint Protection 22.17.0.183
  • Trend Micro Worry-Free Business Security 20.0.1049


Additional tests (ransomware, financial malware, PUA/Adware, Exploit/Fileless, FP, and Performance) as well as the methodology can be found in the full report.

 

Andy Ful

It is a kind of real-world test (efficacy assessment):
"When conducting these tests, we tried to simulate normal user behaviour. We are aware that a “Real World” test cannot be conducted by a team of professionals inside a lab because we understand how certain types of malware work, how malware attacks and how such attacks could be prevented. Simulating normal user behaviour means that we paid special attention to all alerts given by security applications. A pass was given only when alerts were straightforward and clearly suggested that malicious action should be blocked."
...
"Although there is no absolute definition of this kind of testing, loosely speaking, it involves the introduction of malware to an endpoint through a realistic vector, such as a browser or USB memory stick. Real World testing mostly involves “dynamic testing” (i.e. the malware is executed and then the ability of the security product to block the malware is measured). "


All tests were performed on Windows 10 (malware + PUA, performance), except the exploit test, which was performed on Windows 7. That is a strange choice, because Windows Defender on Windows 7 is not the same product as Windows Defender on Windows 10; it is much weaker.
 

Andy Ful

These are the performance test results. It seems that Windows Defender is the second-lightest antivirus :ROFLMAO:

[Attachment: performance test results chart]
That depends on what is important to the user. Some well-known WD slowdown factors were not counted (file copying, installing/updating/uninstalling other applications). But these factors are not important for most average users, especially in organizations.
 

ForgottenSeer 72227

All tests were performed on Windows 10 (malware + PUA, performance), except the exploit test, which was performed on Windows 7. That is a strange choice, because Windows Defender on Windows 7 is not the same product as Windows Defender on Windows 10; it is much weaker.

WD scored quite well IMO; it really has improved a lot from its first go-around. (y) @Andy Ful I'm also with you in wondering why they decided to switch OSes during the test. It seems kind of strange, and quite frankly, as you said, Defender on W7 is not the same version of WD found on W10, so in reality they didn't test it at all in that particular test. I do have to say that MS is notoriously bad about reusing names for different products, which only makes it more confusing. IMHO, if they retire a name/product, it should stay retired. Don't use the same name for a new product that has no resemblance to the original program.

That depends on what is important to the user. Some well-known WD slowdown factors were not counted (file copying, installing/updating/uninstalling other applications). But these factors are not important for most average users, especially in organizations.

I agree as well.

They did miss the few areas where WD's performance isn't the greatest. I do wish that MS would improve its performance in those areas, but as you've said, most people really won't notice WD all that much, because most people/businesses don't install/uninstall programs or transfer a lot of files on a daily basis. Those that do will definitely notice WD more, but I would say that in most cases, how it scored on this test represents what most people will notice IMHO. Well, at least it matches my experience. ;)(y)
 

Burrito

Congrats to Norton, the only AV to achieve a perfect score across every test, and it did so while finishing in the top half for performance. Not surprising.

Congrats to CrowdStrike and Trend for having lots of 'light blue' (that is, behavior blocks). Behavior blocks are good.

Congrats to Microsoft's Defender. Another validation that the recent good test results are not a fluke.

Not a good showing for F-Secure and Avira.

And... McAfee exposed itself as an inferior product once again. But you have to give them credit... they signed up to be tested. Other inferior AVs, like Webroot, MBAM, and many others, just avoid testing because they know... they know that testing exposes them for what they are: inferior capabilities.
 

a1nn

Personally, I find it way more interesting that all participants in the previous bank simulator test failed 100%. And where was F-Secure!?

The two tests used different samples; the banking test used 15 ITW samples that were clones of very prevalent financial malware.
"In total, 15 live ITW samples were used. The tests were conducted using financial malware only, including Azorult, Emotet, Trickbot, IcedID, Ursnif, Gootkit clones."
https://www.mrg-effitas.com/wp-content/uploads/2019/05/2019_OBQ1.pdf

In the case of this test, they used essentially zero-day URL samples, of which 119 happened to be financial malware.
"Most of the malicious URLs used in this test were compromised legitimate websites which served malware. We believe that such URLs pose the greatest danger to users as this is the place where they least expect to get infected. Some URLs come from our regular honeypots or in case of ransomware and financial malware in particular, we used URLs from newly-discovered distribution sites."
https://www.mrg-effitas.com/wp-content/uploads/2019/05/MRG_Effitas_2019Q1_360.pdf

If you're talking about the botnet test in the banking test, then that was purely a test of behavioral or heuristic detection. They used a hand-modified TinyNuke sample.
 
