SE Labs Home Anti-Malware Protection - January - March 2019

Disclaimer
  1. This test shows how an antivirus behaves with certain threats, in a specific environment and under certain conditions.
    We encourage you to compare these results with others and make informed decisions on which security products to use.
    Before buying an antivirus you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.

a1nn

Level 2
Thread author
Verified
Jun 5, 2017
50
Executive Summary


  • The security software products were generally effective at handling general threats from cyber criminals...
Most products were largely capable of handling public web-based threats such as those used by criminals to attack Windows PCs, tricking users into running malicious files or running scripts that download and run malicious files.
  • ... and targeted attacks were prevented in many cases.
Many products were also competent at blocking more targeted, exploit-based attacks. However, while some did very well in this part of the test, others were very much weaker. Products from K7 and G Data were notably weaker than the competition.
  • False positives were not an issue for most products
Most of the products were good at correctly classifying legitimate applications and websites. The vast majority allowed all of the legitimate websites and applications. eScan was the least accurate in this part of the test.
  • Which products were the most effective?
Products from McAfee, Symantec (Norton), ESET, F-Secure and Microsoft achieved extremely good results due to a combination of their ability to block malicious URLs, handle exploits and correctly classify legitimate applications and websites.

[Note: "The web browser used in this test was Google Chrome. When testing Microsoft products Chrome was equipped with the Windows Defender Browser Protection browser extension (Windows Defender Browser Protection)."]

Exact percentages, the full methodology, and much more detail on how each product was scored against each piece of malware can be found in the full report: https://selabs.uk/download/consumers/epp/2019/jan-mar-2019-home.pdf
 

shmu26

Level 85
Verified
Honorary Member
Top Poster
Content Creator
Well-known
Jul 3, 2015
8,153
McAfee and ESET do well, while Kaspersky does poorly. Bitdefender didn't even want to participate in the test. It's the opposite of the usual results.
There is a difference in methodology:
"Each product was exposed to the same threats, which were a mixture of targeted attacks using well-established techniques and public email and web-based threats that were found to be live on the internet at the time of the test."
It's a much tougher test than the ones the other testing companies run.
Windows Defender is among the better products, but not at the top.
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,593
All tests use only a very small sample of the threats in the wild. Also, the attack techniques used in the test are not comprehensive (they cannot be). Bearing this in mind, anyone could shuffle the ranks of the first 10 AVs and the result would be about as reliable as the published one. Here is an example of how reliable it can be:

For this test you need three identical chairs, packaging for three anti-virus products ... and an open window on a high floor of a building. The methodology of this test is as follows:
  1. Tape each of the boxes to a chair. Do so carefully, such that each is fixed in exactly the same way.
  2. Throw each of the chairs out of the window, using an identical technique.
  3. Examine the chairs for damage and write a comparative report, explaining the differences found.
  4. Conclude that the best product was the one attached to the least damaged chair.

    The problem with this test is obvious: the conclusions are not based on any useful reality.


    If anyone thinks the above are my words, that is not true. It is a fragment from the report we are discussing in this thread.

    So, is this test useless?
    No, not if you compare it with the 10 previous tests and take an average, as the sketch below illustrates.
    Negative test results (missed samples, false positives) can also be useful to AV vendors for improving their products. (y)
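
Here is a quick Python sketch of what I mean (my own illustration; every product name, rate and number below is invented, and nothing comes from the SE Labs report). Ten hypothetical products with almost identical "true" protection rates each face 100 random threats per test: a single test often shuffles the ranking, while the average of 10 tests usually lands close to the true order.

Code:
# Hypothetical illustration: ranking noise in small-sample AV tests.
import random

random.seed(1)

# Invented "true" protection rates, deliberately very close together.
true_rates = {f"AV{i}": 0.950 + 0.003 * i for i in range(10)}
SAMPLES_PER_TEST = 100  # roughly the sample size of a typical public test

def run_test():
    # One simulated test: count how many random threats each AV blocks.
    return {av: sum(random.random() < p for _ in range(SAMPLES_PER_TEST))
            for av, p in true_rates.items()}

def ranking(scores):
    # Products ordered from best score to worst.
    return [av for av, _ in sorted(scores.items(), key=lambda kv: -kv[1])]

# A single test vs. the combined scores of 10 independent tests.
totals = {av: 0 for av in true_rates}
for _ in range(10):
    for av, blocked in run_test().items():
        totals[av] += blocked

print("true order :", ranking(true_rates))
print("single test:", ranking(run_test()))  # often visibly shuffled
print("10-test avg:", ranking(totals))      # usually close to the true order

The exact numbers don't matter; the point is that when products sit only fractions of a percent apart, 100 samples per test simply cannot separate them reliably.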
 

shmu26

Level 85
Verified
Honorary Member
Top Poster
Content Creator
Well-known
Jul 3, 2015
8,153
All tests use only a very small sample of the threats in the wild. Also, the attack techniques used in the test are not comprehensive (they cannot be). Bearing this in mind, anyone could shuffle the ranks of the first 10 AVs and the result would be about as reliable as the published one. Here is an example of how reliable it can be:

For this test you need three identical chairs, packaging for three anti-virus products ... and an open window on a high floor of a building. The methodology of this test is as follows:
  1. Tape each of the boxes to a chair. Do so carefully, such that each is fixed in exactly the same way.
  2. Throw each of the chairs out of the window, using an identical technique.
  3. Examine the chairs for damage and write a comparative report, explaining the differences found.
  4. Conclude that the best product was the one attached to the least damaged chair.

    The problem with this test is obvious: the conclusions are not based on any useful reality.



LOL
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,593
You won't believe it, but WD is an amazing AV. Better than most paid security packs.
There is no real difference in real-world protection among many good AVs; WD just improved a lot and joined them. Please note that if the user does not use Edge (with SmartScreen), then the "Windows Defender Browser Protection" extension has to be installed (as in this test, for Chrome).
 

Mahesh Sudula

Level 17
Verified
Top Poster
Well-known
Sep 3, 2017
825
Kaspersky low in the charts? Hard to believe. Yesterday I tested a fresh ransomware sample (Sordino) just 24 hours after it appeared in the wild.
Results:
McAfee: internet connected, caught it (JTI/Suspect). Offline: it encrypted the system (Real Protect slept).
Symantec: SONAR caught it offline. Online: Heur.AdvML.B. I executed the quarantined sample; real-time protection and SONAR caught it as Trojan.Gen.2.
Kaspersky: caught by the behavior blocker. Online: UDS.
G Data: DeepRay caught it (online). Offline: behavior blocker, after 3 files were encrypted.
Quick Heal: advanced behavior blocker caught it.
ESET, Avira, F-Secure and K7 failed miserably.
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,593
Kaspersky low in the charts? Hard to believe. Yesterday I tested a fresh ransomware sample (Sordino) just 24 hours after it appeared in the wild.
Results:
McAfee: internet connected, caught it (JTI/Suspect). Offline: it encrypted the system (Real Protect slept).
Symantec: SONAR caught it offline. Online: Heur.AdvML.B. I executed the quarantined sample; real-time protection and SONAR caught it as Trojan.Gen.2.
Kaspersky: caught by the behavior blocker. Online: UDS.
G Data: DeepRay caught it (online). Offline: behavior blocker, after 3 files were encrypted.
Quick Heal: advanced behavior blocker caught it.
ESET, Avira, F-Secure and K7 failed miserably.
This is usually the case when the ranks of most AVs in the test (also in your quick test) are close to random. (y)
Anyway, your quick test is very different from the SE Labs test. You ran the ransomware directly on the computer. In the SE Labs test that would rarely happen, because the phishing URL would usually be blocked first (by web filtering, or by blocking the malicious spam attachment); see the toy model below.
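
To see why, here is a toy model (the per-layer block rates are invented, purely for illustration): if each layer of the protection chain has an independent chance to stop the attack, the whole chain blocks far more than the on-execution layer alone.

Code:
# Toy model: why a full-chain test rarely reaches the execution stage.
# All block rates below are invented for illustration.
layers = {
    "URL / spam filtering":  0.90,  # probability this layer blocks the attack
    "download scanning":     0.60,
    "on-execution blocking": 0.70,
}

p_miss = 1.0
for name, p_block in layers.items():
    p_miss *= 1.0 - p_block  # the attack survives only if every layer misses

print(f"full chain blocks the attack {1.0 - p_miss:.1%} of the time")  # 98.8%
print(f"an execution-only test exercises just one layer: "
      f"{layers['on-execution blocking']:.0%}")

Of course real layers are not independent, but the direction of the effect is the same: a test that starts at the phishing URL measures the whole chain, not just the behavior blocker.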
 
