Disclaimer

This test shows how an antivirus behaves with certain threats, in a specific environment and under certain conditions.
We encourage you to compare these results with others and to make informed decisions about which security products to use.
Before buying an antivirus you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial allows you to test an antivirus in everyday use before purchase.

Seyyed Akram

Level 8
Verified
Kaspersky is low in the charts. I can believe it: yesterday I tested the latest ransomware (Sordino), just 24 hours after it appeared in the wild.
Results:
McAfee: Online, caught it (JTI/Suspect). Offline, it encrypted the system (Real Protect slept).
Symantec: SONAR caught it offline. Online: Heur.AdvML.B. Executed the quarantined sample; real-time protection and SONAR caught it as Trojan.Gen.2.
Kaspersky: BB caught it. Online: UDS.
GData: DeepRay caught it (online). Offline: BB, after 3 files were encrypted.
Quick Heal: Advanced BB caught it.
Eset, Avira, F-Secure, and K7 failed miserably.
If you have spare time, could you please test the sample against Bullguard and against Malwarebytes Premium?
 

Mahesh Sudula

Level 16
Verified
Malware Tester
Has McAfee improved that much? Very interesting.
I have been testing McAfee for the past month, and I bought a 5-year license too.
Observations:
Excellent anti-phishing, improved UI, very light on the system, good malware blocking.

JTI (GTI, Global Threat Intelligence)/Suspect1000233333aa! - a placeholder detection: the sample is already caught by their cloud systems, but no regular signature has been issued yet.
Real Protect - a cloud-based behavior-blocking system (an internet connection is mandatory). The BB caught several of my samples with a RealProtect!... detection.
A few minutes later the detection would be changed to JTI/Suspect. Pretty nice!
McAfee did not bother to provide an offline ML/BB layer.
 

Andy Ful

Level 46
Verified
Trusted
Content Creator
Another test result, another wave of hysteria and meaningless hype.

PS: Nothing personal; it is just boring to see this kind of post again and again.
Such threads can be educational for many readers. So, let's be patient. (y)

By the way, SE Labs is probably the first lab to point out that the results of one test are meaningless for proving which AV is better. It is the same with a possibly loaded die: you cannot prove that any number on it is more probable by rolling the die only once.
Actually, the differences in real-world protection between many AVs are too small to see in one test; they show up only when one compares the results of many tests. It is just like rolling the loaded die many times.
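The die analogy is easy to check with a quick simulation (the bias weight below is purely illustrative): one roll tells you nothing about the bias, while many rolls expose it clearly.

```python
import random

random.seed(42)

# A hypothetical loaded die where 6 is 1.5x as likely as any other face.
faces = [1, 2, 3, 4, 5, 6]
weights = [1, 1, 1, 1, 1, 1.5]

def fraction_of_sixes(n_rolls):
    """Roll the loaded die n_rolls times; return the observed fraction of sixes."""
    rolls = random.choices(faces, weights=weights, k=n_rolls)
    return rolls.count(6) / n_rolls

# A single roll says nothing about the bias: the result is just one face.
print(fraction_of_sixes(1))        # 0.0 or 1.0, regardless of the bias

# Many rolls expose it: the expected fraction is 1.5/6.5 ~ 0.23, not 1/6 ~ 0.167.
print(fraction_of_sixes(100_000))
```

The same logic applies to AV tests: a single test result is one roll of the die, and only aggregating many tests reveals real differences.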
 
I do not agree; the only way to test an AV is with a test. It is as obvious as it is logical.
 

Nightwalker

Level 16
Verified
Content Creator
Such threads can be educational for many readers. So, let's be patient. (y)

By the way, SE Labs is probably the first lab to point out that the results of one test are meaningless for proving which AV is better. It is the same with a possibly loaded die: you cannot prove that any number on it is more probable by rolling the die only once.
Actually, the differences in real-world protection between many AVs are too small to see in one test; they show up only when one compares the results of many tests. It is just like rolling the loaded die many times.
I fully agree with you and I will try to be a little more patient, but I won't promise anything more :ROFLMAO:

One very interesting thing in this test is the difference between AVG and Avast: as far as we know they share the same engine and cloud, just different skins and marketing priorities.

I guess AVG was tested earlier, and that may have had some positive implications for Avast (cloud + machine learning), but that is just speculation (it could apply to all other vendors as well).

It isn't wise to judge antivirus performance based on just one test, but many tests from different organizations, plus independent testing, can give us an overall picture of how security vendors are doing in the real world (like Kaspersky's, and now Windows Defender's, consistency).
 

shmu26

Level 82
Verified
Trusted
Content Creator
In any case, we see from this test that no AV can be expected to stop all the malware all the time. If the bad guys shoot enough bullets, inevitably one or two will get through; it is just a matter of time. That is why default-deny is a stronger security posture than AV protection.
 

Andy Ful

Level 46
Verified
Trusted
Content Creator
I do not agree; the only way to test an AV is with a test. It is as obvious as it is logical.
And who says otherwise? :emoji_thinking:

...
One very interesting thing in this test is the difference between AVG and Avast: as far as we know they share the same engine and cloud, just different skins and marketing priorities.
...
We cannot be sure about the differences between AVG and Avast. So, the reason may be more complex. (y)
 

Andy Ful

Level 46
Verified
Trusted
Content Creator
I wonder why K7Computing are still partaking in these antivirus tests if they know that their software usually ranks last. Are they not aware that it is affecting their business?
This is only the second test; K7 has participated in SE Labs tests (Consumers category) since December 2018.
Paradoxically, the vendor of an AV that scores poorly can benefit more from such tests than other vendors: the failure is a hint on how to improve the product. This was the way of Windows Defender, which was second to last a year ago.
 

Andy Ful

Level 46
Verified
Trusted
Content Creator
There are some well-known AVs which are better than WD in SE Labs tests.
WD is better in SE Labs tests than some well-known AVs.
Average results over 12 months (Total Accuracy Rating):

Norton Security ........................ 99.5
ESET Smart Security .................... 99
Kaspersky Internet Security ............ 99
Windows Defender ....................... 98.5
Trend Micro Internet Security .......... 97.25
McAfee Internet Security ............... 96.75
F-Secure ............................... 94.75

Given the big statistical errors, the most I can cautiously say is that in SE Labs tests Norton does better than F-Secure. (y)
I cannot say that Norton Security offers generally better protection for home users, because no one knows exactly what that means or how it could be measured.
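To illustrate the "big statistical errors" point: if we treat the Total Accuracy Rating as if it were a simple protection rate over a set of samples (a rough simplification, and the sample count of 400 below is a guess for illustration, not SE Labs' real number), a normal-approximation 95% confidence interval shows how much the scores overlap:

```python
import math

def ci_95(score_pct, n_samples):
    """Normal-approximation 95% confidence interval (in percent) for a
    protection rate of score_pct measured over n_samples samples."""
    p = score_pct / 100
    half_width = 1.96 * math.sqrt(p * (1 - p) / n_samples)
    return (100 * (p - half_width), 100 * (p + half_width))

# Hypothetical sample count; SE Labs' real methodology differs.
N = 400
for name, score in [("Norton", 99.5), ("Windows Defender", 98.5), ("F-Secure", 94.75)]:
    lo, hi = ci_95(score, N)
    print(f"{name}: {score}% -> 95% CI [{lo:.1f}%, {hi:.1f}%]")
```

Under these assumptions the Norton and Windows Defender intervals overlap (so their ranking could flip on a different sample set), while Norton and F-Secure do not, which matches the point that only the Norton vs F-Secure comparison can be made with any confidence.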
 

Raiden

Level 13
Verified
Content Creator
It's a much tougher test than the other companies do.
That's how tests should be done, IMO. They shouldn't be easy, and quite frankly I'm really happy with these results, because they prove that not all products score 99-100% every time.

All products will fail and miss things, even the mighty Kaspersky. I don't see the need to go crazy because the results weren't what one was expecting. IMHO, considering the amount of malware out there, anything that scores in the 90s is more than enough. At the end of the day, just because Kaspersky scored lower than WD, for example, doesn't mean it's suddenly crap. It also doesn't mean WD is better than Kaspersky; it just did better with this sample set. If anything, all this proves is that WD has improved quite significantly and is just as good as most third-party products now. It's no longer at the bottom and is more than capable of protecting users.



This test does not show that WD is better than Kaspersky. It shows only that on a very small number of samples (compared to all real-world threats: URLs, scripts, executables), WD happened to score better. There is nothing unusual in this. (y)
+1

In any case, we see from this test that no AV can be expected to stop all the malware all the time. If the bad guys shoot enough bullets, inevitably one or two will get through; it is just a matter of time. That is why default-deny is a stronger security posture than AV protection.
+1

I always try to stress this as best I can. This test just proves that no product is perfect. That is not something new; it has always been the case and will continue to be that way. Most products nowadays are more than enough to protect users, but it's always important to practice safe habits. At some point your product will fail, even if it scores 99-100% consistently. It doesn't matter how well it scores; all it takes is one to get through. ;)
 
I read these tests, but in no way, shape, or form do I base my choice of antivirus on them, nor make any recommendations based on them. The top eight or so programs are so close to each other, and so interchangeable between tests, that it basically comes down to "pick the one you like and ride it". These tests are like reading the comics in a newspaper: read them, be entertained for five minutes, and move on.
 

blackice

Level 9
Verified
I read these tests, but in no way, shape, or form do I base my choice of antivirus on them, nor make any recommendations based on them. The top eight or so programs are so close to each other, and so interchangeable between tests, that it basically comes down to "pick the one you like and ride it". These tests are like reading the comics in a newspaper: read them, be entertained for five minutes, and move on.
They should have a disclaimer. “For entertainment purposes only”. :ROFLMAO: