Status
Not open for further replies.

Slyguy

Level 43
I simply cannot believe in this kind of test, as I smell something like $ involved, so I would rather rely on our malware testers. :)
That'd be a mistake. With all due respect, bulk scanning of archives of malware doesn't represent the full protective capabilities of any product. It's only a small part of the picture. For example, by bulk scanning you essentially eliminate important protection measures like IPS firewalls, web filtering, traffic scanning, download karma insight systems, etc.

Making your decision entirely from the malware hub would be like making a car buying decision based on the results at the drag strip.
 

dinosaur07

Level 8
Verified
That'd be a mistake. With all due respect, bulk scanning of archives of malware doesn't represent the full protective capabilities of any product. It's only a small part of the picture. For example, by bulk scanning you essentially eliminate important protection measures like IPS firewalls, web filtering, traffic scanning, download karma insight systems, etc.

Making your decision entirely from the malware hub would be like making a car buying decision based on the results at the drag strip.
For what they do, it is OK for me to make a decision. The rest of the information and software I already have. I'm interested to see what software detects what, both in static and dynamic real testing situations and with the newest samples.
 

Chimaira

Level 4
I threw it in a VM this week and Windows actually runs faster with it installed, largely I think because it disables the heavy WD and replaces WFD with its own firewall. I'm now testing it in the VM alongside (one at a time) VS and OSArmor to see how it plays and how protective it is in combo.

Also of note, K7 uses encrypted communication and appears to have virtually zero telemetry, with the ability to totally disable logging via checkboxes if you choose. It also looks like they are OPSWAT share members, reinforcing sigs from Metadefender, which may be why they score in parallel with others, such as Avast/AVG, Kaspersky, etc.
Very interested in its protection capability, let us know!
 
D

Deleted member 178

As for Emsisoft, when I was working for them, they had already planned and started some major infrastructure changes to their networks and software.

2018 is a year of change for them, so the issue with Cruelsister's test or this month's low AV-C result isn't surprising to me.
 

Chimaira

Level 4
As for Emsisoft, when I was working for them, they had already planned and started some major infrastructure changes to their networks and software.

2018 is a year of change for them, so the issue with Cruelsister's test or this month's low AV-C result isn't surprising to me.
Coming out the other side with better protection is all that matters.
 

Burrito

Level 23
Take that back or I will throw you with my robotic laser.

Kaspersky is a top product in the industry not only for its signatures but for the amazing capabilities of its modules like Application Control, System Watcher, etc.
As for Emsisoft, I would say it's a very good product with a long career yet to come.
Of course, RoboMan is correct. Kaspersky, Norton & Bitdefender have dominated testing by all the test organizations for over 5 years. You can dislike those programs for other reasons, but the best available evidence indicates they are among the top-tier AVs.
 

baddogai

Level 1
I see the product I use is still consistently middle of the pack. Not that it matters much. But even the performance crown is taken by K7.
 

jrw666

Level 2
As for Emsisoft, when I was working for them, they had already planned and started some major infrastructure changes to their networks and software.

2018 is a year of change for them, so the issue with Cruelsister's test or this month's low AV-C result isn't surprising to me.
Umbra, so are you saying it's just a temporary blip or are they in decline?
 
D

Deleted member 178

You understand, from just a logical standpoint, that this graph does give a very intelligent and true result...
What I mean is this... ALL products were tested with the SAME attacks. So if your favorite product is, let's say, Kaspersky, and it scored under McAfee or vice versa, then since the attack/method was identical, this gives a true, or accurate, result.
You can't get true or accurate results from one test; you need several of them before you can start talking.
In the case of AV-C, I don't care about the monthly results, but rather the 5-month ones, which start to give a "proper" estimation of a product's efficiency.

Note that the results are based on this SPECIFIC pack of malware, whose composition may influence the result greatly. Some products are better against specific threats than others; if they get unlucky with the selection, the results will obviously go down.
The methodology is an important factor which is often ignored by the common reader, who just focuses on the colors and numbers of the graph...

Umbra, so are you saying it's just a temporary blip or are they in decline?
From what I know, I will take it as a blip; you would start to worry if EAM (or any other AV) got 10+ consecutive very bad results in all tests from various labs. A bad result in one test from one lab means nothing.
It's the same as when you go to your favorite restaurant and the chef is tired that day and fails to cook your favorite meal properly.
That doesn't mean he has lost all his skills.

Also, I'm sure EAM would score better in this test if the BB were on Alert mode; it is now on auto-resolve by default to reduce prompts (for the average Joe) and FPs.

Tests are an estimation of a product's efficiency against certain malware, in a certain period of time, under certain conditions.
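
The single-test-versus-aggregate point can be sketched numerically. This is only a toy simulation; the 99% "true" protection rate and the 350-sample pack size are assumptions for illustration, not AV-C's actual figures. Each monthly score is one noisy draw, while the 5-month average sits much closer to the underlying rate:

```python
import random

random.seed(1)

TRUE_RATE = 0.99   # hypothetical "true" protection rate of the product
PACK_SIZE = 350    # assumed number of samples in one monthly test set

def monthly_score():
    # Each sample is blocked independently with probability TRUE_RATE
    blocked = sum(random.random() < TRUE_RATE for _ in range(PACK_SIZE))
    return blocked / PACK_SIZE

scores = [monthly_score() for _ in range(5)]
print(["{:.1%}".format(s) for s in scores])        # individual months wobble
print("{:.2%}".format(sum(scores) / len(scores)))  # 5-month average is steadier
```

An unlucky sample selection shows up here as a low single month, which is exactly why one bad monthly result says little on its own.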
 
I

illumination

You understand, from just a logical standpoint, that this graph does give a very intelligent and true result...

What I mean is this... ALL products were tested with the SAME attacks. So if your favorite product is, let's say, Kaspersky, and it scored under McAfee or vice versa, then since the attack/method was identical, this gives a true, or accurate, result.
All products are not designed the same, nor do they respond the same. One cannot simply use one methodology to test many different products and come up with anything that resembles an accurate result.

It doesn't matter if you say any of the following... The samples were old, the samples were new, the samples were not true zero day... None of this matters, since all vendors were exposed to the same attacks.
This matters very much when the true rate of detection is much lower than what is portrayed by these companies. If true zero-day malware were used in these tests, it would produce much lower detection results. Thus the industry is misleading consumers.

Let's say all the samples were very old, just for argument's sake. And your favorite vendor didn't detect it, but Panda did; that is accurate. Or let's say your favorite vendor is at an 85% protection rate, and the vendor you think is garbage scored 90%; that also is accurate.
Let's say that sample is just old enough, not in the wild anymore, was just picked up from a download site, and is no longer in the databases of certain companies, as they combine or remove signatures all the time. Let's say that, due to improper testing of products and how they are designed, the product could have caught the older sample during dynamic testing.

Transparency of testing and sample verification are not commonly presented by these companies, making them untrustworthy.
 

DDE_Server

Level 16
Verified
Well, another test, more or less the same silliness. I find it both funny and ironic when these test results come out: one minute everyone says to take these tests with a grain of salt; the next minute, products A, B, C (very good job), products D, E, F (well, looks like they are POS). So which is it? Do we take them with a grain of salt, or do we rely on them with 100% validity?

I also find it funny when I see people rip a test apart when they don't agree with the results because their product does poorly and/or a product they don't like does well. Next month, however, this test is good, it represents exactly what I believe, because my product did well and the product I dislike did poorly. I get it, people want to see the product that they use/prefer always do well, but in reality, I think we all know that there is no such product that can deliver 100% protection 100% of the time. I've yet to see reports proving that using product X (that does poorly on a test, even if it was just 1 or 2 tests) will result in getting infected every single day. I mean, there are a lot of people that use WD, and somehow many of those people seem to be infection free. I'm not saying that no one has been infected when using WD, but I've yet to see concrete data that it's a massive, widespread issue. Even if something does get past a particular product, there is no guarantee that another product would have caught it instead.

As for Emsisoft, I think they are a great company with great protection. They have a very good team behind it and offer some of the best customer support out there. Sure, this test may be a lower result than normal, but I really think it's silly to write them off completely. Again, I have yet to see data proving that the vast majority of Emsisoft users are being infected on a daily basis because of this test result. I also want to give Emsisoft the benefit of the doubt in regards to cruelsister's test. It could very well have been because they were having issues with their cloud component, or it could not have. I don't think these cloud systems are as simple as hooking up a server to the internet so the software can communicate with it; I am sure it's far more complex than that. I mean, just last week Microsoft was having issues with SmartScreen, so it's not impossible. Unless you have exact proof that Emsisoft is lying, we really don't know for sure. All I care about anyway is that they take action to improve it.

In summary, these tests are fun, but they are not the be-all and end-all, and they surely do not represent true real-world usage/experiences. Personally, I believe many of these products will offer very good protection for the vast majority of users out there. I think that when choosing a product you should definitely take advantage of their trials and base your decision on that. Sure, it's OK not to like a product because it slowed your system/internet down, ran into some bugs, conflicted with your system, had too many false positives, etc., but I don't think you should necessarily dislike a product because it did poorly on a test, especially if it was just 1 or 2 tests.
Very good analysis. My respect.
 

Cortex

Level 17
Verified
Eset hasn't done well either. I used NOD32 for many years, as they did consistently well. I do use Eset at the moment on one PC and consider it to have a good FW. Hopefully their scores will pick up. I still trust Eset, but that's not a good score at all from the Eset of old.

For some time I've considered Eset SS & Kaspersky to be the best, so time will tell. I have Trend on one of my daughter's laptops and have never had any problems at all. Bitdefender is great, but IMHO the interface is way beyond dreadful, though some love it.
 

cruelsister

Level 36
Verified
Trusted
Content Creator
Umbra, I guess I should give a bit of background. In 2016 I posted some videos, one of which showed Boot Time Protection lacking in Kaspersky, another showing Emsisoft wasn't good against Worms.

In the case of Kaspersky (of whom I am no fan due to professional reasons- those reasons having nothing to do with the protection capabilities for the Home User; but sometimes we have to put the good of our Brothers and Sisters over personal feelings), after some back and forth they acknowledged the issue and instituted such protection in the 2017 builds.

In contrast, Emsisoft (at that time) chose to ignore the issue, so the results of my 2018 video mirror those of the one from 2016. This is not a temporary "blip" in protection, and any such explanation is pretty much equivalent to "my dog ate my homework".

And being Kind and Gentle, I won't even bring up WMI malware...
 
Last edited:

baddogai

Level 1
Eset hasn't done well either. I used NOD32 for many years, as they did consistently well. I do use Eset at the moment on one PC and consider it to have a good FW. Hopefully their scores will pick up. I still trust Eset, but that's not a good score at all from the Eset of old.

For some time I've considered Eset SS & Kaspersky to be the best, so time will tell. I have Trend on one of my daughter's laptops and have never had any problems at all. Bitdefender is great, but IMHO the interface is way beyond dreadful, though some love it.
Yeah, ESET seems to be slipping lately. Not so much that I would change my AV, as I run only legal software and don't do any risky behavior on the web, but enough for me to consider other options when my 3-year license of Eset Internet Security expires. (I got it on sale dirt cheap.)
 

Robbie

Level 30
Verified
Content Creator
Malware Tester
You understand, from just a logical standpoint, that this graph does give a very intelligent and true result...

What I mean is this... ALL products were tested with the SAME attacks. So if your favorite product is, let's say, Kaspersky, and it scored under McAfee or vice versa, then since the attack/method was identical, this gives a true, or accurate, result.

It doesn't matter if you say any of the following... The samples were old, the samples were new, the samples were not true zero day... None of this matters, since all vendors were exposed to the same attacks.

Let's say all the samples were very old, just for argument's sake. And your favorite vendor didn't detect it, but Panda did; that is accurate. Or let's say your favorite vendor is at an 85% protection rate, and the vendor you think is garbage scored 90%; that also is accurate.

So it has nothing to do with trusting AV comparatives....The result is the result.
For this to be true, all existing malware would have to be tested against all solutions. That would give an accurate result for the protection capabilities each product has.
 

Slyguy

Level 43
You understand, from just a logical standpoint, that this graph does give a very intelligent and true result...

What I mean is this... ALL products were tested with the SAME attacks. So if your favorite product is, let's say, Kaspersky, and it scored under McAfee or vice versa, then since the attack/method was identical, this gives a true, or accurate, result.

It doesn't matter if you say any of the following... The samples were old, the samples were new, the samples were not true zero day... None of this matters, since all vendors were exposed to the same attacks.

Let's say all the samples were very old, just for argument's sake. And your favorite vendor didn't detect it, but Panda did; that is accurate. Or let's say your favorite vendor is at an 85% protection rate, and the vendor you think is garbage scored 90%; that also is accurate.

So it has nothing to do with trusting AV comparatives....The result is the result.
This is a great post, and it points out a simple fact: the same test was used with all products, making the test a good bit more authoritative than spot testing, because it applies the same metric of real-world protection testing.

As for K7, it looks like they just recently started participating with AVC, but they have been scoring well and getting better. One of their first test runs with AVC, back in March, was actually quite promising.

K7: Offline Detection - 97.7%, Online Detection - 98.2%, Online Protection - 99.89%
Kaspersky: Offline Detection - 94.5%, Online Detection - 97.8%, Online Protection - 99.98%

This is going to be one product to watch, especially since they cleaned house on the performance test and were far and away ahead of anything else there.
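
Since protection scores bunch up near 100%, it can be easier to read them as expected misses rather than hits. Here is a quick back-of-the-envelope conversion of the two online-protection figures above (the 10,000-sample count is just an illustrative scale, not part of the test):

```python
# AV-C online-protection scores quoted above, converted to expected misses
scores = {
    "K7 online protection": 99.89,
    "Kaspersky online protection": 99.98,
}

for name, pct in scores.items():
    # Expected misses per 10,000 samples at this protection rate
    misses = (100 - pct) / 100 * 10_000
    print("{}: ~{:.0f} missed per 10,000".format(name, misses))
```

On these numbers that's roughly 11 misses per 10,000 for K7 versus about 2 for Kaspersky, so a 0.09-point gap in the headline score hides about a fivefold difference in miss rate.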
 

Raiden

Level 16
Verified
Content Creator
As for Emsisoft, when I was working for them, they had already planned and started some major infrastructure changes to their networks and software.

2018 is a year of change for them, so the issue with Cruelsister's test or this month's low AV-C result isn't surprising to me.
That's great to hear! Hopefully once it's all rolled out we will see some great improvements to its detection/protection capabilities!

Umbra- I guess I should give a bit of background- In 2016 I posted some videos, one of which showed Boot Time Protection lacking in Kaspersky, another showing Emsisoft wasn't good against Worms.

In the case of Kaspersky (of whom I am no fan due to professional reasons- those reasons having nothing to do with the protection capabilities for the Home User; but sometimes we have to put the good of our Brothers and Sisters over personal feelings), after some back and forth they acknowledged the issue and instituted such protection in the 2017 builds.

In contrast, Emsisoft (at that time) chose to ignore the issue, so the results of my 2018 video mirror those of the one from 2016. This is not a temporary "blip" in protection, and any such explanation is pretty much equivalent to "my dog ate my homework".

And being Kind and Gentle, I won't even bring up WMI malware...
Thanks for the background. Hopefully after your last video and their planned upgrades their detection of these worms will improve!



I also wanted to clarify my last post in this thread. I do think these tests have some merit and should not be ignored completely. They are a good way to see how a product performs on a consistent basis and may help some in choosing a product for their needs. The only issue I have with them is that they don't always paint the whole picture and, as many here said, may not take into account all the protection capabilities of a product. In that regard, I just have a hard time when discussions turn to writing off a product completely after one test; but as Umbra said, if it starts to become a consistent thing (i.e., 10 or more tests in a row from different testing organizations), then there may be some cause for concern. Sorry if I seemed anti-testing; I didn't mean to come off like that :)
 

Cortex

Level 17
Verified
Been messing with K7 this evening; talk about a fast install, I hardly had time to put my beer down! On a full scan of a clean system it found a couple of Wise365 default themes as PUPs. It looks good, but dare I use it as my default AV? So far (very) impressed! Naff tray icon though; they need to change that :)
 
D

Deleted member 178

The reason this test is informative is that probably 89% of all end users use DEFAULT settings....

So if your fav software did poorly on this graph out of the box, it is probably not a good choice for the masses.
This is an important point, especially for AVs with prevention features (BB, HIPS, etc.). If you change one setting, it may influence the whole result.
 