[AV-test] Windows Home User August 2016 Windows 7

Evjl's Rain

Level 47
Thread author
Verified
Honorary Member
Top Poster
Content Creator
Malware Hunter
Apr 18, 2016
3,684
[image: g72P2iG.png]
 

ExoGen CyberSecurity

Level 3
Verified
Well-known
Sep 17, 2016
113
I have used Comodo for more than 10 years and I will keep using it. The way I have it configured, I can go 1v1 with any top security product on my real PC (with all my important stuff). I know that with version 8.4 the detection is not that good, but with version 10 (which I'm using right now) the detection is much better, and ViruScope is really good (I can't wait for Valkyrie to be added).

AV-Test = TOP KEK
 
  • Like
Reactions: yigido

yigido

I have used Comodo for more than 10 years and I will keep using it. The way I have it configured, I can go 1v1 with any top security product on my real PC (with all my important stuff). I know that with version 8.4 the detection is not that good, but with version 10 (which I'm using right now) the detection is much better, and ViruScope is really good (I can't wait for Valkyrie to be added).

AV-Test = TOP KEK
I did that test locally. :D A computer repairman insisted that "Panda is the TOP AV". :D
I said, "Shall we try it with the same samples on your machine?" (Shadow Defender enabled ;) )

I installed Panda first and the system got infected (the samples were from VX Vault and RGhost).
Then I installed Comodo and set it to the Proactive configuration. As always, the system and files stayed protected. :cool: He was shocked.
I said that even Kasp.. ESET.. cannot achieve this with the technology they use.
Anyway, I do not want to get "bashed" :rolleyes:
We all know the TOP KEKs ;) everyone knows it, but it is hard to admit.
 

JakeXPMan

Level 17
Verified
Top Poster
Well-known
Oct 20, 2014
804
This looks to me like a signature-based test only.

AVG, Avira, Bit, Kaspersky and Avast have good results in these areas, although that's not a good way to choose a security solution.

I also find it odd that Panda's performance is at the bottom while AVG is near the top; I have always found Panda very light on system impact and AVG the heaviest, though with improvements in all-around security.

You're much better off taking advice from MT experts and your own testing. These AV tests are OK, but they only scratch the surface.
 
Last edited:
  • Like
Reactions: kev216 and yigido

yigido

This looks to me like a signature-based test only.

AVG, Avira, Bit, Kaspersky and Avast have good results in these areas, though that's not a good way to choose a security solution.

I also find it odd that Panda's performance is at the bottom while AVG is near the top; I have always found Panda very light on system impact.
This is where they fail. Detection doesn't equal protection. They don't understand security, and they mislead the millions of people who choose their AVs from these test results.

From now on, I only trust the PC Security Channel and cruelsister test labs :D
 

davetenay

Level 2
Verified
Jul 31, 2015
94
Actually, if you click on the link instead of looking only at the chart at the top, you will see they have two categories: detection of widespread samples and protection against 0-day malware. The latter is an on-execution test. Comodo usually got 100% in previous protection tests; this time only 98.7% in August.
 
  • Like
Reactions: JakeXPMan

yigido

Actually, if you click on the link instead of looking only at the chart at the top, you will see they have two categories: detection of widespread samples and protection against 0-day malware. The latter is an on-execution test. Comodo usually got 100% in previous protection tests; this time only 98.7% in August.
Let's pick MSE then. It is the one ranked just above Comodo. MSE's result is 88.6%, yet it got a better ranking than Comodo. Even against Comodo's 98.7%, Comodo should not be last; it is above the industry average.

[image: RQ8B3j.png]
 

hjlbx

Without actually witnessing these tests in person, you have no real idea of what was measured and how, despite the published test-methodology explanations.

These AV lab test reports leave out a lot of information and details that most users would consider essential to know and understand.

That's all there is to it... and it's a basic limitation of any AV test lab report.
 

Evjl's Rain

Level 47
Thread author
Verified
Honorary Member
Top Poster
Content Creator
Malware Hunter
Apr 18, 2016
3,684
So according to the test results, Comodo is very good against 0-day malware but very poor against older/widespread malware, while MSE is the opposite (MSE < WD).
This is the description of the testing procedures:
https://www.av-test.org/en/test-procedures/test-modules/protection/

It means that in the 1st test (0-day), malware links are downloaded and the detection rate is analyzed by Sunshine (I don't really understand this part).
2nd test: on-demand scan -> file not detected -> executed.

I think that if an AV does badly in the 2nd test, it is not a good sign, because this type of malware is old and widespread.
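A minimal sketch of the two-stage flow described above (on-demand scan first, then execution of anything undetected), assuming hypothetical sample data and counting a behavioral block at execution time as protection:

```python
# Illustrative simulation of a two-stage protection test.
# Stage 1: on-demand (signature) scan; Stage 2: execute whatever
# was not detected and check for a behavioral block.
# All sample names and product behavior here are invented.

def run_protection_test(samples, detects_on_scan, blocks_on_execution):
    protected = 0
    for sample in samples:
        if detects_on_scan(sample):        # stage 1: signature detection
            protected += 1
        elif blocks_on_execution(sample):  # stage 2: behavioral containment
            protected += 1
    return protected / len(samples)

# Hypothetical product: weak signatures, strong on-execution containment
samples = ["a.exe", "b.exe", "c.exe", "d.exe"]
detects_on_scan = lambda s: s == "a.exe"   # only 1 of 4 by signature
blocks_on_execution = lambda s: True       # everything contained when run

print(run_protection_test(samples, detects_on_scan, blocks_on_execution))  # 1.0
```

Under this protection-oriented scoring, a product that contains everything on execution still reaches 100% even with poor signature detection; the debate in this thread is essentially about whether the lab credits that second stage.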
 
  • Like
Reactions: askmark

davetenay

Level 2
Verified
Jul 31, 2015
94
Let's pick MSE then. It is the one ranked just above Comodo. MSE's result is 88.6%, yet it got a better ranking than Comodo. Even against Comodo's 98.7%, Comodo should not be last; it is above the industry average.

[image: RQ8B3j.png]

It's because they give more weight to detection (where Microsoft is better) than to protection (where CIS is much better), which is nonsense.
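A hypothetical weighting sketch of that point: if the combined score weights widespread-sample detection more heavily than 0-day protection, a product with strong protection but weak detection can rank below one with the opposite profile. The weights and the detection figures below are invented; only Comodo's 98.7% comes from the thread, and placing MSE's 88.6% under detection is an assumption.

```python
# Hypothetical combined score: the 70/30 weights are invented for illustration.
def combined_score(detection, protection, w_det=0.7, w_prot=0.3):
    return w_det * detection + w_prot * protection

# 98.7% protection is from the thread; the other figures are assumed.
cis = combined_score(detection=0.60, protection=0.987)  # strong protection
mse = combined_score(detection=0.886, protection=0.80)  # strong detection

print(round(cis, 3), round(mse, 3))
```

With detection weighted at 70%, MSE outscores CIS here despite much weaker 0-day protection, which is the kind of inversion being complained about.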
 

hjlbx

So according to the test results, Comodo is very good against 0-day malware but very poor against older/widespread malware, while MSE is the opposite (MSE < WD).
This is the description of the testing procedures:
https://www.av-test.org/en/test-procedures/test-modules/protection/

It means that in the 1st test (0-day), malware links are downloaded and the detection rate is analyzed by Sunshine (I don't really understand this part).
2nd test: on-demand scan -> file not detected -> executed.

I think that if an AV does badly in the 2nd test, it is not a good sign, because this type of malware is old and widespread.

In my experience, COMODO is pretty good against older malware, specifically Virussign malware packs that drop older malware onto a system.

Just about everything gets auto-sandboxed, except for the odd sample, maybe one in every 1,000 to 5,000, that will bork COMODO Defense+ (HIPS and/or sandbox).

I am not even going to bother looking at the test methodology, but I would bet auto-sandboxing is ignored in their scoring...
 
Last edited by a moderator:

yigido

So according to the test results, Comodo is very good against 0-day malware but very poor against older/widespread malware, while MSE is the opposite (MSE < WD).
This is the description of the testing procedures:
https://www.av-test.org/en/test-procedures/test-modules/protection/

It means that in the 1st test (0-day), malware links are downloaded and the detection rate is analyzed by Sunshine (I don't really understand this part).
2nd test: on-demand scan -> file not detected -> executed.

I think that if an AV does badly in the 2nd test, it is not a good sign, because this type of malware is old and widespread.
I cannot understand this then. Let's say Comodo detects 0 threats; Comodo gets 0 points on this test, which means all threats are zero-day for Comodo, am I right?
Run them all > all sandboxed (some will be detected by the cloud and ViruScope) > system protected after the test.
The score: 100%.

That shouldn't make Comodo last in this test. It protected the system from every threat. I don't expect 1st place, but 3rd or 4th.
Why not?
 

hjlbx

I cannot understand this then. Let's say Comodo detects 0 threats; Comodo gets 0 points on this test, which means all threats are zero-day for Comodo, am I right?
Run them all > all sandboxed (some will be detected by the cloud and ViruScope) > system protected after the test.
The score: 100%.

That shouldn't make Comodo last in this test. It protected the system from every threat. I don't expect 1st place, but 3rd or 4th.
Why not?

AV-Test probably assigns a score of 0 to auto-sandboxing... since, technically, it is not detection and blocking by signature alone.

It isn't accurate and it certainly isn't fair, but there it is...

In a real-world test, all that matters is that data has not been stolen.

Technically, it does not matter even if malware has been installed/dropped on the system, as long as it is inert; inert malware is non-active and, while it might irk some users to no end, it is no dire threat unless it is actively loaded into memory and running on the system.

The danger is if a user goes searching through the file system and executes the inert malware.
 
Last edited by a moderator:
  • Like
Reactions: askmark and yigido

yigido

AV-Test probably assigns a score of 0 to auto-sandboxing... since, technically, it is not detection and blocking by signature alone.

It isn't accurate and it certainly isn't fair, but there it is...

In a real-world test, all that matters is that data has not been stolen.

Technically, it does not matter even if malware has been installed/dropped on the system, as long as it is inert; inert malware is non-active and, while it might irk some users to no end, it is no threat unless it is actively loaded into memory and running on the system.

The danger is if a user goes searching through the file system and executes the inert malware.
AV-Test seems to give 0 points to what they do not understand.
 
  • Like
Reactions: askmark
