Reliable AV Testing Labs according to AV-comp survey

Status
Not open for further replies.

Burrito

Level 24
Verified
Top Poster
Well-known
May 16, 2018
1,363
Ha ha LOL.

Can you quote exactly the point where I say MT Hub is the Grail?
You won't find it. And the rest is just your conclusion, your interpretation. So don't put words and terms in my mouth that I haven't used.

Apparent Misunderstanding.

Umbra was not claiming that you said it.

But I can verify that he is correct: it has been stated here (by others).

silversurfer, Der.Reisende, harlan4096, SeriousHoax, omidomi..... and others do a great job here providing us valid data points for comparison.
Yes, there are some limitations to their testing. But over time, I've noticed that their testing often closely reflects professional testing. And they test products that are not tested elsewhere... so we can get a pretty good idea of how Webroot or WiseVector or WD with H_D are doing.

Cheers to our MH testers... :emoji_beer: :emoji_beer:(y)
 

ichito

Level 11
Verified
Top Poster
Content Creator
Well-known
Dec 12, 2013
541
Congrats to AVLab(y), especially given that this lab is not mentioned on their "List of AV Testing Labs"...haha
Listed below are some noteworthy institutes that provide tests of security software.
(...)
The above-mentioned testing labs follow the best industry practices, so users have some valuable sources for security software tests and reviews without having to rely on “recommendations” of sales staff, marketing departments or opinions of unknown “advisors” on forums.
🤣
 

Digmor Crusher

Level 23
Verified
Top Poster
Well-known
Jan 27, 2018
1,236
You could have all these companies test the exact same malware at the exact same time and the results would be totally different because of the way they test.

A caveat should be issued any time they put out test results: "For Entertainment Purposes Only."
 

ForgottenSeer 823865

Apparent Misunderstanding.

Umbra was not claiming that you said it.

But I can verify that he is correct: it has been stated here (by others).
Indeed, thanks to confirm it.

Ha ha LOL.

Can you quote exactly the point where I say MT Hub is the Grail?
You won't find it. And the rest is just your conclusion, your interpretation. So don't put words and terms in my mouth that I haven't used.

Read:

Umbra said:
"Really ? you didn't read those "i dont trust labs, but MH" kind of stuff?"

I put it in color so it is easier for you now.
 

SunMan09

New Member
Jan 15, 2020
9
Common mistakes or issues that make antivirus tests unreliable. I think the MalwareTips Malware Hub covers most of these issues.
- Using PUPs and adware in a malware-only test
- Using duplicate malware samples in a malware pack
- Testing malware samples that were not all released in the same time period
- Malware failing to run as intended in a virtual environment instead of on a dedicated system, which is more reliable
- Failing to use the remediation methods recommended by antivirus vendors, leading to threats not being properly removed
- Not doing static scans and execution of malware samples one by one
- Not knowing the capabilities and characteristics of the malware used in a test
- Doing a test with only executables in a malware pack, excluding exploits, fileless malware, rootkits, scripts, and macros
- Not utilizing all of the protection features applicable to the antivirus product
- Testing while third-party security programs or mechanisms are actively working in conflict with the one that is mainly being tested
- Testing a program in a way that avoids its weaknesses (virus repair signatures, behavior blocking, and application whitelisting features)
This SANS paper describes an environment that can be used for malware testing: Malware Analysis: Environment Design and Architecture
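One of the points above, duplicate samples in a malware pack, is easy to guard against by hashing each file. A minimal sketch in Python (the directory layout and function name are hypothetical, not something the Hub actually uses):

```python
import hashlib
from pathlib import Path

def dedupe_samples(pack_dir: str) -> dict[str, Path]:
    """Keep one file per unique SHA-256 digest so a pack never double-counts a sample."""
    unique: dict[str, Path] = {}
    for path in sorted(Path(pack_dir).iterdir()):
        if not path.is_file():
            continue
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        unique.setdefault(digest, path)  # first occurrence wins; later duplicates are ignored
    return unique
```

Hashing also gives every sample a stable identifier, which makes it possible to verify later that each tester received exactly the same pack.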
 

SumTingWong

Level 28
Verified
Top Poster
Well-known
Apr 2, 2018
1,706
I think antivirus products that score 100% on everything are biased and for marketing purposes. Like how the duck did Bitdefender get a green mark on system performance impact in these AV lab tests, when loading Bitdefender onto my gaming rig made it become a grandpa very quickly? Someone explain that, with an 8-core Ryzen 7 3700X CPU, 32GB DDR4 @ 3200MHz, and the OS on an SSD.
 

silversurfer

Level 85
Verified
Honorary Member
Top Poster
Content Creator
Malware Hunter
Well-known
Aug 17, 2014
10,057
Common mistakes or issues that make antivirus tests unreliable. I think the MalwareTips Malware Hub covers most of these issues.
- Using PUPs and adware in a malware-only test
- Using duplicate malware samples in a malware pack
- Testing malware samples that were not all released in the same time period
- Malware failing to run as intended in a virtual environment instead of on a dedicated system, which is more reliable
- Failing to use the remediation methods recommended by antivirus vendors, leading to threats not being properly removed
- Not doing static scans and execution of malware samples one by one
- Not knowing the capabilities and characteristics of the malware used in a test
- Doing a test with only executables in a malware pack, excluding exploits, fileless malware, rootkits, scripts, and macros
- Not utilizing all of the protection features applicable to the antivirus product
- Testing while third-party security programs or mechanisms are actively working in conflict with the one that is mainly being tested
- Testing a program in a way that avoids its weaknesses (virus repair signatures, behavior blocking, and application whitelisting features)
You are wrong with most of your presumptions; nothing more to discuss from my point of view!

 

mlnevese

Level 26
Verified
Top Poster
Well-known
May 3, 2015
1,531
Yes, but no program will perform the same on every system; there are millions, if not billions, of different configurations, so expecting every piece of software to perform the same on every computer is ludicrous.

It probably conflicted with something on your machine. I have Bitdefender on my father-in-law's computer, a 2nd-generation i3, with no significant impact, and I've had ESET slow down one of my computers.

Statistically, all brands will slow down someone's computer due to a conflict with something. It's impossible to test all configurations, and that's why I always tell anyone weighing Software A vs. Software B to test each on their own computer for the full length of its trial and use the machine normally, instead of deciding by throwing random samples at it. This is the only way to check that it's not blocking something you need or creating a performance impact. It also gives you time to get used to the GUI and decide which one seems best for your needs.
 
Last edited:

upnorth

Moderator
Verified
Staff Member
Malware Hunter
Well-known
Jul 27, 2015
5,457
Interesting, IMO, that 13-year-old information is being used in this thread to try to prove a point. Sharing something fresher would help the discussion.

Another thing I'm going to weigh in on when it comes to the Hub here on MT. For those that don't know, the testers are not hired or even paid by anyone for their work and shared results. They do it in their own free time, 100% free! That said, it doesn't automatically make their work and shares less important. For example, the implementation of statistics reports is brilliantly done, as it gives anyone a better and easier overview, and I hope more testers will use it. And it's all shared on a security forum that is available and open to anyone. Yes, anyone can register at MT and join, and then, if wanted, request to join the testers group. A pretty different procedure compared to genuine AV labs. :rolleyes:

Also, there is something on this site/forum called "Disclaimers!". It's crucially important to understand them, and actually trying to read them would help. If you can't understand something, simply ask; I'm pretty sure the Hub testers and the staff will do their best to help.
 

Tiamati

Level 12
Thread author
Verified
Top Poster
Well-known
Nov 8, 2016
574
Ty for your answers, guys. That was the kind of discussion I was expecting. A lot of different opinions... I think that is much better than if everyone thought the same.

About the Hub, I think there is some room for improvement in the way the results are organized. Maybe if someone were responsible for collecting and organizing the results in a single sheet, it would be much easier to interpret them. I took a look at some Bitdefender tests, for example, and it took some time to understand the idea behind the results, mostly because they were poorly explained and because it was difficult to compare them with other AVs on the Hub. I believe there are a lot of guys like me who have had some trouble understanding the results. BTW, I have to agree that the tests should be implemented considering a vector of "inoculation". Anyway, I understand that those tests are completely voluntary, so I'm not "judging" anyone... I'm just suggesting. Maybe you could create a permanent topic on the Hub so people could contribute more ideas.
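The single-sheet idea above could start very simply. Here is a Python sketch that rolls per-test records into one CSV; the products, column names, and numbers are made up for illustration, not real Hub data:

```python
import csv

# Hypothetical per-product records, as a Hub tester might summarize one round.
results = [
    {"product": "AV-A", "static_detected": 18, "dynamic_detected": 1, "total": 20},
    {"product": "AV-B", "static_detected": 15, "dynamic_detected": 3, "total": 20},
]

with open("hub_results.csv", "w", newline="") as fh:
    writer = csv.DictWriter(
        fh, fieldnames=["product", "static_detected", "dynamic_detected", "total", "missed"]
    )
    writer.writeheader()
    for row in results:
        # Samples neither flagged on static scan nor caught on execution.
        row["missed"] = row["total"] - row["static_detected"] - row["dynamic_detected"]
        writer.writerow(row)
```

One shared sheet like this would make side-by-side comparison across products far easier than digging through individual test threads.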

Finally, I see a lot of people being skeptical about AV lab tests - and so am I - but unless we have some real information that makes a test suspect (like the one posted about Virus Bulletin), I see no reason to distrust every single lab. For example, I agree that AV-Comparatives is tendentious when basically no one had bad results, but that doesn't mean every other aspect of the test is garbage and can't be analyzed.
 
  • Like
Reactions: RKRN3

ForgottenSeer 823865

For example, I agree that AV-Comparatives is tendentious when basically no one had bad results, but that doesn't mean every other aspect of the test is garbage and can't be analyzed.
You can't analyze them because they refuse to share the names and dates of their samples with the public, so we just have to trust them? No thanks...
I asked as an average Joe; they refused, pretexting some BS like "we want to avoid infections propagating". Then I asked as an AV representative; they said I have to go via "official" channels... yeah... sure...
 

Solarquest

Moderator
Verified
Staff Member
Malware Hunter
Well-known
Jul 22, 2014
2,525
I agree with Silversurfer and Upnorth, I already answered before in other posts, so I won't repeat it here again.

I'll just add a few thoughts.
What's a perfect test? What's a good one? Can you please describe it?
Is it a test of AVs done as soon as malware is created and found or a little bit later? How much later?
Best would probably be if all possible attack vectors were tested and many, many samples were used, not only a few.
Great would be if all AVs could be tested at the same time, with thousands of different samples, attacking the system in all possible ways to test all defense mechanisms of all AVs... The test should probably also be repeated after a few hours to check how fast AVs update signatures, then after one day, two days... one week... Best on real machines, not in VMs.
Did I forget something?
Is this possible? Can we do it? Can Labs do it? Can you do it? I don't think so...so I agree that all test results are imperfect.
We test for free, using our own systems, free time, and money; as usual, we do our best to upload new samples with low detection and to post test results: what and how many samples were detected on static scan; what, how many, and how samples were detected on run; were files encrypted; was the system infected? We try to do it as often and as well as possible.
Is this perfect or complete? No, but over time we definitely show some interesting and important points (e.g. which AVs had better static detection, which had better dynamic detection, which are better against ransomware, which against .exe, etc.)... of course considering we tested new samples, with low detection, already on our HDs.
As usual, I welcome constructive ideas and invite new testers and malware hunters to join the HUB.
 
Last edited:

ForgottenSeer 823865

I'll just add a few thoughts.
Is it a test of AVs done as soon as malware is created and found or a little bit later? How much later?
Security experts and professional malware researchers deploy several honeypots, so they get the freshest malware.
How does the Hub get them? By grabbing them from some malware-sample-sharing sites?

Best would probably be if all possible attack vectors were tested and many, many samples were used, not only a few.
Great would be if all AVs could be tested at the same time, with thousands of different samples, attacking the system in all possible ways to test all defense mechanisms of all AVs... The test should probably also be repeated after a few hours to check how fast AVs update signatures, then after one day, two days... one week... Best on real machines, not in VMs.
That is obvious; otherwise the test won't be comprehensive. A proper test must consider the components of the security solution.

Is this possible? Can we do it? Can Labs do it? Can you do it? I don't think so...so I agree that all test results are imperfect.
It is possible; it is just a question of financial resources. Labs can do it easily (after all, they grab a lot of cash from vendors); the Hub could even do it, though less easily.

1- Establish a common detailed methodology and procedure, adapted to the tested security solutions
- Real system, chosen OS,​
- Order to execute the samples,​
- Restore a backup or roll back the system after each execution (no one in the world will get infected by 20+ malware samples running one after the other...)​
- Etc...​
2- Select one volunteer per security solution.
3- Give each volunteer the same fresh malware pack.
4- Via video/audio conference, all volunteers execute the procedure simultaneously under the supervision of a "test leader".
etc...etc..

Not so complicated. Then the Hub would be closer to a "professional" testing environment.
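As a thought experiment, the coordination in steps 1-4 could be captured in a tiny data structure so the "test leader" can check that a session is ready before anyone detonates a sample. This is only a sketch; all names and fields are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class TestSession:
    os_image: str                 # the common real-system baseline (step 1)
    sample_hashes: list[str]      # the shared fresh pack, identified by hash (step 3)
    execution_order: list[str]    # agreed order in which samples are run (step 1)
    volunteers: dict[str, str] = field(default_factory=dict)  # product -> tester (step 2)

    def ready(self) -> bool:
        """Start only with at least one product and a distinct tester per product."""
        return bool(self.volunteers) and len(set(self.volunteers.values())) == len(self.volunteers)
```

The one-tester-per-product constraint is the key design choice: it is what lets every product face the same samples, on the same schedule, in parallel.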

We test for free, using our own systems, free time, and money; as usual, we do our best to upload new samples with low detection and to post test results: what and how many samples were detected on static scan; what, how many, and how samples were detected on run; were files encrypted; was the system infected? We try to do it as often and as well as possible.
Is this perfect or complete? No, but over time we definitely show some interesting and important points (e.g. which AVs had better static detection, which had better dynamic detection, which are better against ransomware, which against .exe, etc.)... of course considering we tested new samples, with low detection, already on our HDs.
As usual, I welcome constructive ideas and invite new testers and malware hunters to join the HUB.
If you claim you do it as a hobby and you have some limitations, no problem.



Just my opinion.
 
Last edited by a moderator:

Divine_Barakah

Level 29
Verified
Top Poster
Well-known
May 10, 2019
1,854
I do not rely on any tests to choose my AV product; I install trials and consult the awesome members here to choose. I really like what goes on here in the Hub, but I have one concern: some products are tested for two weeks, some for one, and some for three. The whole idea of testing in the Hub is great, but what about testing four products using the same samples at the same time? I mean, 4 members start a session and begin testing at the same time, using the same samples but with different products. This, I believe, would be more reliable, but I am sure it is not easy to do.

I am also quite sure that testing in a VM is not reliable enough. I could be wrong, though.
 
  • Like
Reactions: Tiamati

ForgottenSeer 823865

The whole idea of testing in the Hub is great, but what about testing four products using the same samples at the same time? I mean, 4 members start a session and begin testing at the same time, using the same samples but with different products. This, I believe, would be more reliable, but I am sure it is not easy to do.
I gave an idea for a procedure in my post above. It is not complicated; it just needs to be organized.
 

Tiamati

Level 12
Thread author
Verified
Top Poster
Well-known
Nov 8, 2016
574
It is possible; it is just a question of financial resources. Labs can do it easily (after all, they grab a lot of cash from vendors); the Hub could even do it, though less easily.

1- Establish a common detailed methodology and procedure, adapted to the tested security solutions
- Real system, chosen OS,​
- Order to execute the samples,​
- Restore a backup or roll back the system after each execution (no one in the world will get infected by 20+ malware samples running one after the other...)​
- Etc...​
2- Select one volunteer per security solution.
3- Give each volunteer the same fresh malware pack.
4- via video/audio conference, all volunteers execute the procedure simultaneously under the supervision of a "test leader".
etc...etc..

Not so complicated. Then the HUB will be closer to a "professional" testing environment.

I endorse the idea! It would be awesome if it could be put into practice.
 
  • Like
Reactions: Divine_Barakah

artek

Level 5
Verified
May 23, 2014
236
Security experts and professional malware researchers deploy several honeypots, so they get the freshest malware.
How does the Hub get them? By grabbing them from some malware-sample-sharing sites?


That is obvious; otherwise the test won't be comprehensive. A proper test must consider the components of the security solution.


It is possible; it is just a question of financial resources. Labs can do it easily (after all, they grab a lot of cash from vendors); the Hub could even do it, though less easily.

1- Establish a common detailed methodology and procedure, adapted to the tested security solutions
- Real system, chosen OS,​
- Order to execute the samples,​
- Restore a backup or roll back the system after each execution (no one in the world will get infected by 20+ malware samples running one after the other...)​
- Etc...​
2- Select one volunteer per security solution.
3- Give each volunteer the same fresh malware pack.
4- via video/audio conference, all volunteers execute the procedure simultaneously under the supervision of a "test leader".
etc...etc..

Not so complicated. Then the Hub would be closer to a "professional" testing environment.


If you claim you do it as a hobby and you have some limitations, no problem.
So just put a BIG disclaimer notifying members/visitors that the Hub is not as accurate/developed as test labs, since the tests are made by amateurs, and that its purpose is just to provide a different source of information.


Just my opinion.

+1 to most of this. Also, testing the samples hours, or sometimes even days, apart is generally unfair to the products that ran the samples earliest.
 
