Disclaimer
  1. This test shows how an antivirus behaves with certain threats, in a specific environment and under certain conditions.
    We encourage you to compare these results with others and make informed decisions about which security products to use.
    Before buying an antivirus you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.

blackice

Level 28
Verified
Just thinking... how about a BB/AI test without an internet connection and only the BB component active (AV component disabled)? It would be interesting to see how different security software performs. I know... some security software doesn't have a separate BB component.
A lot of YouTube testers try to achieve this, but it causes problems: many vendors' components work together. For example, when real-time protection is disabled in ESET, the HIPS component is not fully functional, so it fails to detect certain aspects of malware it would normally be able to detect.
 

blackice

Level 28
Verified
As which? I don't think Bitdefender is going downhill, being the product of the year in 2019.
I was just giving an example of what information one can gather from such testing. The good products tend to perform consistently, including Bitdefender (and I really like the 2020 version). In my example, if a product were to consistently or regularly perform below the average, that could provide useful information. My point is only that this testing is a valuable piece in a larger data set for making a decision.
 
Last edited:

blackice

Level 28
Verified
Maybe 5 years ago. Nowadays every test is essentially the same. Not a single product has pushed past 5% compromised over a year-long period since 2015. Hell, even on a month-by-month basis nothing's gone above 3% since 2018.
Yeah, but if a product is compromised more than 2-3% when all other vendors have no issues that’s kind of a glaring negative mark.
 

Arequire

Level 26
Verified
Content Creator
Yeah, but if a product is compromised more than 2-3% when all other vendors have no issues that’s kind of a glaring negative mark.
Even then it's only evidence of a downturn against the specific samples used by the testing lab. They have no relevance to how someone's antivirus is going to react to any malware they encounter in the wild.
Not to mention the samples used are at least 15 days out of date and malware authors would've altered these samples to hell and back by the time these tests are published.

Edit: I changed "months out of date" to "at least 15 days out of date" based on the date of the test being published. Although the samples used in February are still classified as months out of date.
 

blackice

Level 28
Verified
Yes, how about a situation when there's no internet connection? Then the BB comes in. According to YouTube, the GData and Emsisoft BBs work wonders in this situation.
My apologies, I thought you meant without any signatures as well as offline. I believe the AV-Comparatives offline detection test does actually give some hints. Windows Defender is terrible offline. But it can’t separate the companies with good, regularly updated signatures from those with good BB. I think the best test for good BBs is zero-day malware, as we see in the Hub, and ransomware variants that aren’t prevalent.
 

blackice

Level 28
Verified
Even then it's only evidence of a downturn against the specific samples used by the testing lab. They have no relevance to how someone's antivirus is going to react to any malware they encounter in the wild.
Not to mention the samples used are months out of date and malware authors would've altered these samples to hell and back by the time these tests are published.
This is very true. Which is why I think it should only be used as one data point of many.
 

Andy Ful

Level 64
Verified
Trusted
Content Creator
...
The average users I know never install software.
So they cannot be average.:)
But seriously, the false positives are not only related to blocked executables:
"False Positive (False Alarm) Test Result
The false-alarm test in the Real-World Protection Test consists of two parts: wrongly blocked domains (while browsing) and wrongly blocked files (while downloading/installing). It is necessary to test both scenarios because testing only one of the two above cases could penalize products that focus mainly on one type of protection method, either URL filtering or on-access/behaviour/reputation-based file protection."
 

ErzCrz

Level 7
Verified
I prefer some of the avlab.pl and checklab.pl results, but all of these are just tools to help people decide. AV-Comparatives does at least offer a good variety. I'm just not certain about av-test.org, as the results are a bit hit and miss and less detailed.


 

Andy Ful

Level 64
Verified
Trusted
Content Creator
Bitdefender is not good in THIS test compared to Microsoft.
This is normal due to statistical error. About 3 new malware threats are created per second, so daily you have about 260 000 new samples. For a 2-month test, that is roughly 15 million samples. Even if 99.9% of samples are polymorphic or similar, that still leaves about 15 000 sufficiently different samples (this is an example; the real number is not known).
There were only 372 samples in the test (~2.5% of the samples from my example). Testing something on 2.5% of samples (preselected from 15 million) gives you a statistical error, which is probably greater than the difference in detection between Bitdefender and Microsoft.(y)
If I remember the AV-Comparatives document about test reliability correctly, sufficient reliability can only be obtained by testing several thousand malware samples.
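The back-of-the-envelope arithmetic above can be sanity-checked with a short sketch. The 3-per-second rate, the 60-day window, and the 0.1% "sufficiently different" share are the post's illustrative assumptions, not measured figures, and the standard-error line is a rough extra illustration of why 372 samples leaves a margin of error comparable to the gaps between top products:

```python
# Back-of-the-envelope check of the sampling argument above.
# Assumptions (illustrative, taken from the post, not measured figures):
# 3 new threats/second, a 60-day test window, and 99.9% of samples
# being polymorphic near-duplicates of one another.
NEW_PER_SECOND = 3
SECONDS_PER_DAY = 24 * 60 * 60
TEST_DAYS = 60
DISTINCT_FRACTION = 0.001    # the 0.1% "sufficiently different" share
TESTED_SAMPLES = 372         # samples actually used in the test

daily = NEW_PER_SECOND * SECONDS_PER_DAY      # 259,200 -> "about 260 000"
total = daily * TEST_DAYS                     # 15,552,000 -> "about 15 mln"
distinct = total * DISTINCT_FRACTION          # 15,552 -> "about 15 000"
coverage = TESTED_SAMPLES / distinct          # ~2.4% of the distinct pool

# Rough binomial standard error of a ~99% protection rate measured on
# 372 samples: about half a percentage point, i.e. on the order of the
# differences between top products in such tests.
se = (0.99 * 0.01 / TESTED_SAMPLES) ** 0.5

print(f"daily={daily:,} total={total:,} distinct={distinct:,.0f} "
      f"coverage={coverage:.1%} se={se:.2%}")
```

Running it reproduces the post's figures: roughly 260 000 samples per day, about 15.5 million over two months, about 15 500 distinct ones, of which the 372 tested samples cover only ~2.4%.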

Edit.
From the AV-TEST report (Facts & Analyses on the Threat Scenario: The AV-TEST Security Report 2019/2020), there are currently 4.3 new malware samples per second, and about 78% target Windows systems.
About 3 500 sufficiently new ransomware files were found in the period of this AV-Comparatives test (exactly 5 225 in Q1 2020, according to Kaspersky).
In the first half of 2019, SonicWall's Real-Time Deep Memory Inspection (RTDMI) technology unveiled 74,360 ‘never-before-seen’ malware variants.
 
Last edited:

MacDefender

Level 11
Verified
I made a comment on a previous test about the relevance of these false positive tests. Last time around, it included such fun relics as the DVD copy of Microsoft Encarta, last sold 12 years ago. False positives can be really painful, but today I'm much more worried about false positives where critical Windows files get removed, or popular games' anti-cheat engines get flagged, etc. The last time I looked through the examples of false positives that F-Secure and some of the other allegedly FP-heavy engines flagged, I did not recognize any of the software.
 