The use of AV-labs: to provide information on how products perform and protect so AV-buyers and freeware AV-users can make informed decisions.
This is so true, but sadly, in the real world people don't take it that way. For them, a 1% difference matters so much that they are ready to antivirus-hop.
I like it when the tests reveal enough information about their samples that we can draw meaningful real-world conclusions.

All these tests are designed merely as a guide, but with the differences between products having melted away by now (not like long ago, when Symantec scored 98.9% and McAfee 81.6%), they can no longer serve even as that.
It's like when you buy a fridge: you have the volume in litres, but you also have other factors to consider.
When considering which software to use in the long run, there are other factors too, such as performance, absence of issues, and the frequency and design of alerts. I personally wouldn't use an antivirus that slowed me down or bothered me every 10 minutes, even if it were triple-engine with Kaspersky, Bitdefender and Avast.
Same as in my test, unfortunately.

Even recently, @harlan4096's tests showed that F-Secure had repeated blind spots against scriptors, as well as broad whitelisting of legitimate processes that could be abused (Python, Node.js, etc).
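To make that last point concrete, here is a minimal, harmless sketch (my own illustration, not taken from any of the tests mentioned) of why blanket-whitelisting an interpreter is risky: if a product trusts python.exe by name or signature alone, it never examines what the interpreter is actually asked to run, so any script, benign or not, inherits that trust.

```python
import subprocess
import sys

# Harmless stand-in for what a script-based attack would do: hand arbitrary
# code to an interpreter the AV has already decided to trust. Only the
# whitelisted process (the Python interpreter) shows up in process lists;
# the actual payload is just a command-line argument.
payload = "print('this could be anything the author of the script wants')"

# A product that whitelists the interpreter without inspecting the command
# line or the resulting behaviour lets this content through unexamined.
subprocess.run([sys.executable, "-c", payload])
```

The same pattern applies to node -e, PowerShell one-liners and similar interpreters, which is why per-process whitelisting without command-line or behavioural inspection leaves exactly that gap.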
You can only go so far relying on signatures. Then there was the behavioural blocker, which was amazing while it still had a slider to adjust its aggressiveness, but it didn't take long for them to kill that, along with the IDS adjustment.

For example, years ago this community was one of the first to start noting how BitDefender's signatures lagged for zero-day protection.