AV-Comparatives - False Alarm Test March 2019

Andy Ful

The test did not take the prevalence of the clean samples into account. If I remember correctly, sample prevalence is counted in SE Labs tests.
Good to see Emsisoft and G Data staying away from these tests.
No doubt about Trend Micro's false positives: their hash-based user-prevalence technique is taking a toll on them, just as it did for F-Secure in the past.
Symantec would likely sit at the same (FP) level as TM if it had participated.
Panda is outside my criteria anyway.
 
TotalDefense doesn't have its own antivirus engine. It's highly unlikely that Panda has gotten that bad over the years. Also, no Vipre, Webroot, ZoneAlarm and QuickHeal?
 
Good to see WD with lower false positives. It used to have 100+ false positives in AV-Comparatives tests.
The test overestimates the false-positive rate because sample prevalence was not taken into account. For example, Avast had 15 false positives and McAfee 9, but Avast's false positives will be seen by only a few thousand users, compared with hundreds of thousands of users for McAfee's.
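
A minimal sketch of that weighting idea (the prevalence buckets and user counts below are assumptions for illustration only, not AV-Comparatives' or SE Labs' actual methodology):

```python
# Sketch: weighting false positives by how many users encounter the flagged file.
# Bucket labels and user counts are hypothetical, chosen only to mirror the
# "few thousand vs. hundreds of thousands" argument above.

avast_fps  = ["low"] * 15    # 15 FPs, assumed to hit low-prevalence files
mcafee_fps = ["high"] * 9    # 9 FPs, assumed to hit high-prevalence files

USERS_PER_BUCKET = {"low": 2_000, "high": 300_000}  # assumed user counts per bucket

def weighted_impact(fps):
    """Total number of users affected across all false positives."""
    return sum(USERS_PER_BUCKET[bucket] for bucket in fps)

print("Avast impact :", weighted_impact(avast_fps))   # 30,000 affected users
print("McAfee impact:", weighted_impact(mcafee_fps))  # 2,700,000 affected users
```

On raw counts McAfee looks better (9 vs. 15), but once each false positive is weighted by how many users actually see the flagged file, the impact ranking can flip.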