Security News AV-Comparatives Real-World Protection Test October 2018

RoboMan

Level 34
Thread author
Verified
Top Poster
Content Creator
Well-known
Jun 24, 2016
2,399
We all know these tests are mostly garbage. It's shared so we can discuss the results :)
The results are based on the test set of 192 live test cases (malicious URLs found in the field), consisting of working exploits (i.e. drive-by downloads) and URLs pointing directly to malware. Thus exactly the same infection vectors are used as a typical user would experience in everyday life. The test-cases used cover a wide range of current malicious sites and provide insights into the protection given by the various products (using all their protection features) while surfing the web.
Every month (from February to June and from July to November) we update the charts on our website showing the protection rates of the various tested products over the various months. The interactive charts can be found on our website. The chart below shows only the protection scores for the month of OCTOBER 2018 (192 test cases). The results of the false-positives test are also shown in the monthly factsheets/graph below.

[Screenshot: Real-World Protection Test October 2018 – Factsheet | AV-Comparatives]


Source: Real-World Protection Test October 2018 – Factsheet | AV-Comparatives
 

RoboMan

Level 34
Thread author
Verified
Top Poster
Content Creator
Well-known
Jun 24, 2016
2,399
BONUS: the performance test, where #1 is the lightest and #17 is the heaviest.

Guess who gets "the worst-optimized AV" for Windows? LOL
 

Attachments: [Screenshot: Performance Test October 2018 – AV-Comparatives]

509322

Anyone want to bet on how many pages this thread will grow to? :)

Some readers will say (because all they are looking at/for is 100%)...
  • Avast
  • AVG
  • Avira
  • BullGuard
  • Emsisoft
  • ESET
  • K7
  • Kaspersky
  • McAfee
  • QuickHeal
  • Symantec
  • Tencent
  • VIPRE


And your implied mockery is absolutely right on the money... these threads always turn into dramas. They start out as a joke right from the very beginning, given that the difference between the "winners" and the "losers" is typically 5% or less. So we are talking about 95% or better performance for the "loser." Then throw in fanboyism plus those who have a love interest and cannot control themselves... and it is game over. Thread locked. People warned and/or banned.
 

ChemicalB

Level 8
Verified
Sep 14, 2018
360
I wonder if the UAC and SmartScreen alerts are part of the constant high false-positive results for Microsoft, or should they be?
Yeah, good point. They tested WD, but yes, UAC and SmartScreen are part of the Security Center.
It is curious to see so many false positives; I think they've raised the detection sensitivity to improve detection, but certainly MS has to work on that.
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,040
Nobody here should believe that Windows Defender on default settings is better than most other AVs only because of the results of one AV-Comparatives test. So, let's look at the results from the last 6 months:
Bitdefender: (100+100+100+100+100+100)/6 = 100
Microsoft: (100+100+99.5 (+0.5)+100+100+99.5)/6 = 99.8 (+0.1 user dependent)
Kaspersky: (99+99.5+100+100+99.6+100)/6 = 99.7
Avast: (99.5+100+99.5+98.9+99.6+100)/6 = 99.6
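
For anyone who wants to check the arithmetic, here is a minimal Python sketch that reproduces those six-month averages. The monthly numbers are the ones listed above; treating the single "(+0.5)" month as user-dependent protection is my reading of the factsheets, not anything AV-Comparatives publishes in this form:

```python
# Six monthly Real-World Protection scores (May-Oct 2018), copied from the list above.
scores = {
    "Bitdefender": [100, 100, 100, 100, 100, 100],
    "Microsoft":   [100, 100, 99.5, 100, 100, 99.5],
    "Kaspersky":   [99, 99.5, 100, 100, 99.6, 100],
    "Avast":       [99.5, 100, 99.5, 98.9, 99.6, 100],
}
# Extra protection that depended on a user decision (Microsoft had one +0.5 month).
user_dependent = {"Microsoft": [0.5]}

for av, monthly in scores.items():
    avg = sum(monthly) / len(monthly)                      # plain 6-month average
    bonus = sum(user_dependent.get(av, [])) / len(monthly) # averaged user-dependent part
    note = f" (+{bonus:.1f} user dependent)" if bonus else ""
    print(f"{av}: {avg:.1f}{note}")
```

Running it prints 100, 99.8 (+0.1 user dependent), 99.7, and 99.6, matching the numbers above.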

Two questions.
Why didn't Kaspersky have a good result in May (99%)?
Probably a coincidence, or maybe not all modules worked as they should after the major update in April (it would not be the first time).

Why has Microsoft had consistently good results for some months? It may be a coincidence, or maybe it follows from adding scripts and macros to the "Block at first sight" feature (introduced in Windows 10 ver. 1607). If one looks at the Microsoft results from about a year ago (July 2017 - April 2018), then Microsoft would score 98.3 (99.9 including user-dependent actions).

Anyway, those results have nothing to do with the real protection of many users, because in the real world users are mostly infected by ignoring AV detections, running cracks or pirated software, etc. Those infection vectors cannot be properly measured by any AV lab.

Post edited - I had wrongly used ver. 1803 for "Block at first sight".
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,040
Every AV test can measure something, but it is hard to connect the detection results with users' real-world protection. Still, when analyzing the average test results over several months, one can see whether the detection of some AVs is getting better or worse compared to other AVs.

Unfortunately, there is no way to conclude whether the scoring differences are relevant for users. For example, any particular elephant is many times more likely to be injured by a small meteorite than any particular man (the man is a smaller target). But is that fact relevant to the elephant's health? No, because many times something 'very close to 0' is still 'very close to 0'.

When a user looks at a 99% AV score, he may think, "I have a 1% chance of getting infected within a month." But that is an illusion. Depending on the testing procedure and the user's activities, the real chance could just as well be 0.000000002045901% or 9.00234187%. Generally, it cannot even be measured properly.
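
To see why, here is a toy model (entirely my own sketch; the miss rate and the encounter counts are made-up assumptions). With the same 99% score, the monthly infection probability swings from practically zero to near-certainty depending on how much malware the user actually meets:

```python
# Toy model (my own illustration, not anything AV-Comparatives publishes):
# an AV with a 99% score misses 1% of samples. The monthly infection chance
# then depends entirely on N, the number of malicious samples the user
# actually encounters in a month: P = 1 - (1 - miss_rate)^N.
miss_rate = 0.01

for n in (0.001, 1, 10, 100):  # fractional N ~ "one encounter every few years"
    p = 1 - (1 - miss_rate) ** n
    print(f"{n:>7g} encounters/month -> {p * 100:.6f}% infection chance")
```

That prints roughly 0.001%, 1%, 9.6%, and 63.4%, which is exactly the point: the score tells you about the test set, not about your personal exposure.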
 
