Andy Ful
How big are your chances to be infected?
My idea is simple. First, I gather data from the last two years on the samples missed by popular AVs and compare them with the total number of samples. Next, I calculate the chances of getting infected over the next 10 years, assuming that the user sees his/her AV block one Real-World (0-day) sample and three Malware Protection (non-0-day) samples per year.
Users who encounter such malware more frequently (happy clickers) have to scale these chances up proportionally.
In this post, I have in mind the most important infection vectors tested by the popular AV labs (AV-Comparatives, AV-Test, SE Labs). I included the tests from 2019 and 2020 (until October) for the popular AVs (Home versions).
AV-Comparatives publishes two kinds of reports: Real-World and Malware Protection. The first covers web-based threats, and the second covers threats originating from USB drives and network drives.
AV-Test combines Real-World and Malware Protection results into one report, but the results can be separated (I will do this later in this post).
SE Labs publishes only Real-World-type test results.
I have used the (Real-World-------Malware Protection) notation to separate the Real-World and Malware Protection data.
Missed samples, 2019 to October 2020, all test types, sorted by the sum of missed samples (a short sketch reproducing this sort follows the list).
Sources: SE Labs, AV-Comparatives (Real-World and Malware Protection), AV-Test.
1.Norton (Symantec)........... (08------04)
2.F-Secure............................ (16------26)
3.Kaspersky Lab.................. (18------29)
4.Microsoft.......................... (36------25)
5.Avira.................................. (42------26)
6.Avast................................. (37------59)
7.McAfee............................. (76------37)
8.Trend Micro..................... (11------257)
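For reference, a minimal Python sketch (not part of the original post) that reproduces the ranking above by summing each product's Real-World and Malware Protection misses; the pairs are taken from the list:

```python
# A sketch only: reproduce the sorting by the sum of missed samples.
# Pairs are (Real-World misses, Malware Protection misses) from the list above.
missed = {
    "Norton (Symantec)": (8, 4),
    "F-Secure": (16, 26),
    "Kaspersky Lab": (18, 29),
    "Microsoft": (36, 25),
    "Avira": (42, 26),
    "Avast": (37, 59),
    "McAfee": (76, 37),
    "Trend Micro": (11, 257),
}

ranking = sorted(missed.items(), key=lambda item: sum(item[1]))
for rank, (av, (rw, mp)) in enumerate(ranking, start=1):
    print(f"{rank}. {av}: Real-World {rw}, Malware Protection {mp}, sum {rw + mp}")
```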
The horrible score for Trend Micro comes from the AV-Comparatives Malware Protection tests of 2020 (Malware Protection Test March 2020 and Malware Protection Test September 2020, www.av-comparatives.org).
The average number of missed samples in the Real-World tests: ~30 samples.
The total number of samples in the Real-World tests: ~7340 samples.
c1 ~ 10 * 1 * 30/7340 * 100% ~ 4% (10 years, one Real-World encounter a year)
The average number of missed samples in the Malware Protection tests: ~60 samples.
The total number of samples in the Malware Protection tests: ~213000 samples.
c2 ~ 10 * 3 * 60/213000 * 100% ~ 1% (10 years, three Malware Protection encounters a year)
So, the chance of getting infected in the next 10 years is close to 5% (the true chance is always smaller than c1 + c2).
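A minimal Python sketch of this linear estimate (not from the original post; the miss rates come from the rounded averages quoted above):

```python
# Linear (first-order) estimate of the 10-year infection chance,
# assuming 1 Real-World and 3 Malware Protection encounters per year.
years = 10
rw_per_year, mp_per_year = 1, 3

r = 30 / 7340      # per-encounter miss rate, Real-World tests
R = 60 / 213000    # per-encounter miss rate, Malware Protection tests

c1 = years * rw_per_year * r * 100   # ~4.1%
c2 = years * mp_per_year * R * 100   # ~0.8%
print(f"c1 = {c1:.2f}%, c2 = {c2:.2f}%, c1 + c2 = {c1 + c2:.2f}%")
```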
If one sees AV alarms more often than one 0-day and three widespread malware samples a year, then the chances have to be scaled up proportionally.
The c1 chance of a 0-day (web-based malware) infection can be decreased to 1% (or less) by using an AV with aggressive reputation checking (like Norton) or by using the Edge web browser (with SmartScreen and PUA protection enabled).
Edit 1.
The calculation details are included in the attachment: chances_to_be_infected.txt
Edit 2.
The above calculation method is suitable only for a sufficiently small number of years. The precise formula is slightly more complex:
c = {1 - [ (1 - r)^a * (1 - R)^b ]^n } * 100%
where r and R are the per-encounter miss rates in the Real-World and Malware Protection tests, a and b are the yearly numbers of Real-World and Malware Protection encounters, and n is the number of years.
In our example: a = 1, b = 3, r = 30/7340, R = 60/213000, n = 10, which gives c = 4.82%.
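A short Python check of this formula (not from the original post), using the same numbers:

```python
# Compounded infection chance over n years:
# each encounter is an independent chance for the AV to miss the sample.
def infection_chance(r, R, a=1, b=3, n=10):
    """r, R: per-encounter miss rates (Real-World, Malware Protection);
    a, b: encounters per year of each kind; n: number of years."""
    survive_one_year = (1 - r) ** a * (1 - R) ** b
    return (1 - survive_one_year ** n) * 100

print(f"{infection_chance(30 / 7340, 60 / 213000):.2f}%")  # prints 4.82%
```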
Edit 3.
This result could be refined by accounting for the growing number of new malware each year. In fact, however, the number of new malware samples per year appears roughly constant: according to AV-Test, it is about 140 million new samples a year.
Malware Statistics & Trends Report | AV-TEST (av-test.org)
Edit 4.
Sorting the AVs by the sum of missed samples is not an especially good idea, because such a sum is like adding apples to oranges. A more appropriate sorting is presented here:
How big are your chances to be infected? | MalwareTips Community