Modern protection without signatures – comparison test on real threats (Advanced In The Wild Malware Test)
<blockquote data-quote="Andy Ful" data-source="post: 977821" data-attributes="member: 32260"><p><strong><span style="font-size: 18px">How many unknown samples are included in AVLab tests?</span></strong></p><p></p><p>The question should be asked because most AVs usually miss 0 samples in AVLab tests.</p><p></p><p>It seems that a similar methodology to AVLab tests can be seen in AV-Comparatives Malware Protection tests. The samples are executed from the disk and AVs are on default settings.</p><p></p><p>Let's compare the results of Avast, Avira, Defender, and Malwarebytes in AV-Comparatives tests.</p><p></p><p>AV-Comparatives Malware Protection 2021 few-weeks-old prevalent samples (about 20 000 total samples)</p><p><span style="color: rgb(41, 105, 176)"><strong>Avast+Avira ............. 4+4 missed samples</strong></span></p><p>Defender ....................18 missed samples</p><p>Malwarebytes ........... 22 missed samples</p><p></p><p>We can also gather the results of Avast and Avira for a similar number of samples in AVLab tests:</p><p>AVLab (about 18 000 samples in 13 tests, January 2020 - January 2022)</p><p><span style="color: rgb(41, 105, 176)"><strong>Avast+Avira ............... 0+1</strong></span></p><p></p><p>Unfortunately, only Avast and Avira are commonly tested both by AVLab and AV-Comparatives.</p><p>The number of <span style="color: rgb(41, 105, 176)"><strong>missed samples</strong></span> strongly suggests that in the AVLab tests the unknown samples are very rare for Avira and Avast (even rarer than for few-weeks-old prevalent samples). The results of Avast and Avira are typical in AVLab tests for many tested AVs.</p><p></p><p>From the results of AV-Comparatives tests, it follows that the difference in protection between Defender and Malwarebytes is very small: about <span style="color: rgb(0, 168, 133)"><strong>4 samples per 20 000</strong></span>.</p><p>If we decrease the number of unknown samples, then we can expect that this difference in protection should be similarly small. When we have 0 unknown samples then both AVs will miss 0 samples.</p><p></p><p>Now, let's look at the current AVLab results (missed samples):</p><p>Malwarebytes ......... 1 per 1834 samples, or proportionally ~ 11 per 20 000 samples</p><p>Defender ............. 397 per 1834 samples, or proportionally ~ 4338 per 20 000 samples</p><p>The difference in protection: (4338-11) per 20 000 samples</p><p></p><p>The number of missed samples by Defender is horribly high compared to Malwarebytes, which is in clear contradiction to the results of AV-Comparatives.</p><p>Now the difference in protection is over <span style="color: rgb(184, 49, 47)"><strong>4000 samples</strong></span> and for AV-Comparatives it was only <span style="color: rgb(0, 168, 133)"><strong>4 samples</strong></span>.</p><p></p><p>Hmmm!!!</p><p>We can call this result truly anomalous!!! It is obvious that in the AVLab test many samples are unknown to Defender, even if they are known to Avast and Avira.</p><p></p><p><span style="color: rgb(184, 49, 47)"><strong>Summary.</strong></span></p><p><span style="color: rgb(184, 49, 47)"><strong>In the AVLab tests, we have three groups of AVs (at least). 
The number of samples missed by Defender is strikingly high compared to Malwarebytes, which clearly contradicts the AV-Comparatives results. Here the difference in protection is over 4,000 samples per 20,000, while for AV-Comparatives it was only 4 samples.

We can call this result truly anomalous. It is obvious that in the AVLab test many samples are unknown to Defender, even though they are known to Avast and Avira.

Summary. In the AVLab tests we have (at least) three groups of AVs:
- Group 1 (most of the tested AVs) - almost all samples are known to these AVs;
- Group 2 (Defender, Trend Micro) - some "dead" samples are unknown to these AVs;
- Group 3 (Comodo, Emsisoft, ...) - these AVs use file reputation, HIPS, etc., so they do not care much about "dead" samples.

In such a situation it is very possible that the missed samples are unknown to Defender (or Trend Micro) because they never hit these vendors' customers, and the vendors are simply slow to add such ("dead") samples to the cloud signature database or to the behavior-based detections in the cloud.

I added Trend Micro to Group 2 because it got a result similar to Defender's when tested by AVLab in May 2020:
https://avlab.pl/en/results-may-2020/

One could suspect that the AVLab tests are flawed, but somewhat similar behavior is also visible for Trend Micro in the AV-Comparatives tests:

AV-Comparatives Malware Protection 2021, Consumer Tests (missed samples)
........................ March ..... Sep
Avast ................... 1 .......... 3
Avira ................... 2 .......... 2
Bitdefender ............. 0 .......... 2
Eset ................... 10 ......... 18
Kaspersky ............... 4 ......... 10
Malwarebytes ............ 6 ......... 16
McAfee .................. 0 .......... 0
Microsoft .............. 15 .......... 3
Norton .................. 0 .......... 0
Panda ................... 2 .......... 7
Trend Micro ........... 103 ........ 121
Vipre ................... 0 .......... 2
Samples ............ 10,013 ..... 10,029

Over 200 missed samples per 20,000 is an anomalous result, but it is still many times lower than Defender's in the AVLab test (over 4,000 missed samples per 20,000). Furthermore, Defender missed only about 20 samples per 20,000 in the AV-Comparatives tests.
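The per-20,000 comparison in the previous paragraph can be reproduced directly from the table. A short Python sketch, assuming the combined March + September totals (10,013 + 10,029 = 20,042 samples):

```python
# Combine the March and September 2021 misses from the table above and
# rescale them to a per-20,000-sample basis for comparison with the AVLab figures.
misses = {"Microsoft (Defender)": 15 + 3, "Malwarebytes": 6 + 16, "Trend Micro": 103 + 121}
total_samples = 10_013 + 10_029  # combined sample count of both test runs

for name, missed in misses.items():
    print(f"{name}: {missed} missed, ~{missed / total_samples * 20_000:.0f} per 20,000")
# Output:
# Microsoft (Defender): 18 missed, ~18 per 20,000
# Malwarebytes: 22 missed, ~22 per 20,000
# Trend Micro: 224 missed, ~224 per 20,000
```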
One of the MT members (@McMcbrad) asked the Trend Micro staff about this behavior. The answer was that those missed samples were not important for the security of Trend Micro's customers. That is why I call such samples "dead". Trend Micro probably does not bother to quickly add signatures for relatively old "dead" samples, while Defender can add most of such signatures within one or two weeks. Avast, Avira, and probably many other AVs can handle such samples in the cloud very quickly (via fast signatures or behavior-based detections). So even when customers were infected in the wild, a Malware Protection test cannot reflect this in its results.

Why can there be many "dead" samples?
From Microsoft's reports it follows that most malware hits only once (over 90% of infections). After one shot the sample becomes "dead" and is not used in attacks anymore. AVs are very quick to add fast signatures or behavior-based detections in the cloud; in most cases a fast signature can be created within several minutes of a successful infection. Most often, the attackers then switch to a new polymorphic version of the malware, which has far better chances of infecting another computer. If an AV is slow to handle such "dead" samples, many of them can be counted as misses in Malware Protection tests.

Why are the "dead" samples not so important in Real-World tests?
Such tests use 0-day or 1-day malware, so the AV vendors have no time to process the "dead" samples. The problem of "dead" samples matters most for few-days-old malware and, to a lesser degree, for few-weeks-old malware; these are the samples used in Malware Protection tests.

Can the "dead" samples be reused?
I am not sure. If so, then probably in cracks and pirated software. But the chances of infection via "dead" samples are very low.

Does this mean that Defender on default settings is OK for businesses?
No. Defender with these settings is not good at fighting lateral movement, and this can be partially seen in the AVLab test. The "dead" but unknown samples can mimic the fresh unknown samples used in highly targeted attacks. Such malware samples can be executed at Level 3 as payloads without the MOTW, bypassing Defender's BAFS and SmartScreen. Other AVs can have better chances because they have additional features like Network Protection, file reputation lookup, HIPS, etc.

The AVLab test seems to mimic the business scenario only for Defender, when there are many "dead" samples and the BAFS feature is disabled. A similar situation may also apply to Comodo and Emsisoft (Group 3), but these AVs can block the unknown "dead" samples via file reputation lookup. The perfect results of several other AVs probably reflect only the fact that, within one or two days, they can complete their cloud detections by borrowing "dead" samples from other AVs or by hunting the payloads. So in a test with few-days-old samples, almost all the malware can be detected/blocked by the cloud backend (fast signatures + behavior-based detections).
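Since the presence or absence of the MOTW decides whether BAFS and SmartScreen even come into play (see also Edit 2 below), here is a minimal, hypothetical Python sketch of how a tester could check for, or apply, the Mark-of-the-Web on an NTFS volume. The sample path is purely illustrative, and this is not taken from AVLab's published methodology:

```python
# Minimal sketch (assumes Windows + NTFS): the MOTW lives in the "Zone.Identifier"
# alternate data stream of a downloaded file; ZoneId=3 means "Internet zone".
from pathlib import Path

SAMPLE = Path(r"C:\samples\sample.exe")   # hypothetical sample path
ADS = f"{SAMPLE}:Zone.Identifier"         # the ADS is addressed with a colon suffix

def has_motw() -> bool:
    """Return True if the sample carries an Internet-zone Mark-of-the-Web."""
    try:
        with open(ADS, "r") as stream:
            return "ZoneId=3" in stream.read()
    except OSError:
        return False                      # no Zone.Identifier stream at all

def apply_motw() -> None:
    """Write the Zone.Identifier stream, as a browser does when saving a download."""
    with open(ADS, "w") as stream:
        stream.write("[ZoneTransfer]\r\nZoneId=3\r\n")

if not has_motw():
    apply_motw()
```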
Post edited: added Group 3 to highlight the distinctive features based on file reputation, HIPS, etc.

Edit 1. I examined the other AVLab tests since 2020. Defender and Trend Micro missed several samples in some tests, and Defender blocked all samples in some others. So "dead" samples can probably explain why Defender and Trend Micro missed a handful of samples in those tests, but not the few hundred Defender misses in the current test.

Edit 2. My reasoning about unknown samples was based on the assumption that in the AVLab tests the samples were executed without the MOTW; this followed directly from the Defender results (BAFS did not work in the current test). However, the testing methodology states that the files must have the MOTW, so the BAFS issue must have another cause. In that case we must exclude Avast from these considerations, because for EXE files with the MOTW it uses CyberCapture (detonation in a sandbox), and this feature makes a big difference when comparing AVLab with AV-Comparatives. Fortunately, this does not change the main reasoning: the 4 samples missed by Avira in the AV-Comparatives tests are still slightly more than the 1 sample it missed in the AVLab tests. So the number of unknown samples in the AVLab tests is comparable to (and probably smaller than) the number in the AV-Comparatives tests.

See also:
https://malwaretips.com/threads/modern-protection-without-signatures-%E2%80%93-comparison-test-on-real-threats-advanced-in-the-wild-malware-test.112630/post-977993
[QUOTE="Andy Ful, post: 977821, member: 32260"] [B][SIZE=5]How many unknown samples are included in AVLab tests?[/SIZE][/B] The question should be asked because most AVs usually miss 0 samples in AVLab tests. It seems that a similar methodology to AVLab tests can be seen in AV-Comparatives Malware Protection tests. The samples are executed from the disk and AVs are on default settings. Let's compare the results of Avast, Avira, Defender, and Malwarebytes in AV-Comparatives tests. AV-Comparatives Malware Protection 2021 few-weeks-old prevalent samples (about 20 000 total samples) [COLOR=rgb(41, 105, 176)][B]Avast+Avira ............. 4+4 missed samples[/B][/COLOR] Defender ....................18 missed samples Malwarebytes ........... 22 missed samples We can also gather the results of Avast and Avira for a similar number of samples in AVLab tests: AVLab (about 18 000 samples in 13 tests, January 2020 - January 2022) [COLOR=rgb(41, 105, 176)][B]Avast+Avira ............... 0+1[/B][/COLOR] Unfortunately, only Avast and Avira are commonly tested both by AVLab and AV-Comparatives. The number of [COLOR=rgb(41, 105, 176)][B]missed samples[/B][/COLOR] strongly suggests that in the AVLab tests the unknown samples are very rare for Avira and Avast (even rarer than for few-weeks-old prevalent samples). The results of Avast and Avira are typical in AVLab tests for many tested AVs. From the results of AV-Comparatives tests, it follows that the difference in protection between Defender and Malwarebytes is very small: about [COLOR=rgb(0, 168, 133)][B]4 samples per 20 000[/B][/COLOR]. If we decrease the number of unknown samples, then we can expect that this difference in protection should be similarly small. When we have 0 unknown samples then both AVs will miss 0 samples. Now, let's look at the current AVLab results (missed samples): Malwarebytes ......... 1 per 1834 samples, or proportionally ~ 11 per 20 000 samples Defender ............. 397 per 1834 samples, or proportionally ~ 4338 per 20 000 samples The difference in protection: (4338-11) per 20 000 samples The number of missed samples by Defender is horribly high compared to Malwarebytes, which is in clear contradiction to the results of AV-Comparatives. Now the difference in protection is over [COLOR=rgb(184, 49, 47)][B]4000 samples[/B][/COLOR] and for AV-Comparatives it was only [COLOR=rgb(0, 168, 133)][B]4 samples[/B][/COLOR]. Hmmm!!! We can call this result truly anomalous!!! It is obvious that in the AVLab test many samples are unknown to Defender, even if they are known to Avast and Avira. [COLOR=rgb(184, 49, 47)][B]Summary. In the AVLab tests, we have three groups of AVs (at least). [/B][/COLOR] [LIST] [*][COLOR=rgb(184, 49, 47)][B]Group 1 (most of the tested AVs) - almost all samples are known to these AVs,[/B][/COLOR] [*][COLOR=rgb(184, 49, 47)][B]Group 2 (Defender, Trend Micro) - [S]many[/S] some "dead" samples are unknown to these AVs,[/B][/COLOR] [*][COLOR=rgb(184, 49, 47)][B]Group 3 (Comodo, Emsisoft,...) - these AVs use file reputation, HIPS, etc., so they do not care much about "dead" samples.[/B][/COLOR] [/LIST] In such a situation is very possible, that the missed samples are unknown to Defender(or Trend Micro), because they never hit the customers and these vendors are simply slow with adding such ("dead") samples to the cloud signature database or behavior-based detections in the cloud. I added Trend Micro to Group 2 because it got a similar result as Defender when tested by AVLab in May 2020. 
[URL unfurl="false"]https://avlab.pl/en/results-may-2020/[/URL] One could suspect that AVLab tests are flawed, but kinda similar behavior is also visible for Trend Micro in the AV-Comparatives tests: Consumer Tests AV-Comparatives Malware Protection 2021 ........................... March ..... Sep Avast ................... 1 ............ 3 Avira ................... 2 ............ 2 Bitdefender ......... 0 ............ 2 Eset .................... 10 .......... 18 Kaspersky ........... 4 ........... 10 Malwarebytes ..... 6 ........... 16 McAfee ............... 0 ............. 0 Microsoft ......... 15 ............. 3 Norton ................ 0 ............. 0 Panda ................. 2 ............. 7 [COLOR=rgb(184, 49, 47)][B]TrendMicro .... 103 .......... 121[/B][/COLOR] Vipre .................. 0 .............. 2 Samples .... 10013 ...... 10029 Over 200 missed samples per 20 000 total samples is an anomalous result, but still many times lower than for Defender in AVLab test (over 4000 missed samples per 20 000 total samples). Furthermore, Defender missed only about 20 samples per 20 000 total samples in AV-Comparatives tests.:unsure: One of the MT members (@McMcbrad) asked the Trend Micro staff about this behavior. The answer was that those missed samples were not important for the security of TrendMicro customers. That is why I called such samples "dead". TrendMicro probably does not care to add quickly the signatures of relatively old "dead" samples, and Defender can add most of such signatures in one or two weeks. Avast, Avira, and probably many AVs can manage such samples in the cloud very quickly (via fast signatures or behavior-based detections). So even when the customers were infected in the wild, the Malware Protection test cannot reflect this in the test results. [B]Why there can be many "dead" samples?[/B] From the Microsoft reports it follows that most malware can hit only once (over 90% infections). After one shot it becomes 'dead' and it is not used in the attacks anymore. The AVs are very quick to add fast signatures or behavior-based detections in the cloud. In most cases, the fast signature can be created in several minutes after the successful infection. Most often, the attackers use a new polymorphic version of malware which can give far more chances to infect another computer. If the AV is slow with managing such "dead" samples, many of them can be counted as misses in the Malware Protection tests. [B]Why the "dead" samples are not so important in the Real-World tests?[/B] Such tests use 0-day or 1-day malware, so the AV vendors have no time to manage the "dead" samples. The problem of "dead" samples is most important for a few days malware and in the less degree for a few weeks malware. Such malware samples are used in Malware Protection tests. [B]Can the "dead" samples be reused?[/B] I am not sure. If so then probably in cracks and pirated software. But, the chances of infections via "dead" samples are very low. [B]Does it mean that Defender on default settings is OK in businesses?[/B] No. Defender in these settings is not good at fighting lateral movement. This can be partially seen in the AVLab. The "dead" but unknown samples can mimic the [B][COLOR=rgb(184, 49, 47)]fresh unknown samples[/COLOR][/B] used in the highly targeted attacks. These malware samples can be executed at Level 3 as [B][COLOR=rgb(184, 49, 47)]payloads[/COLOR][/B] without the MOTW, bypassing Defender's BAFS and SmartScreen. 
Other AVs can have more chances, because they have got some additional features like Network Protection, file reputation lookup, HIPS, etc. [B][COLOR=rgb(0, 168, 133)]The AVLab seems to mimic the business scenario only for Defender when there are many "dead samples" and the BAFS feature is disabled. [/COLOR][/B]It is possible that a similar situation can be also for Comodo and Emsisoft (Group 3), but these AVs can block the unknown "dead" samples by the reputation file lookup. The perfect results for several other AVs can probably reflect only the fact that in one or two days, they can complete the detections in the cloud by borrowing "dead" samples from other AVs or hunting the payloads. So, in the test with few-days-old samples, almost all malware can be detected/blocked by the cloud backend (fast signatures + behavior-based detections). Post edited. Added Group 3 to underline the distinctive features based on file reputation, HIPS, etc. Edit 1. I examined other AVLab tests since the year 2020. Defender and Trend Micro missed several samples in some tests and Defender blocked all samples in some others. So, the "dead" samples can be probably a reason for missing several samples by Defender and Trend Micro in these tests, but not for a few hundred Defender misses in the current test.:( Edit 2. My reasoning about unknown samples was based on the assumption that in the AVLab tests the samples were executed without MOTW. This followed directly from the Defender results (BAFS did not work in the current test). Anyway from the testing methodology, it follows that files must have MOTW, so the BAFS issue was caused by another reason. In this case, we must skip the Avast from considerations because for EXE files with MOTW it uses CyberCapture (detonation in the sandbox). This feature makes a big difference when comparing AVLab with AV-Comparatives. Fortunately, this does not change the main reasoning - still the 4 samples missed in AV-Comparatives tests by Avira are slightly greater than 1 sample missed by this AV in AVLab tests. So, the number of unknown samples in AVLab tests is comparable (probably smaller) than in AV-Comparatives tests. See also: [URL unfurl="false"]https://malwaretips.com/threads/modern-protection-without-signatures-%E2%80%93-comparison-test-on-real-threats-advanced-in-the-wild-malware-test.112630/post-977993[/URL] [/QUOTE]