How the hell WD works on Windows Home & Pro?
<blockquote data-quote="Andy Ful" data-source="post: 915198" data-attributes="member: 32260"><p><strong><span style="font-size: 18px">How strong is WD (default settings) in the Home environment?</span></strong></p><p></p><p>WD was is tested for a long time by professional AV testing labs (home or consumer reports):</p><p>AV-Comparatives (Real-world, Malware Protection): <a href="https://www.av-comparatives.org/consumer/test-results/" target="_blank">Test Results - AV-Comparatives (av-comparatives.org)</a></p><p>AV-Test: <a href="https://www.av-test.org/en/antivirus/home-users/" target="_blank">Home users (av-test.org)</a></p><p>Se Labs: <a href="https://selabs.uk/consumer/" target="_blank">SE Labs</a></p><p></p><p>One should not be excited or disappointed by the results of the particular test because any such test has a strong random factor. This is noted for example in the AV-Comparatives test methodology:</p><p>"<em>Our tests use much more test cases (samples) per product and month than any similar test performed by other testing labs. Because of the higher statistical significance this achieves, we consider all the products in each results cluster to be equally effective, assuming that they have a false-positives rate below the industry average</em>."</p><p><a href="https://www.av-comparatives.org/real-world-protection-test-methodology/" target="_blank">Real-World Protection Test Methodology - AV-Comparatives (av-comparatives.org)</a></p><p>As can be seen from the reports, the best AVs are usually in the first cluster (10 Avs or more), so they can be equally effective on malware in-the-wild (despite the differences in the particular test).</p><p></p><p>So, the best method is to gather the results of 4 types of tests (AV-Comparatives Real-World, AV-Comparatives Malware Protection, AV-Test, and SE Labs) for a long period. I did it for the period <strong><span style="color: rgb(0, 168, 133)">April 2018-June (October) 2020</span></strong>. Here are the results (1-9 places) for AVs that participated in all these tests (with exception of Bitdefender):</p><p></p><p><strong><span style="color: rgb(41, 105, 176)">AV-Comparatives Real-World (July 2018 - October 2020)</span></strong></p><p>1. TrendMicro, 2. F-Secure, 3. Norton, 4. Avira Pro, 5. Bitdefender, 6. Kaspersky, 7. Microsoft, 8. Avast, 9. McAfee</p><p></p><p><strong><span style="color: rgb(41, 105, 176)">AV-Comparatives Malware Protection (September 2018 - September 2020)</span></strong></p><p>1. Avast, 2. Norton, 3. Bitdefender, 4. Avira Pro, 5. Microsoft, 6. Kaspersky, 7. F-Secure, 8. McAfee, 9. Trend Micro</p><p></p><p><strong><span style="color: rgb(41, 105, 176)">AV-Test <strong>(June 2018 - June 2020)</strong></span></strong></p><p>1. Norton, 2. Kaspersky, 2. Trend Micro, 4. Bitdefender, 4. F-Secure, 6. Avira Pro, 7. Avast, 7. Microsoft, 9. McAfee,</p><p></p><p><strong><span style="color: rgb(41, 105, 176)">SE Labs (April 2018 - June 2020)</span></strong></p><p>1. Norton, 2. Kaspersky, 3. Trend Micro, 3. F-Secure, 5. Microsoft, 6. Avira Free, 7. *Bitdefender, 8. McAfee, 9. Avast</p><p>Bitdefender participated only in one SE Labs test and missed 5 samples. 
The first 5 AVs never missed more than 3 samples in any SE Labs test.

AV-Comparatives - Consumer Real-World Protection Test July-October 2020 | MalwareTips Community: https://malwaretips.com/threads/consumer-real-world-protection-test-july-october-2020.105179/post-914261
AVLab.pl - Microsoft Defender - pros and cons (November 2020) | MalwareTips Community: https://malwaretips.com/threads/microsoft-defender-pros-and-cons-november-2020.105069/post-913427
AVLab.pl - Microsoft Defender - pros and cons (November 2020) | MalwareTips Community: https://malwaretips.com/threads/microsoft-defender-pros-and-cons-november-2020.105069/post-914023

It is hard to do proper statistics here, so let's use the simplest measure, the average placement across the four tests:

April 2018 - June (October) 2020, final list over the 4 types of tests (averages rounded, +/- 0.25):
Norton ................... (3+2+1+1)/4 ~ 2
Trend Micro ...........(1+9+2+3)/4 ~ 4
F-Secure .................(2+7+4+3)/4 = 4
Kaspersky ..............(6+6+2+2)/4 = 4
* Bitdefender ..........(5+3+4+7)/4 ~ 5
Avira .......................(4+4+6+6)/4 = 5
Microsoft ...............(7+5+7+5)/4 = 6
Avast ......................(8+1+7+9)/4 ~ 6
McAfee ..................(9+8+8+8)/4 ~ 8
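To make the averaging easy to reproduce, here is a minimal plain-Python sketch; the dictionaries simply transcribe the placements listed above (the Avira Pro/Free entries are treated as one "Avira" product, and the variable names are mine):

[CODE=python]
# Average placement across the four test series quoted above (1 = best place).
real_world   = {"Trend Micro": 1, "F-Secure": 2, "Norton": 3, "Avira": 4, "Bitdefender": 5,
                "Kaspersky": 6, "Microsoft": 7, "Avast": 8, "McAfee": 9}
malware_prot = {"Avast": 1, "Norton": 2, "Bitdefender": 3, "Avira": 4, "Microsoft": 5,
                "Kaspersky": 6, "F-Secure": 7, "McAfee": 8, "Trend Micro": 9}
av_test      = {"Norton": 1, "Kaspersky": 2, "Trend Micro": 2, "Bitdefender": 4, "F-Secure": 4,
                "Avira": 6, "Avast": 7, "Microsoft": 7, "McAfee": 9}
se_labs      = {"Norton": 1, "Kaspersky": 2, "Trend Micro": 3, "F-Secure": 3, "Microsoft": 5,
                "Avira": 6, "Bitdefender": 7, "McAfee": 8, "Avast": 9}

tests = [real_world, malware_prot, av_test, se_labs]
# Mean of the four placements per product, printed from best to worst average.
averages = {av: sum(t[av] for t in tests) / len(tests) for av in real_world}
for av, avg in sorted(averages.items(), key=lambda item: item[1]):
    print(f"{av:<12} {avg:.2f}")
[/CODE]

Running it gives Norton 1.75, Trend Micro 3.75, F-Secure 4.00, Kaspersky 4.00, Bitdefender 4.75, Avira 5.00, Microsoft 6.00, Avast 6.25 and McAfee 8.25, which round to the places in the list above.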
It seems that even over a 2-year testing period and 4 types of tests, the differences between the most popular AVs are very small (most of them are grouped around 5th place). Furthermore, the second-best AV in the final scoring (Trend Micro) was last, by a wide margin, in the AV-Comparatives Malware Protection tests, and no AV was consistently better than Avast (second-to-last on the final list) across all 4 types of tests.
My personal opinion is that these tests cannot measure the real differences in malware protection (home environment) between the most popular AVs (all of them except Norton and McAfee), because the differences are too small. Probably only Norton and McAfee can be distinguished from the others, due to their consistently high (Norton) and low (McAfee) scores.

So, the answer for Microsoft is that WD's anti-malware protection for home users is as good as the protection of most AVs (free or Home versions). That does not mean that WD (on default settings) is as strong as, for example, Kaspersky (KIS participated in all of the tests). The advantage of some solutions (like KIS) can be seen in the business environment.

Edit1.
If we also include the false-positive rate, which is consistently highest for Norton and Trend Micro in the AV-Comparatives False Alarm tests, then the differences become even smaller.
For example, in the AV-Comparatives False Alarm tests from September 2018 to September 2020 (5 tests):

Number of false positives per test (and total):
Kaspersky......... 5 + 3 + 0 + 10 + 3 = 21
Bitdefender....... 9 + 6 + 7 + 7 + 6 = 35
Avira................. 2 + 4 + 1 + 24 + 8 = 39
Avast................ 5 + 15 + 7 + 15 + 10 = 52
F-Secure.......... 15 + 17 + 4 + 24 + 9 = 69
McAfee............ 35 + 9 + 2 + 25 + 10 = 81
Microsoft......... 32 + 8 + 13 + 9 + 21 = 83
Norton.............. 47 + 19 + 7 + 25 + 41 = 139
Trend Micro..... 40 + 81 + 14 + 1 + 5 = 141
False Alarm Tests Archive - AV-Comparatives: https://www.av-comparatives.org/testmethod/false-alarm-tests/

It is easy to see that Norton and Trend Micro, which have the best anti-malware scores, also have the worst false-positive rates.
It is also interesting that Trend Micro scored very poorly in the AV-Comparatives Malware Protection test in 2020, precisely when its false-positive rate was at its best (1+5 false positives). Conversely, when Trend Micro had stellar protection results, its false-positive rate was very high.
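If you want to re-check those totals, here is an equally minimal plain-Python sketch; the per-test counts are copied from the table above:

[CODE=python]
# False-positive counts per AV-Comparatives False Alarm test (Sep 2018 - Sep 2020),
# as quoted above; products are printed from fewest to most total false positives.
false_positives = {
    "Kaspersky":   [5, 3, 0, 10, 3],
    "Bitdefender": [9, 6, 7, 7, 6],
    "Avira":       [2, 4, 1, 24, 8],
    "Avast":       [5, 15, 7, 15, 10],
    "F-Secure":    [15, 17, 4, 24, 9],
    "McAfee":      [35, 9, 2, 25, 10],
    "Microsoft":   [32, 8, 13, 9, 21],
    "Norton":      [47, 19, 7, 25, 41],
    "Trend Micro": [40, 81, 14, 1, 5],
}
for av, counts in sorted(false_positives.items(), key=lambda item: sum(item[1])):
    print(f"{av:<12} {sum(counts):>4}")
[/CODE]

The two products at the bottom of this listing (Norton and Trend Micro) are exactly the two at the top of the protection ranking above, which is the trade-off described in this edit.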
[QUOTE="Andy Ful, post: 915198, member: 32260"] [B][SIZE=5]How strong is WD (default settings) in the Home environment?[/SIZE][/B] WD was is tested for a long time by professional AV testing labs (home or consumer reports): AV-Comparatives (Real-world, Malware Protection): [URL='https://www.av-comparatives.org/consumer/test-results/']Test Results - AV-Comparatives (av-comparatives.org)[/URL] AV-Test: [URL='https://www.av-test.org/en/antivirus/home-users/']Home users (av-test.org)[/URL] Se Labs: [URL='https://selabs.uk/consumer/']SE Labs[/URL] One should not be excited or disappointed by the results of the particular test because any such test has a strong random factor. This is noted for example in the AV-Comparatives test methodology: "[I]Our tests use much more test cases (samples) per product and month than any similar test performed by other testing labs. Because of the higher statistical significance this achieves, we consider all the products in each results cluster to be equally effective, assuming that they have a false-positives rate below the industry average[/I]." [URL='https://www.av-comparatives.org/real-world-protection-test-methodology/']Real-World Protection Test Methodology - AV-Comparatives (av-comparatives.org)[/URL] As can be seen from the reports, the best AVs are usually in the first cluster (10 Avs or more), so they can be equally effective on malware in-the-wild (despite the differences in the particular test). So, the best method is to gather the results of 4 types of tests (AV-Comparatives Real-World, AV-Comparatives Malware Protection, AV-Test, and SE Labs) for a long period. I did it for the period [B][COLOR=rgb(0, 168, 133)]April 2018-June (October) 2020[/COLOR][/B]. Here are the results (1-9 places) for AVs that participated in all these tests (with exception of Bitdefender): [B][COLOR=rgb(41, 105, 176)]AV-Comparatives Real-World (July 2018 - October 2020)[/COLOR][/B] 1. TrendMicro, 2. F-Secure, 3. Norton, 4. Avira Pro, 5. Bitdefender, 6. Kaspersky, 7. Microsoft, 8. Avast, 9. McAfee [B][COLOR=rgb(41, 105, 176)]AV-Comparatives Malware Protection (September 2018 - September 2020)[/COLOR][/B] 1. Avast, 2. Norton, 3. Bitdefender, 4. Avira Pro, 5. Microsoft, 6. Kaspersky, 7. F-Secure, 8. McAfee, 9. Trend Micro [B][COLOR=rgb(41, 105, 176)]AV-Test [B](June 2018 - June 2020)[/B][/COLOR][/B] 1. Norton, 2. Kaspersky, 2. Trend Micro, 4. Bitdefender, 4. F-Secure, 6. Avira Pro, 7. Avast, 7. Microsoft, 9. McAfee, [B][COLOR=rgb(41, 105, 176)]SE Labs (April 2018 - June 2020)[/COLOR][/B] 1. Norton, 2. Kaspersky, 3. Trend Micro, 3. F-Secure, 5. Microsoft, 6. Avira Free, 7. *Bitdefender, 8. McAfee, 9. Avast Bitdefender participated only in one SE Labs test and missed 5 samples. The first 5 AVs never missed more than 3 samples in any SE Lab test. 
Edit3.
Corrected the error in the AV-Test Kaspersky scoring and added the scores of the AV-Comparatives Real-World tests from September 2018, to be closer to the testing period of the other tests.