Windows Defender Firewall Critique Part 2
<blockquote data-quote="Andy Ful" data-source="post: 1102442" data-attributes="member: 32260"><p>Unfortunately, there is no way to check whether the test's premise is sound. You have to trust the tester's authority. After reading the documentation of AMTSO (<a href="https://www.amtso.org/" target="_blank">AMTSO - Anti-Malware Testing Standards Organization - AMTSO</a>), I am ready to trust them. Running a test to AMTSO standards is extremely hard. I can also trust some well-known researchers. I have no reason to trust most enthusiasts (with a few exceptions).</p><p></p><p></p><p></p><p>That is normal. If you take the results of real-world tests from the leading AV testing labs, the top AVs miss on average about 1 in 500 samples. There are more than 400,000 new samples per day in the wild, so we have many unhappy people daily.</p><p></p><p></p><p></p><p>I think that the labels are sensible. The samples from the "Real-World" and "Malware Protection" tests are very different. The first category includes very fresh web-based samples. The second category includes older samples originating from non-web sources. The average infection rate of the fresh web-based samples is several times higher. The web-based samples are mainly morphed samples that are short-lived in the wild. 
Non-web malware can remain viable for weeks.</p><p></p><p>For example: </p><p><span style="color: rgb(0, 168, 133)"><strong>Avast </strong></span>"Real-World" infection rate: 10/6748 ~ 0.0015</p><p><span style="color: rgb(0, 168, 133)">Avast</span> "Malware Protection" infection rate: 15/270634 ~ 0.000055</p><p>[URL unfurl="true"]https://malwaretips.com/threads/the-best-home-av-protection-2021-2022.120387/[/URL]</p><p></p><p></p><p>I am unsure whether this is true, but one cannot rule out the possibility that AV vendors' influence on the AV testing labs makes the tests somewhat biased.</p><p>Anyway, in the end, the cumulative results of those tests are probably close to the truth:</p><p></p><p><strong>Real-World 2021-2022: SE Labs, AV-Comparatives, AV-Test (7548 samples in 24 tests)</strong></p><p><strong><span style="color: rgb(0, 168, 133)">Norton 360, Avast, Kaspersky </span>............12 - 18 missed samples <------- top AVs</strong></p><p><strong><span style="color: rgb(41, 105, 176)">Microsoft, McAfee </span>..............................27 - 37 missed samples <------- very good AVs</strong></p><p></p><p></p><p>Yes, this project and some others (Black Hat, Bug Bounty, etc.) are valuable. </p><p></p><p></p><p>Yes, it is unfortunate and irrational. </p><p></p><p></p><p>Well said. :) (y)</p></blockquote><p></p>
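The ratios quoted in the post can be checked with a few lines of Python. This is only a sketch of the arithmetic: the miss counts and sample counts are the figures Andy Ful cites, and the daily-miss estimate assumes his quoted 1-in-500 miss rate against 400,000 new samples per day.

```python
# Figures quoted in the post (missed samples / samples tested).
real_world = 10 / 6748             # Avast, "Real-World" test
malware_protection = 15 / 270634   # Avast, "Malware Protection" test

print(f"Real-World rate:         {real_world:.4f}")          # ~0.0015
print(f"Malware Protection rate: {malware_protection:.6f}")  # ~0.000055

# How much riskier the fresh web-based samples are, per this data:
print(f"Ratio: {real_world / malware_protection:.0f}x")

# Rough daily estimate: top AVs miss ~1 in 500 samples,
# with 400,000+ new samples appearing in the wild each day.
daily_misses = 400_000 / 500
print(f"Expected misses per day: {daily_misses:.0f}")
```

By this arithmetic the fresh web-based samples infect at roughly 27 times the rate of the older non-web samples, which supports the post's claim that the two test categories measure very different things.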