Is the improved performance of Microsoft Defender a myth? Should we necessarily be using a 3rd party AV?
[QUOTE="Andy Ful, post: 968144, member: 32260"] [B]To the MT readers.[/B] Please be patient. My posts are directed not only to [USER=66256]@Local Host[/USER]. Many people do not realize how complex is reliable AV testing. It is not sufficient to throw many malware samples and count how many of them were detected. There are many other important things to do that are skipped in the video tests. The final one is a statistical analysis of how reliable are the results. This is a similar situation to the election forecast. Any company that makes such a forecast has to calculate a statistical error, which usually is about a few percent. There is always such an error because we still do not know the votes of most people. [B]So, if one has 1000 malware samples and the testing methodology causes 5% statistical error, then the AV that is protected against all samples can have the same protection in the wild as the AV which failed on 50 samples. [COLOR=rgb(184, 49, 47)]If one takes into consideration this factor, then magically almost all video tests are compatible with professional tests (video test wonder).[/COLOR]:)[/B] The professional AV Labs do such analysis. It is visible in the final recommendations. So, if AV-Test gives the 6 points to 10 AVs, it means that the testing methodology cannot say which one is better (even if they missed a different number of malware samples). [/QUOTE]