Forums › Security › Video Reviews - Security and Privacy › Windows Defender vs Top 100 Infostealers
<blockquote data-quote="Andy Ful" data-source="post: 1112408" data-attributes="member: 32260"><p>Those tests are considered the most reliable among professional tests, but their results are often misunderstood. For example, people often overlook the information included in the testing methodology:</p><p></p><p>[URL unfurl="false"]https://www.av-comparatives.org/real-world-protection-test-methodology/[/URL]</p><p></p><p></p><p>Here is an example from a recent test:</p><p></p><p>[ATTACH=full]286805[/ATTACH]</p><p></p><p></p><p>According to the testing methodology, when a test includes about 500 samples, the first 9 AVs can have similar detection rates in the wild because they are placed in the same cluster (cluster no. 1). ESET still had a fair chance of being the best in the wild during the testing period, even though it missed 4 samples in that particular test. The same is true for any AV in that cluster.</p><p>Most posts in threads about test results rest on the illusion that those results mirror real-world protection. Many more samples are needed to reveal the real differences.</p><p>In my experience, the cumulative results of AV-Comparatives, AV-Test, and SE Labs over two years (over 7,000 samples) are required to show the differences in protection among the top solutions.</p></blockquote><p></p>
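The clustering argument can be illustrated with a quick confidence-interval sketch. This is my own illustration using a Wilson score interval, not AV-Comparatives' actual clustering statistics: with roughly 500 samples, the 95% interval for a product that blocked everything overlaps the interval for a product that missed 4 samples, so the test cannot separate them.

```python
# Illustrative only: shows why missing 4 of ~500 samples may be
# statistically indistinguishable from missing 0 (hypothetical numbers
# matching the post, not AV-Comparatives' real clustering method).
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - margin, center + margin

n = 500
lo_a, hi_a = wilson_interval(500, n)  # product that blocked all 500 samples
lo_b, hi_b = wilson_interval(496, n)  # product that missed 4 (99.2% blocked)

print(f"100.0% blocked: 95% CI ({lo_a:.4f}, {hi_a:.4f})")
print(f" 99.2% blocked: 95% CI ({lo_b:.4f}, {hi_b:.4f})")
# The two intervals overlap, which is why both products can land in the
# same cluster: the observed gap fits inside the sampling noise.
```

Shrinking that noise enough to rank the top products requires far more samples, which is the point about pooling two years of results (7,000+ samples) across labs.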