Real-World Protection Test Feb-Mar 2020
MacDefender said:

Yeah, I'm personally not a fan of such an approach. Neither trusting a vendor's digital signature nor whitelisting hashes of specific files is ideal: the first leaves customers vulnerable if a vendor is breached, and the second is brittle against zero-day updates (Emsisoft, for example, has historically flagged brand-new Firefox updates as malware, which can get in the way of receiving critical security updates).

The main point I'm making about this test is that it doesn't tell us enough about false positives. Testing against a curated corpus of samples doesn't reveal how each vendor actually handles false positives, or whether those results will translate to the kinds of false-positive events the average user is likely to run into.

It's easy to argue that in-the-wild malware samples are relevant because, well, they're in the wild. But this random grab bag of esoteric and mostly unsigned software? I'm not convinced that's what I want from a real-world false-positive test.