Windows Defender vs Ransomware in 2021
<blockquote data-quote="ForgottenSeer 72227" data-source="post: 929578"><p>Well, that's great that he did that! It's my fault for not watching the video before posting my last comment, so I do owe Leo an apology in that regard.</p><p></p><p>Like you, I always suggest people take any AV test (professional or otherwise) with a grain of salt. That being said, all tests have their place and can offer some useful insight, especially when you look back historically. That said, I feel the biggest issue surrounding many of these tests isn't necessarily the results themselves, but rather how one interprets them. Many people just look at the bar graphs and draw generalized conclusions about the various products tested. Many professional tests do have appendices that go into more detail to explain some of the results; however, not many people go and look at those.</p><p></p><p>For example, WD used to have very high FP rates on AV-Comparatives; however, when one looks at the appendix, one sees that the vast majority of WD's FPs came from files with very low to low prevalence, whereas files with medium or higher prevalence had virtually no FPs. This is helpful because, while the total number of FPs (all categories summed together) is high, it shows that most people really won't have an issue. Most popular programs would normally fall under the medium-to-high prevalence categories, which are the more important ones IMHO.</p><p></p><p>Tests like the one Leo does have the same issue. Since they aren't written documentation like AV-Comparatives, he has to go out of his way to explain the results. This is where Leo has the most issues and why many, like myself, take issue with his tests. It's not always about the result, but rather his lack of explanation, as well as his not always being the most thorough in his testing.
As a result, it gives the wrong impression and leaves people who use his videos to decide which program to use with the wrong idea.</p><p></p><p>So in the end, tests can be helpful and provide some insight; however, they should still be taken with a grain of salt. One must also take the time to actually learn and understand what the results really mean. Testers like Leo need to start doing a better job of being thorough with their testing, as well as of explaining the results, so as not to give the wrong idea or impression.</p></blockquote><p></p>