Well, that's great that he did that! It's my fault for not watching the video before posting my last comment, so I do owe Leo an apology in that regard. In his defense, if you watch the video you will see that he first tests with an internet connection and then performs the same test without one.
The results from his first test were reasonably good, but the second test really broke the system.
In both tests, all the files protected by Controlled Folder Access were spared (not encrypted).
His conclusion was that if you rely on Microsoft Defender at its defaults, you need to enable Controlled Folder Access, which is not enabled by default.
So, it is great to see that Controlled Folder Access did what it is supposed to do.
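For anyone who wants to turn it on without digging through the Settings UI, here is a minimal sketch that enables Controlled Folder Access through Defender's Set-MpPreference/Add-MpPreference PowerShell cmdlets, driven from Python. It assumes Windows 10/11 with Microsoft Defender as the active AV and an elevated (administrator) session; the protected folder path is just an illustration, not something from the video.

```python
# Minimal sketch: enable Controlled Folder Access via Microsoft
# Defender's PowerShell cmdlets. Assumes Windows 10/11, Defender
# active, and an elevated (administrator) session.
import subprocess

def run_powershell(command: str) -> None:
    """Run a PowerShell command and raise if it exits with an error."""
    subprocess.run(
        ["powershell", "-NoProfile", "-Command", command],
        check=True,
    )

# Turn Controlled Folder Access on (it is off by default).
run_powershell("Set-MpPreference -EnableControlledFolderAccess Enabled")

# Optionally protect an extra folder beyond the built-in defaults.
# The path below is a hypothetical example.
run_powershell(
    'Add-MpPreference -ControlledFolderAccessProtectedFolders "C:\\Users\\Me\\Documents"'
)
```

The same toggle is available in the GUI under Windows Security > Virus & threat protection > Ransomware protection, for those who would rather not touch PowerShell.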
Like you, I always suggest people take any AV test (professional or otherwise) with a grain of salt. That being said, all tests have their place and can offer useful insight, especially when you look back historically. The biggest issue surrounding many of these tests isn't necessarily the results themselves, but rather how one interprets those results. Many people just look at the bar graphs and draw generalized conclusions about the various products tested. Many professional tests do have appendixes that go into more detail to explain some of the results; however, not many people go and look at those. Yet people keep using them as sources.
For example, WD used to have very high FP rates on AV-Comparatives; however, when one looks at the appendix, one sees that the vast majority of WD's FPs came from files with very low to low prevalence, whereas files with medium or higher prevalence had virtually no FPs. This is helpful because, while the total number of FPs (all categories summed together) is high, it shows that most people really won't have an issue. Most popular programs would normally fall under the medium-to-high prevalence categories, which are the more important ones IMHO.
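To make that arithmetic concrete, here is a small sketch with invented numbers (not AV-Comparatives' actual figures) showing how a headline FP total can be dominated by files almost nobody runs:

```python
# Illustrative sketch with hypothetical numbers: false positives
# broken down by file prevalence. These counts are made up for the
# example and are not taken from any real AV-Comparatives report.
fps_by_prevalence = {
    "very low": 40,
    "low": 12,
    "medium": 2,
    "high": 0,
}

total = sum(fps_by_prevalence.values())
print(f"Total FPs (the headline bar-graph number): {total}")

# FPs that would actually affect a typical user, i.e. on files
# with medium or higher prevalence.
relevant = fps_by_prevalence["medium"] + fps_by_prevalence["high"]
print(f"FPs on medium/high-prevalence files: {relevant} "
      f"({relevant / total:.0%} of the total)")
```

With numbers like these, the product looks terrible on the summary chart (54 FPs) even though only a handful of FPs would ever touch software a typical user actually runs.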
Tests like the one Leo does have the same issue. Since they aren't written documentation like AV-Comparatives' reports, the tester has to go out of their way to explain the results. This is where Leo struggles the most, and why many like myself take issue with his tests. It's not always about the result, but rather his lack of explanation, as well as his not always being the most thorough in his testing. As a result, his videos can give the wrong impression and leave people who use them to decide which program to run with the wrong idea.
So in the end, tests can be helpful and provide some insight; however, they should still be taken with a grain of salt. One must also take the time to actually learn and understand what the results really mean. Testers like Leo need to do a better job of being thorough with their testing, and of explaining the results, so as not to give the wrong idea or impression.