AV-C: Malware Protection Test (Mar 2019) and Real-World Protection Test (Feb-Mar 2019)
<blockquote data-quote="Andy Ful" data-source="post: 809580" data-attributes="member: 32260"><p>Some important info:</p><p>"In the Malware Protection Test, malicious files are executed on the system.<strong> While in the Real-World Protection Test the vector is the web, in the Malware Protection Test the vectors are network drives and USB."</strong></p><p>"If the user is asked to decide whether a malware sample should be allowed to run, and in the case of the worst user decision system changes are observed, the test case is rated as “user-dependent”."</p><p></p><p>[ATTACH=full]212373[/ATTACH]</p><p></p><p>[ATTACH=full]212369[/ATTACH]</p><p></p><p>[ATTACH=full]212372[/ATTACH]</p><p></p><p>One can see the striking difference in false positives when comparing the real-world test (web malware vector) and the malware protection test (network drives and USB vector).</p><p>In this test, Trend Micro, Panda, and Microsoft have poor offline detection rates but very good protection rates (better than Kaspersky, Eset, F-Secure).</p><p><strong><span style="color: rgb(184, 49, 47)">This test shows that test results can depend heavily on the pool of samples. More reliable results can be seen after averaging the results over a year.</span></strong></p><p></p><p><span style="font-size: 18px">From </span><a href="https://www.av-comparatives.org/tests/summary-report-2018/" target="_blank"><span style="font-size: 18px">Summary Report 2018 | AV-Comparatives</span></a></p><p>"</p><p><span style="font-size: 18px"><strong>Malware Protection winners</strong></span></p><p>The Malware Protection Test evaluates an AV product’s ability to protect against malware coming from removable devices or network shares. Products must provide a high level of protection without producing too many false alarms. In the Malware Protection Test, all samples not detected on-demand or on-access are executed.</p><p><strong>Avast, AVG, AVIRA </strong>and<strong> Bitdefender </strong>scored well in both tests.</p><p></p><p><span style="font-size: 18px"><strong>Real-World Protection Test winners</strong></span></p><p>Security products include various features to protect systems against malware. Such protection features are taken into account in the Real-World Protection Test, which tests products under realistic Internet usage conditions. Products must provide a high level of protection without producing too many false alarms, and without requiring the user to decide whether something is harmful or not.</p><p>The programs with the best overall results over the course of the year were from: <strong>Bitdefender, AVIRA </strong>and<strong> Kaspersky Lab</strong>.</p><p></p><p><span style="font-size: 18px"><strong>False Positives winners</strong></span></p><p>False positives can cause as much trouble as a real infection. For this reason, it is important that anti-virus products undergo stringent quality-assurance testing before release to the public in order to avoid false positives. AV-Comparatives carries out extensive false-positive testing as part of the Malware Protection Tests. Additionally, false alarms from the Real-World Protection Test are also counted for this category.</p><p>The products with the lowest rates of false positives during 2018 were <strong>ESET</strong> (10), <strong>AVIRA, Bitdefender, Kaspersky Lab </strong>(15 each) and<strong> Emsisoft</strong> (19). These figures represent the sum of the false positives from all FP Tests.</p><p>"</p><p>Post edited.</p><p>I do not believe in Avira's stellar results (Avira is from Germany and AV-Comparatives is from Austria).</p></blockquote><p></p>
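The quoted "user-dependent" rating rule can be sketched in code. This is only an illustration of the methodology as quoted above: the function name and the "protected"/"compromised" labels are my own, not AV-Comparatives' actual scoring implementation.

```python
def rate_test_case(blocked: bool, asked_user: bool,
                   worst_case_compromised: bool) -> str:
    """Classify a single malware test case per the quoted AV-C rule.

    blocked: the product stopped the sample without user interaction.
    asked_user: the product asked the user whether to allow the sample.
    worst_case_compromised: assuming the worst user decision, system
        changes (i.e. an infection) were observed.
    """
    if blocked:
        return "protected"
    if asked_user:
        # Quoted rule: if the worst user decision leads to observed
        # system changes, the case is rated "user-dependent".
        return "user-dependent" if worst_case_compromised else "protected"
    return "compromised"

# Example: the product prompts the user, and allowing the sample infects
# the system -> the case counts as user-dependent, not fully protected.
print(rate_test_case(blocked=False, asked_user=True,
                     worst_case_compromised=True))
```

User-dependent cases matter when reading the charts: a product that prompts often can show a high protection rate while shifting part of the risk onto the user's decision.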