AV-TEST Advanced Endpoint Protection: Ransomware Protection test (commissioned by Kaspersky)

Disclaimer
  1. This test shows how an antivirus behaves with certain threats, in a specific environment and under certain conditions.
    We encourage you to compare these results with others and make informed decisions about which security products to use.
    Before buying an antivirus you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.


ForgottenSeer 92963

Although I consider Kaspersky a premium product, scoring 100% where others struggle to block half of the samples sort of discredits the test results for me. It seems just too good to be true. What are your thoughts on commissioned (sponsored) tests? I'm interested to hear other forum members' opinions on them.
 

Vitali Ortzi

Level 26
Verified
Top Poster
Well-known
Dec 12, 2016
1,585
Although I consider Kaspersky a premium product, scoring 100% where others struggle to block half of the samples sort of discredits the test results for me. It seems just too good to be true. What are your thoughts on commissioned (sponsored) tests? I'm interested to hear other forum members' opinions on them.
It is true!
This wouldn't happen in the real world, but Kaspersky would still take one of the top spots in real-world tests.
I just moved from Symantec Endpoint to KES on my gaming PC and it's awesome 👏
Way fewer false positives XD
 

ExecutiveOrder

Level 2
Sep 21, 2021
47
Although I consider Kaspersky a premium product, scoring 100% where others struggle to block half of the samples sort of discredits the test results for me. It seems just too good to be true. What are your thoughts on commissioned (sponsored) tests? I'm interested to hear other forum members' opinions on them.
As long as the whole test is independently developed, executed, and verified, and the samples are selected independently, I see nothing wrong with it. But because a commissioned test is not scheduled like the regular ones, the sponsoring vendor is likely to be better prepared for that particular test, so that is the real grain of salt IMO.
Correct me if I'm wrong, but doesn't the company that sponsors such a test always come out on top? I think that's all we need to know about these tests.
That's not always the case; a good share of commissioned (multi-vendor) tests didn't end up in favor of the sponsor. Note that a commissioned test is not scheduled like the regular ones, so the sponsoring vendor is likely to be better prepared for that particular test. The tester also claims to be independent during the test (it develops, executes, verifies, and collects the samples).

For an example of "not always come out on top":
1. Network Performance Test | Business Security Software | Commissioned by ESET.
Out of 6 vendors, ESET ranked 3rd for "Traffic LAN: Server Client", 1st for "Traffic WAN: Server", and 2nd for "Traffic WAN: Client".
In the second test it ranked 1st for "Total Client-Side Network Load", and in the third test it ranked 1st (tied with another vendor) for "Size of client-side virus definitions".
In the fourth test, "Machine Load During Update", ESET was the 2nd fastest but had the 2nd worst CPU load and the 2nd worst additional RAM usage; one vendor managed to rank 1st on all factors.
2. Comparison of Anti-Malware Software for Storage 2016 | Commissioned by Kaspersky Lab.
Out of 5 vendors in total, Kaspersky came out 1st in the file detection test and 3rd for false positives (note that the report also compares against public annual test reports for the consumer products, which show the same results).
For throughput impact it ranked 1st, 2nd, and 3rd on the different platforms (DB, VDA, VDI) [out of 4 vendors in total, because one generated invalid reports]; there are also comparisons split into more detailed factors.
For impact on latency it had one worst result and no best result.
3. Advanced Endpoint Protection Test | Commissioned by Bitdefender.
Out of 9 vendors in total, Bitdefender ranked 1st on the "Proactive Protection Test" rate but had the 2nd worst false positives.
In the "PowerShell-based File-less Attacks and File-based Exploits Test" it ranked 1st, tied with Kaspersky (you can check the 2019 and 2020 Enhanced Real-World Protection tests, which were not commissioned by any vendor; Kaspersky, ESET, and Bitdefender all perform very well).
In the "Real-World Protection Test" it was 1st for protection, tied with Sophos, but with one more false positive (the 4th worst).
In the "Ransomware Test" it was 1st with a 100% rate, though I'm pretty sure that test is not as advanced as the PoC test in this forum thread.

My concern about the test in this thread is whether the default settings of the business products really pay much attention to user data on shared folders, so customers need to decide whether they want the shared folder protected or not, and not only because one encrypted file = fail (that single document is likely worth a lot to the company). Also, over the next couple of months Kaspersky's performance against ransomware might differ, because they probably focused on and prepared best for this particular test, even though every vendor does its best to protect its customers every day. Preparation started back in Dec '20, months ago, and even though the other vendors were notified, the vendor sponsoring the test will likely throw as many resources at it as it can, compared to the other vendors, to get the best result.
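To make the "one encrypted file = fail" criterion above concrete, here is a minimal before/after integrity check a lab could run around a sample detonation. This is only a sketch, not AV-TEST's actual harness: the folder paths are placeholders, and the pass/fail rule simply mirrors the criterion described in the post.

import hashlib
from pathlib import Path

# Hypothetical canary locations; a real run would point at the lab's own
# local documents folder and a remote shared folder.
CANARY_DIRS = [Path(r"C:\TestData\Documents"),
               Path(r"\\fileserver\shared_docs")]

def snapshot(dirs):
    """Return {path: sha256} for every canary file currently present."""
    digests = {}
    for folder in dirs:
        for item in folder.rglob("*"):
            if item.is_file():
                digests[str(item)] = hashlib.sha256(item.read_bytes()).hexdigest()
    return digests

def verdict(before, after):
    """One altered or missing canary file already counts as a failure."""
    damaged = [p for p, h in before.items() if after.get(p) != h]
    return ("PASS" if not damaged else "FAIL"), damaged

before = snapshot(CANARY_DIRS)
# ... detonate the ransomware sample in the isolated test VM, with the product under test installed ...
after = snapshot(CANARY_DIRS)
result, damaged = verdict(before, after)
print(result, f"- {len(damaged)} canary file(s) altered or missing")

In this scheme the sample runs between the two snapshots; any canary whose hash changed, or that disappeared, counts as damage, so a single touched document already flips the verdict to FAIL.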
 

Anthony Qian

Level 10
Verified
Well-known
Apr 17, 2021
454
As long as the whole test is independently developed, executed, and verified, and the samples are selected independently, I see nothing wrong with it. But because a commissioned test is not scheduled like the regular ones, the sponsoring vendor is likely to be better prepared for that particular test, so that is the real grain of salt IMO.

That's not always the case; a good share of commissioned (multi-vendor) tests didn't end up in favor of the sponsor. [...]

For an example of "not always come out on top":
1. Network Performance Test | Business Security Software | Commissioned by ESET. [...]
2. Comparison of Anti-Malware Software for Storage 2016 | Commissioned by Kaspersky Lab. [...]
3. Advanced Endpoint Protection Test | Commissioned by Bitdefender. [...]

My concern about the test in this thread is whether the default settings of the business products really pay much attention to user data on shared folders, so customers need to decide whether they want the shared folder protected or not [...]
You provided some examples of "not always come out on top", very interesting! However, the tests you mentioned were all conducted by AV-Comparatives, which makes them somewhat irrelevant here, because the test in question was conducted by AV-Test.

I've done some research on commissioned tests done by AV-Test. Here is what I found:
  1. DNS-Layer Protection & Secure Web Gateway Security Efficacy Test (A test commissioned by Cisco Inc. and performed by AV-TEST GmbH)
  2. Exploit Protection on Windows XP (A test commissioned by Tencent and performed by AV-TEST GmbH)
  3. Symantec Endpoint Protection Cloud Comparison Test for Android: Protection (A test commissioned by Symantec and performed by AV-Test GmbH)
  4. Trend Micro Comparison Test: Performance (A test commissioned by Trend Micro and performed by AV-Test GmbH)
    (Note: Virusbuster is a Trend Micro product.)
  5. Personal Identifiable Information Protection: Sensitive Data Discovery test (The test is commissioned by Kaspersky and performed by AV-TEST GmbH)
  6. Advanced Endpoint Protection: Fileless Threats Protection test (The test is commissioned by Kaspersky and performed by AV-TEST GmbH)
  7. AV-TEST Evaluates Secure Web Gateway and DNS-Layer Security Efficacy (A test commissioned by Cisco Inc. and performed by AV-TEST GmbH)
  8. ......
  (Result screenshots for each test are attached to the post.)

As you can see, the company sponsoring the test consistently performs well and ranks first in nearly all commissioned tests conducted by AV-Test!

Also, I think AV-Comparatives is a truly independent testing organization deserving of praise.
 


ExecutiveOrder

Level 2
Sep 21, 2021
47
You provided some examples of "not always come out on top", very interesting! However, the tests you mentioned were all conducted by AV-Comparatives, which makes them somewhat irrelevant here, because the test in question was conducted by AV-Test.

As you can see, the company sponsoring the test consistently performs well and ranks first in nearly all commissioned tests conducted by AV-Test!

Also, I think AV-Comparatives is a truly independent testing organization deserving of praise.
Interesting, thanks for sharing! Yes, my examples are only relevant for answering the question in general, outside the context of this forum thread.
I used AV-Comparatives because I'm familiar with it and it's easier to search (commissioned tests are included in their public search engine). I was also always doubtful whether AV-Test is somewhat biased, because it has a strong reputation too, just like the other testing organizations, so I assumed it would be similar.
The main thing that has bothered me for a long time: Kaspersky getting "Top Product" awards 32 times in a row! Is AV-Test a Kaspersky shill? Perhaps... JK!
Anyway, after seeing the comparison of commissioned tests from both organizations, I guess I'll trust AV-Comparatives more, and I hope any new commissioned test remains neutral (the last real test comprising multiple AV vendors was in early 2018).
 

ForgottenSeer 92963

You [ExecutiveOrder] provided some examples of "not always come out on top", very interesting! However, the tests you mentioned were all conducted by AV-Comparatives, which makes them somewhat irrelevant here, because the test in question was conducted by AV-Test.

I've done some research on commissioned tests done by AV-Test. Here is what I found: [PICTURES OF PROOF REMOVED for space] As you can see, the company sponsoring the test consistently performs well and ranks first in nearly all commissioned tests conducted by AV-Test!

Also, I think AV-Comparatives is a truly independent testing organization deserving of praise.

@Anthony Qian (thanks for proving AV-TEST's bias) and @ExecutiveOrder (thanks for showing AV-Comparatives' independence) (y)(y)(y)
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,592
Such tests focus on a custom scenario proposed by the sponsor. The sponsor usually knows that its products will behave well in that scenario. But there are many methods of preventing ransomware infections, and it is very hard to interpret these tests in relation to protection in the wild.

If one tests children in China using the English version of a test, the results will not be as good as for children in England. This does not mean that the results in England are biased, and one cannot conclude that Chinese children are stupid (we know that they are very smart). :)
 

ExecutiveOrder

Level 2
Sep 21, 2021
47
Such tests focus on a custom scenario proposed by the sponsor. The sponsor usually knows that its products will behave well in that scenario.
So, Kaspersky only proposed the custom ideas or scenarios for the test: "Real-World ransomware attacks user files on local system, Real-World ransomware attacks user files on remote shared folder, Proof of Concept ransomware attacks user files on local system", and was confident enough that its product would perform very well no matter how AV-Test performed the test.
While the rest (developing, verifying, selecting samples, and executing) was conducted independently by AV-Test?
Meanwhile, in the 7 other examples above, I couldn't find anything convincing to indicate that AV-Test performed the tests "independently".
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,592
So, Kaspersky only proposed the custom ideas or scenarios for the test: "Real-World ransomware attacks user files on local system, Real-World ransomware attacks user files on remote shared folder, Proof of Concept ransomware attacks user files on local system", and was confident enough that its product would perform very well no matter how AV-Test performed the test.

Yes, I think so. The Kaspersky team requested a broad scenario to test the method they use to fight ransomware. The test results are good because, generally, Kaspersky's method is very effective (one of the best). Anyway, attacks on remote shared folders could be prevented before they happen (this was not fully tested).

It is hard to interpret such tests. They assume a particular kind of breach and test what could happen afterwards, but they do not test the ability to prevent the breach in the first place. Of course, these tests can be useful for AV vendors; they know best whether the results reveal the weak points of their products.

I think that discussing the protection of other products on the basis of this test would be risky.
It would also be risky to say that Kaspersky has bulletproof anti-ransomware protection. Anyway, there are several tests that show the very effective anti-ransomware protection of Kaspersky products.
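As a rough illustration of the post-breach mitigation style such tests measure (as opposed to preventing the breach itself), here is a generic decoy-file ("canary") watcher. This is only a sketch of the general idea, not Kaspersky's or AV-TEST's implementation; the share path, poll interval, and alert action are placeholders.

import hashlib
import time
from pathlib import Path

# Hypothetical decoy document planted on the protected share.
CANARY = Path(r"\\fileserver\shared_docs\_confidential_invoice.docx")
POLL_SECONDS = 2

def fingerprint(path):
    """Hash the decoy so any modification (e.g. encryption) becomes visible."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def watch(path):
    baseline = fingerprint(path)
    while True:
        time.sleep(POLL_SECONDS)
        try:
            current = fingerprint(path)
        except FileNotFoundError:
            current = None  # decoy was deleted or renamed
        if current != baseline:
            # A real product would suspend the writing process and roll back
            # the changes; this sketch only raises an alert and stops.
            print(f"ALERT: decoy {path} was modified or removed - "
                  "possible ransomware activity on the share")
            return

if __name__ == "__main__":
    watch(CANARY)

A real endpoint product would go much further, for example suspending the process writing to the share and rolling back the changed files; the sketch only shows why even a single modified document on a share is treated as a strong ransomware signal.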
 

ForgottenSeer 92963

Such tests focus on a custom scenario proposed by the sponsor. The sponsor usually knows that its products will behave well in that scenario. But there are many methods of preventing ransomware infections, and it is very hard to interpret these tests in relation to protection in the wild.

If one tests children in China using the English version of a test, the results will not be as good as for children in England. This does not mean that the results in England are biased, and one cannot conclude that Chinese children are stupid (we know that they are very smart). :)
Andy,

By definition, a test is biased when the testing conditions are influenced by only one vendor. It is not about manipulating the test results; it is about manipulating the test scenarios.

How would the Portuguese and Polish soccer teams react if FIFA agreed to level the playing field by only allowing players who have played fewer than 100 games for their national team (excluding top goal scorers like Ronaldo and Lewandowski)? The world champion would still need to qualify in the series and win all the knockout matches, but the conditions under which the World Cup was played would be biased.
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,592

Kees1958,

Using the term "biased" is not precise, and it is rather relative. For example, I would not consider your FIFA example biased if those regulations were established for the next 20 years for all countries. :)
Anyway, it is probably true that these tests include some commercial promotion. But any vendor can do the same (and many already have).

Let's agree that such tests are only small parts of a much bigger picture. Until we see the whole picture, we cannot truly compare the products. :unsure:
 
