AV-TEST Awards 2025: celebrating the very best of IT security products

Disclaimer
  1. This test shows how an antivirus behaves with certain threats, in a specific environment and under certain conditions.
    We encourage you to compare these results with others and make informed decisions on which security products to use.
    Before buying an antivirus you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.

Parkinsond

Thread author
Dec 6, 2023
The 2025 Award in the test category of Best Protection for consumer users goes to the security package Total Security from Bitdefender.

The Best Protection 2025 Award for Corporate Users goes to Bitdefender Business Security.

For its minimal system load combined with consistently high protection, McAfee Total Protection achieved high test scores throughout the year and thus earned the Best Performance 2025 Award for Consumer Users.

For their extremely professional work with zero false alarms, AV-TEST is presenting the Best Usability 2025 Awards for Consumer Users to Avast Free Antivirus, AVG Internet Security, Avira Security, F-Secure Total, Kaspersky Premium, Norton 360, TotalAV Premium and McAfee Total Protection.

For the best annual performance in all Windows tests and the most precise detection with no false alarms, AV-TEST is presenting the Best Usability 2025 Award for Corporate Users to Kaspersky Small Office Security and Kaspersky Endpoint Security.

For its consistently strong results, AV-TEST is giving the Best Advanced Protection 2025 Award for Consumer Users to McAfee Total Protection.

Based on their perfect protection performance in all test scenarios, the AV-TEST Institute is presenting the following manufacturers and products with the Best Advanced Protection 2025 Award for Corporate Users: Kaspersky Endpoint Security, Kaspersky Small Office Security and Microworld eScan Enterprise EDR.

 
There's something important to highlight: Bitdefender has been dominating protection tests for a while now. They have done an outstanding job of refining and optimizing their product. Great static detection, amazing behavioural protection, a malware rollback module, privacy modules, cryptomining protection, a firewall, a vulnerability scan. All packed into a light antivirus (at least for most; I personally had a very light experience). The buggy antivirus is a thing of the past. Although I'm in love with Kaspersky, I really am amazed by Bitdefender and will keep recommending it. A very complete and robust product.

Also, congratulations to the McAfee team for their consistent results and the complete makeover of their previously infamous product. If anyone has details on its daily performance/usage/protection, I'd love some insight into how the product works beyond the protection tests I can find online. :)
 
To add some context to the observation about ESET missing out on awards and the speculation regarding its behavioral analysis: it’s not that ESET is deficient, but rather that its core engineering philosophy actively clashes with how standardized lab tests are conducted.

When comparing ESET to engines like Kaspersky (K) or Bitdefender (B), we are looking at two totally different approaches to endpoint security, and standardized testing heavily favors only one of them.

Testing organizations like AV-TEST and AV-Comparatives have a strict mandate: every product must be evaluated on its out-of-the-box, default settings. This makes sense for the average consumer who will never open a settings menu, but it massively disadvantages ESET.

Kaspersky and Bitdefender are designed to be highly aggressive, automated, post-execution engines. They let an unknown file run, monitor its behavior, and if it starts acting like ransomware, their engines automatically kill the process and roll back the damage. Because they automate this aggressively out-of-the-box, they ace the "Advanced Threat Protection" scenarios in standardized tests.
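That detect, kill, and roll back loop can be sketched as a toy simulation. This is purely illustrative Python (a dictionary stands in for the file system, and the suspicion threshold is made up), not any vendor's actual engine:

```python
# Toy simulation of post-execution behavioral blocking with rollback.
# Made-up threshold and a dict standing in for the file system; this
# is NOT any vendor's real engine, just the general shape of the idea.
backups = {}  # pre-write copies kept by the monitor
files = {"a.doc": "report", "b.xls": "budget", "c.txt": "notes"}
SUSPICION_THRESHOLD = 2  # rapid rewrites before we call it ransomware

def rollback(process_log):
    """Undo every write attributed to the offending process."""
    for name in set(process_log):
        files[name] = backups[name]

def monitored_write(process_log, name, data):
    """Record a backup before each write so damage can be rolled back."""
    backups.setdefault(name, files[name])
    files[name] = data
    process_log.append(name)
    if len(process_log) > SUSPICION_THRESHOLD:
        rollback(process_log)  # "kill" the process and undo its writes
        return "terminated"
    return "ok"

log = []  # writes attributed to one suspicious process
for name in list(files):
    if monitored_write(log, name, "ENCRYPTED!") == "terminated":
        break

print(files)  # all three files restored to their original contents
```

The key design point the thread is debating is that every step here is automated: no prompt ever reaches the user.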

ESET, on the other hand, is built like an engineer's tool. Its philosophy is to provide a quiet, low-friction baseline by default to avoid annoying the average user with false positives and alert fatigue. ESET excels at pre-execution memory scanning, but it deliberately chooses not to automate highly aggressive post-execution rollbacks out-of-the-box.

Instead of automating the behavioral blocks, ESET hands the keys to the user. When an advanced user manually hardens ESET, shifting the Host Intrusion Prevention System (HIPS) to Interactive or Policy-Based mode and creating custom firewall rules, it essentially transforms into a lightweight EDR (Endpoint Detection and Response) solution. With a hardened configuration, you can explicitly deny the exact execution chains malware relies on (e.g., blocking Office applications from spawning PowerShell, or preventing executables from running in the AppData Temp folders). When configured this way, threats are fundamentally denied by the system rules before behavioral analysis even needs to step in.
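Those execution-chain denials can be sketched generically. The rule engine below is a hypothetical Python illustration of the idea (the process names and the Temp path are examples), not ESET's actual HIPS rule format:

```python
# Hypothetical HIPS-style execution-chain rules (NOT ESET's real rule
# format): deny Office apps spawning script interpreters, and deny
# executables launched from an AppData temp folder.
from dataclasses import dataclass

OFFICE_APPS = {"winword.exe", "excel.exe", "powerpnt.exe"}
SCRIPT_HOSTS = {"powershell.exe", "wscript.exe", "cmd.exe"}
TEMP_PREFIXES = (r"c:\users\alice\appdata\local\temp",)  # example path

@dataclass
class Launch:
    parent: str  # image name of the spawning process
    child: str   # image name of the new process
    path: str    # full path of the child executable

def allowed(event: Launch) -> bool:
    """Return False when a launch matches a deny rule."""
    if event.parent.lower() in OFFICE_APPS and event.child.lower() in SCRIPT_HOSTS:
        return False  # block Office -> PowerShell style chains
    if event.path.lower().startswith(TEMP_PREFIXES):
        return False  # block anything executing out of Temp
    return True

# A macro dropper spawning PowerShell is denied:
print(allowed(Launch("WINWORD.EXE", "powershell.exe",
                     r"C:\Windows\System32\powershell.exe")))  # False
# A normal app launched by Explorer is allowed:
print(allowed(Launch("explorer.exe", "notepad.exe",
                     r"C:\Windows\notepad.exe")))  # True
```

Note that such rules deny by pattern before any behavioral verdict is needed, which is exactly the "denied by the system rules" point above.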

Standardized lab tests grade an antivirus on its "cruise control" setting, not its top speed. ESET frequently misses out on awards not because its engine is weak, but because it refuses to automate aggressive behavioral blocks by default, since doing so would risk breaking legitimate user workflows. It reserves its true defensive power for power users and IT admins who know how to build custom rulesets.
 
In most home-user scenarios, the product will stay configured the way it came when installed. An end user rarely configures a security product, because they don't know what they're doing. Therefore, if ESET ships its products with weak default settings, its users will most likely stay underprotected until infected... I like ESET, but if it relies heavily on the HIPS being set to "interactive", or on policies and rules being created by the average Joe, it's doomed as a product. Maybe it should shift its market focus to IT professionals only.
 
Assuming ESET leaves users "underprotected" on default settings is a misconception driven by how these synthetic lab tests are designed. It all comes down to where the antivirus engine focuses its power.

ESET’s default configuration is not weak; it is just heavily front-loaded. ESET's DNA is built around stopping threats before they execute. Their Advanced Memory Scanner, UEFI Scanner, and deep heuristic analysis are arguably some of the best in the industry at catching heavily obfuscated malware right as it tries to unpack itself into system memory.

For the average home user on default settings, ESET aims to kill the threat at the door.

The problem with standardized tests (like the AV-TEST ATP scenarios) is that they often intentionally bypass these initial pre-execution layers. They force the malware to run in order to explicitly test the post-execution behavioral monitoring and rollback capabilities.

Kaspersky and Bitdefender are designed with incredibly aggressive post-execution rollback features. If a threat gets inside and starts encrypting files, they automate the cleanup. They excel in these specific tests.

ESET chooses not to automate aggressive behavioral rollbacks out-of-the-box to prevent false positives and system instability.

You are entirely correct that for a high-risk home user, someone who frequently pirates software, clicks random links, and ignores browser warnings, an engine with an aggressive, automated post-execution safety net (like Kaspersky or Bitdefender) is the better "install and forget" choice.

But for an average user with basic digital hygiene, ESET's default pre-execution engine is incredibly robust, exceptionally light on system resources, and rarely breaks legitimate software. It’s not that their consumer product is doomed; it’s that their enterprise-first, "prevent rather than remediate" philosophy just doesn't produce the flashy post-infection rollback metrics that win consumer lab awards.
 
Having used ESET for well over 20 years, maybe ESET ought to market their product as a little weak until the settings are adjusted; I've just looked at their home page and they don't say, or even slightly allude to, that! Most users today just don't have the knowledge or understanding to set ESET up (they once did), and frankly they don't want to either. It's not beyond ESET's ability to change that; most users today want fit-and-forget, and that could be done on install, unless bloody-mindedness and the ability to walk on water are part of ESET's prime philosophy??

Having used alternatives to ESET, I have no problem with an alternative approach (BTW, ESET is on my laptop). Maybe they are marching in step and it's all the other soldiers who are out of step?? And the less that users can faff with a product using fingers that don't understand it, in my opinion, the better. Just my random opinion...
 
I understand ESET is mostly pre-execution, but their HIPS isn't. HIPS monitors behavior after a process starts running. It looks at actions like registry changes, process injection, file system modifications, privilege escalation attempts, etc. I don't think an HIPS is the best approach for post-execution protection, but even if it is, needing it to be set to "interactive" or relying on pre-made rules is chaotic for a novice user.

I do believe static protection in ESET is outstanding, and I can’t really comment on how strong their dynamic protection is nowadays since I haven’t tested it in a few years. But if that dynamic layer still leans heavily on HIPS, then it raises a usability concern. Interactive HIPS, while powerful for advanced users who understand system behavior, tends to generate prompts that require technical judgment. For the average user, that often turns into either blindly allowing everything (defeating the purpose) or blocking legitimate actions and breaking normal system functionality.

Ideally, post-execution protection should be as automated and context-aware as possible, minimizing user decisions. If the effectiveness of the system depends too much on user interaction or manual rule tuning, it introduces inconsistency and user error, which are exactly the things security solutions should aim to eliminate.
 
After being on Microsoft Defender exclusively for quite some time, I was surprised when recently testing McAfee. The performance difference is actually notable. I think a recent change in Defender has added a little more background overhead. When A/B testing, I found McAfee allowed Handbrake jobs to reliably complete 2-5% faster, and rendering tests reflect this as well. That isn't big, but when encoding 4K for 8 hours it makes a difference for my use case. So the performance award doesn't surprise me. Previously, when testing other solutions (such as ESET), I found MD to perform identically for encoding and rendering. File transfers are the only other place I've ever noticed any difference.
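For anyone wanting to sanity-check what a 2-5% throughput difference means over an 8-hour encode, the arithmetic is straightforward (illustrative numbers, not measured data):

```python
# Back-of-the-envelope for the A/B comparison above.
# Illustrative numbers only, not measured data.
baseline_s = 8 * 3600       # an 8-hour 4K encode under product A
speedup_pct = 3.0           # midpoint of the reported 2-5% range
faster_s = baseline_s * (1 - speedup_pct / 100)
saved_min = (baseline_s - faster_s) / 60
print(f"time saved: {saved_min:.1f} minutes per 8-hour encode")
```

At the midpoint of the reported range, that is roughly a quarter of an hour per batch, which adds up for recurring jobs.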
 
Maybe McAfee's ability to scan files only on execution is making the difference? Skipping on-access scanning helps.
 
I tend to agree that even "post-execution" HIPS procedures should be fully automated. Leaving the decision up to a user who has no in-depth knowledge of how the OS works is completely pointless.
An example of this approach is the "Folder Guard" feature: similar functionality can be achieved by configuring the HIPS manually, but the average user is unaware of that option, so Folder Guard automates it for them.
 
Congrats to all vendors. As expected, Bitdefender and Kaspersky are ever so consistent, but to many people's surprise, ESET did not get even a single award.
Yes, re ESET: giving it 5.5/6.0 in Performance seems more subjective than numerically valid, IMO. At AV-Comparatives, ESET ranked better (less impact) than McAfee. Grain of salt.

 
Bitdefender was always heavy, almost buggy, for me in the past; sounds like I need to try it again... :unsure: