@Adrian Ścibor,
Thanks for another valuable test.
How does AVLab determine that the result is not Excellent? Do you use a clustering method similar to AV-Comparatives?
We require 99% overall detection of in-the-wild malware to earn the EXCELLENT badge. This is explained in many places, including Recent Results in the attached section.
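For illustration only, a minimal sketch of how such a threshold check could look. The function name and the tier below EXCELLENT are assumptions for this example, not AVLab's actual scheme:

```python
def badge(detected: int, total: int) -> str:
    """Assign a badge from the overall in-the-wild detection rate.

    The 99% EXCELLENT threshold is from the text above; the "GOOD"
    tier name is an assumption used only for this illustration.
    """
    rate = detected / total
    return "EXCELLENT" if rate >= 0.99 else "GOOD"

print(badge(397, 400))  # 99.25% -> EXCELLENT
print(badge(394, 400))  # 98.50% -> GOOD
```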
You do not seem to perform much feature- or capability-specific testing anymore, such as banking, ransomware, malicious scripts, and fileless malware testing.
Is there no interest from the security software publishers, or do they just not want to pay for such testing/services?
In general, in-the-wild malware covers several of these threat categories. However, a dedicated test is difficult because it requires working ransomware on the day of the test, which is why older samples are useless. It is not easy to find unique ransomware families in the wild in reasonable numbers, e.g. 50-100 samples.
On the other hand... we could take the March SHA256 hashes for just 16 ransomware samples and 6 banking trojans and build a separate table, but the results would look similar. That would have additional marketing value, if nothing else.
If you have any suggestions, please let us know.
The Banking Test is carried out once a year and is planned for Q3-Q4 2025.
Why isn't ESET here?
There is no requirement to test every solution in a given edition. ESET will definitely be there; we still have 4 editions to come in 2025.
@Adrian Ścibor
Why is Emsisoft suddenly the fastest AV program? It was almost always at the bottom of previous tests. Has Emsisoft made such extensive changes to the program?
Thanks for this question! Really!
This type of test always uses the latest software and the latest versions, everything is current, so freezing the methodology for six months or a year makes no sense.
0.105 s is not that super-fast; in the past we have recorded even lower times for other vendors, such as WatchGuard (0.0XX s) or F-Secure (0.1XX s as well).
It is certainly not a measurement error; we have shown in black and white in the CSV how it is calculated:
- remediation start time (the moment the malware is downloaded),
- remediation end time (the moment the product responds to the malware in the browser or in the system).
For pre_launch, RT is always 0 s; this is how we handle a threat that is still pending in the browser. For post_launch, RT is greater than 0 s, as in the examples highlighted in yellow (see the sketch below).
For each product, we track its files and logs, where information about threats is stored (First Indicators). Usually there are a few, but sometimes there are more than a dozen across different folders.
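To make this concrete, here is a minimal sketch of the RT logic described above, assuming a hypothetical timestamp format and field names; it is an illustration only, not AVLab's actual tooling:

```python
from datetime import datetime

def remediation_time(stage: str, start: str, end: str) -> float:
    """Compute Remediation Time (RT) in seconds from two log timestamps.

    stage -- "pre_launch" (threat stopped in the browser) or "post_launch"
    start -- remediation start time (when the malware was downloaded)
    end   -- remediation end time (when the product responded to the threat)

    The timestamp format is an assumption for illustration; the real CSV
    layout may differ.
    """
    if stage == "pre_launch":
        return 0.0  # a threat pending in the browser always gets RT = 0 s
    fmt = "%Y-%m-%d %H:%M:%S.%f"
    delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
    return delta.total_seconds()

# Example: a post_launch detection responded to 0.105 s after download.
print(remediation_time("post_launch",
                       "2025-03-12 10:00:00.000000",
                       "2025-03-12 10:00:00.105000"))  # -> 0.105
```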
Let me explain this in other words...
The testing process is a... process...
Things are constantly changing, so we also need to change; we need to improve our tools and METHODOLOGY. Developers like Emsisoft make changes to the engine and the drivers. Microsoft is changing the system, and we have to adapt to the new guidelines.
We try to do everything meticulously. It is simply that Sysmon showed us something we had not considered before; that is part of the process of developing testing tools.
Remediation Time is an additional parameter, UNIQUE on a global scale.
The 2025 summary will show that in January Emsisoft got 180 s, and in March 2025 it got 0.105 s. In May it could still be under 1 s, etc. We'll see...
We simply have to qualify this as the natural development of methodologies and testing tools in response to changes made by software vendors like Emsisoft and to changes in Windows. We saw similar differences in March 2025 with Comodo software (usually 50-100 s, now 5 s) and Xcitium (usually 40-150 s, now 10 s). We are tracking new infection markers in software, which is why we noticed additional, faster Remediation Time values.
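For intuition, here is a minimal sketch of why tracking a new marker can shorten the measured time: if the end time is taken from the earliest indicator found across all monitored logs, then discovering an additional log that is written earlier in the detection pipeline lowers RT without any measurement error. The folder layout, file pattern, and "epoch;message" line format below are all hypothetical:

```python
from pathlib import Path

def earliest_indicator(log_dirs: list[Path]) -> float | None:
    """Return the earliest threat-indicator timestamp (epoch seconds)
    found across all monitored log folders, or None if nothing matched.

    The 'DETECTED' marker and the "epoch;message" line format are
    assumptions for illustration; real products log threats in
    vendor-specific formats.
    """
    earliest = None
    for d in log_dirs:
        for log in d.rglob("*.log"):
            for line in log.read_text(errors="ignore").splitlines():
                if "DETECTED" in line:
                    ts = float(line.split(";")[0])  # assumed epoch prefix
                    if earliest is None or ts < earliest:
                        earliest = ts
    return earliest
```

Adding one more folder to `log_dirs` that happens to contain an earlier-written indicator immediately shrinks the (end - start) difference, which is exactly the kind of change we observed.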
Besides, the Advanced In-The-Wild Malware Test has been running since July 2019. Imagine that we had not changed anything since then.

Would you like that? I don't think so.
Just as bicycle manufacturers improve frames and wheels, and cross-country shoe manufacturers improve shock absorption and energy return, we improve software testing.
We do our best to maintain maximum transparency for the Community and the Vendors.