AVLab.pl: Learn More About Remediation Time – Response Time to Security Incidents (Results from the January 2023 Protection Test)

Disclaimer
  1. This test shows how an antivirus behaves with certain threats, in a specific environment, and under certain conditions.
    We encourage you to compare these results with others and make informed decisions about which security products to use.
    Before buying an antivirus, you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.

Adrian Ścibor

From AVLab.pl
Dear MalwareTips Users and Community,

The results for January 2023 have now been published. By the way, thank you for voting for us in the AV-Comparatives survey! We took third place! Amazing :geek:

Remediation Time in the test

Starting from this edition, in order to stand out more from other testing laboratories as pioneers in the security industry, we are introducing a new comparative metric, the so-called "response time to a threat and to resolving a security incident": Remediation Time. We will measure it for each tested home and business product to further highlight the differences between protection software confronted with threats in the wild (coming from the Internet).

In other words, in the Sysmon logs we have the UTC time logged for:

1. Downloading malware via Firefox.
2. Detection of the threat. If the product does not stop the sample at an early stage, it is run. If any protection technology then detects the threat, we log that time as well, e.g. as "moved to quarantine", "blocked by firewall", or another method.

The time elapsed from point 1 to point 2 is the Remediation Time.
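To make the metric concrete, here is a minimal sketch of how the interval could be computed from two Sysmon-style UTC timestamps. The log format and the function are illustrative assumptions, not AVLab's actual tooling:

Python:
from datetime import datetime, timezone

# Sysmon logs a "UtcTime" field shaped like: YYYY-MM-DD HH:MM:SS.mmm
TS_FORMAT = "%Y-%m-%d %H:%M:%S.%f"

def remediation_time(download_utc: str, detection_utc: str) -> float:
    """Seconds between the malware download (point 1) and the logged
    detection/remediation event (point 2)."""
    t1 = datetime.strptime(download_utc, TS_FORMAT).replace(tzinfo=timezone.utc)
    t2 = datetime.strptime(detection_utc, TS_FORMAT).replace(tzinfo=timezone.utc)
    return (t2 - t1).total_seconds()

# Example: file downloaded via Firefox, moved to quarantine ~4 s later.
print(remediation_time("2023-01-15 12:00:01.120",
                       "2023-01-15 12:00:05.480"))  # -> 4.36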

Average Remediation Time is reported on the Recent Results webpage. From the next edition onwards, each product will have its own page with more detailed technical information and the awards it has won. Give us some more time, please :)

Threat Landscape in January 2023

We have prepared the following summary based on the logs collected by the Sysmon tool in Windows 10 and on a database of information collected during the test:
  • 11 security solutions took part in the test.
  • Ultimately, we used 447 unique URLs hosting malware.
  • Exactly 381 malware samples were hosted over HTTP.
  • Websites encrypted with HTTPS (in theory, safe) hosted 66 malware samples.
  • During the analysis, each malware sample performed on average 28 potentially harmful actions in Windows 10.
  • Most malware originated from servers located in the Netherlands, Russia, and the USA.
  • The .com, .org, and .za domains were the ones most used to host malware.
  • The average detection rate of URLs or malicious files at the PRE-Launch level was 58%.
  • At the POST-Launch level, file detection was on average 38%.
The full publication is available on our website: Learn More About Remediation Time – Response Time To Security Incidents. The Results From Protection Test In January 2023 » AVLab Cybersecurity Foundation
 


Andy Ful

From Hard_Configurator Tools
Thanks for the innovative approach to testing. (y)
Is there a particular reason for preferring the Firefox web browser?
Some AVs protect better when used with the most popular browser, Chrome.

Edit:
If I recall correctly, the tested AVs are those used at home or in small businesses. That is why free AVs are included, too.
 

Adrian Ścibor

From AVLab.pl
Andy Ful said:
Thanks for the innovative approach to testing. (y)
Is there a particular reason for preferring the Firefox web browser?
Some AVs protect better when used with the most popular browser, Chrome.

Edit:
If I recall correctly, the tested AVs are those used at home or in small businesses. That is why free AVs are included, too.
Hi Andy.
We have used Firefox since the beginning of 2022 because it gives us far fewer problems with automation than Chrome. Chrome keeps cutting out features we need, or making them harder to use in an automated test.

Most AVs we test have a Firefox add-on; the exception is Microsoft Defender Browser Protection, which is available only for Edge and Chromium-based browsers. It is not our fault that Microsoft has not taken care of Firefox users.
 

Andy Ful

From Hard_Configurator Tools
Adrian Ścibor said:
Hi Andy.
We have used Firefox since the beginning of 2022 because it gives us far fewer problems with automation than Chrome.
OK. :)
I am not sure how this choice impacts the results for the AVs. It is not only a matter of web browser extensions. For example, BAFS (Block at First Sight, a very important Defender 0-day protection) does not work properly with Firefox. In this case it is rather a Firefox issue, because Defender is part of the Windows system.

But there is a more important issue related to Defender.
For some reason (probably due to the API used for the automation), BAFS does not work properly in most AVLab tests. When BAFS works, macros and files like EXE, JS, and VBS (downloaded directly from the Internet) are locked right after the download and checked against the cloud backend, so the files are detected mostly at the Pre-Launch level. That is how Defender works on most computers. In the AVLab tests, however, we can see the opposite results. I am not sure if Defender should be tested in such a scenario. It would be good to contact Microsoft to solve this issue.
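As a side note, readers can check whether BAFS is active on their own machine. Below is a minimal sketch that queries Defender's preferences from Python by shelling out to PowerShell; Get-MpPreference and the listed properties are real Defender cmdlet names, while the Python wrapper and the enabled-state logic are an illustrative assumption:

Python:
import json
import subprocess

# Query Microsoft Defender preferences via the built-in PowerShell cmdlet.
# DisableBlockAtFirstSeen = False means BAFS (Block at First Sight) is on.
result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-MpPreference | Select-Object DisableBlockAtFirstSeen,"
     "MAPSReporting,SubmitSamplesConsent | ConvertTo-Json"],
    capture_output=True, text=True, check=True,
)
prefs = json.loads(result.stdout)

# BAFS also needs cloud protection (MAPSReporting: 1=basic, 2=advanced)
# and sample submission (SubmitSamplesConsent: 1=safe samples, 3=all).
bafs_on = (not prefs["DisableBlockAtFirstSeen"]
           and prefs["MAPSReporting"] != 0
           and prefs["SubmitSamplesConsent"] in (1, 3))
print("BAFS effectively enabled:", bafs_on)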
 

Adrian Ścibor

From AVLab.pl
Andy Ful said:
OK. :)
I am not sure how this choice impacts the results for the AVs. It is not only a matter of web browser extensions. For example, BAFS (Block at First Sight, a very important Defender 0-day protection) does not work properly with Firefox. In this case it is rather a Firefox issue, because Defender is part of the Windows system.

But there is a more important issue related to Defender.
For some reason (probably due to the API used for the automation), BAFS does not work properly in most AVLab tests. When BAFS works, macros and files like EXE, JS, and VBS (downloaded directly from the Internet) are locked right after the download and checked against the cloud backend, so the files are detected mostly at the Pre-Launch level. That is how Defender works on most computers. In the AVLab tests, however, we can see the opposite results. I am not sure if Defender should be tested in such a scenario. It would be good to contact Microsoft to solve this issue.
It is not related to an API problem or to the automation; it is Firefox, or the Microsoft system, which does not want to support Firefox for some reason. Maybe this is intentional, or maybe it is an oversight that leaves Firefox with less security? I don't think a monopoly is a good thing, whether for the Windows system or for Chromium-engine browsers. We all need a choice of which browser we want to use. Microsoft Defender users should install another add-on, e.g. Malwarebytes Browser Guard or something similar.
 

Andy Ful

From Hard_Configurator Tools
Adrian Ścibor said:
It is not related to an API problem or to the automation; it is Firefox, or the Microsoft system, which does not want to support Firefox for some reason.
That is good. So, you have three choices to consider:
  1. Test a scenario that gives the proper results (for the most popular AV) on about 5% of computers.
  2. Change something to get the proper results (for all tested AVs) on 90+ % of computers.
  3. Skip Defender in the tests.
Even if you choose the first option, it will still be interesting to see Defender's protection without BAFS.
The AVLab tests will still be valuable for the other AVs.
 

SeriousHoax

Thanks for the test.
Don't take this the wrong way, but I have some serious doubts about the testing method, and a few more disclaimers would be nice, e.g. that not all products scan files on write or on access by default (Emsisoft, for example).

Firstly, what Andy pointed out is correct. MD's result here is not very accurate and won't match a real-world scenario in most cases. MD detecting only 18% of threats pre-launch is absurd for files downloaded from the internet. Either it is entirely due to MD's issue with Firefox, or the issue lies with the automated method you used.
Besides, you should try your best to fix your issue with Chrome. As much as we would like to see less of a monopoly, Chrome is still the most used browser on Windows, so it is only fair to use Chrome to make the test more similar to a real-world scenario. The second choice would be Edge, as it is the default browser, and Firefox should be the third and last choice. Firefox is also third in market share on Windows, behind Chrome and Edge I think, so that is the proper order IMO.
Bitdefender's result is also very unusual. Are you sure nothing went wrong during the test? Bitdefender is always one of the best in every test, including the MITRE evaluation for endpoint products. I cannot remember it missing this many threats in any test in recent years, especially when almost all other products scored 100%, which also includes your own previous tests. Even the not-so-good Webroot blocked 100%. I remember that in 2020/2021 MD had an issue in one of AV-Comparatives' tests where something went wrong. It was a bug, so in the end MD's result was not published. So it is not uncommon to have an odd bug, either on the AV's side or the test's.
Some popular vendors in the past even had issues with a lab's testing method, which were sorted out later between them. The AMTSO standard is accepted by all; I don't know how it is for avlab.pl.
Were all the AV vendors made aware of the test before or after it was done?
Also, using free AVs, paid AVs, and business AVs all in the same test is not ideal. Do something about that if possible. But we certainly appreciate your desire to be more transparent than other testing labs.
 

Zero Knowledge

Nice test. A new spin on the usual!

But I would never trust an AV/AM/EDR to recover from a known infection where I had been compromised. How could you trust your OS? What if the exploit or bug is still exploitable after remediation? When you know you're infected, or suspect you are, just BURN it down and reinstall the OS or recover from a backup. Even that does not guarantee your system is clean if you're targeted by an advanced attacker.
 

Andy Ful

From Hard_Configurator Tools
Testing some AVs in an automated scenario is very difficult, so we should give credit to AVLab's efforts.
The testing methodology has been significantly changed, and some time is required to solve all possible issues.
The AVLab tests are not included on the AMTSO webpage:
https://www.amtso.org/tests/
But the current testing methodology seems OK for most AVs.

I could not find information about which samples were tested. In previous tests, almost all samples were *.exe files, which make up only about 1/3 of malware in the wild. This can be a problem for real-world protection tests.
There is also an open question related to morphed samples. If morphed samples are allowed, the statistical variations are bigger, and many more samples are required to make the results statistically significant.
Even when morphed samples are not allowed, over 10000 samples are required to see a real difference between Microsoft Defender free and Kaspersky (KIS) with a confidence level of 95%.
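A rough sketch of where an estimate of that order can come from, using the standard two-proportion sample-size formula. The protection rates below (99.6% vs. 99.8%) are assumed purely for illustration, not measured values for the two products:

Python:
import math

def samples_needed(p1: float, p2: float, z_alpha: float = 1.96,
                   z_beta: float = 0.84) -> int:
    """Samples per product needed to distinguish protection rates p1 and p2
    (two-proportion z-test, 95% confidence, 80% power)."""
    pbar = (p1 + p2) / 2
    num = (z_alpha * math.sqrt(2 * pbar * (1 - pbar))
           + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

# Hypothetical protection rates: 99.6% vs. 99.8%.
print(samples_needed(0.996, 0.998))  # -> about 11,700 samples per product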
 

ErzCrz

The comparison table does contain the SHAs. The two below were the ones it failed on.

"adec73575e6741e99bd6ce8c92713e1618d5d861ab488440876fe0d87ea62e31"
"8d2f398982564f5b71a557b2250a9cf4d0e797f05678c4ffdb3872a9782ee245"
 

Adrian Ścibor

From AVLab.pl
SeriousHoax said:
Thanks for the test.
Don't take this the wrong way, but I have some serious doubts about the testing method, and a few more disclaimers would be nice, e.g. that not all products scan files on write or on access by default (Emsisoft, for example).

Firstly, what Andy pointed out is correct. MD's result here is not very accurate and won't match a real-world scenario in most cases. MD detecting only 18% of threats pre-launch is absurd for files downloaded from the internet. Either it is entirely due to MD's issue with Firefox, or the issue lies with the automated method you used.
Besides, you should try your best to fix your issue with Chrome. As much as we would like to see less of a monopoly, Chrome is still the most used browser on Windows, so it is only fair to use Chrome to make the test more similar to a real-world scenario. The second choice would be Edge, as it is the default browser, and Firefox should be the third and last choice. Firefox is also third in market share on Windows, behind Chrome and Edge I think, so that is the proper order IMO.
Bitdefender's result is also very unusual. Are you sure nothing went wrong during the test? Bitdefender is always one of the best in every test, including the MITRE evaluation for endpoint products. I cannot remember it missing this many threats in any test in recent years, especially when almost all other products scored 100%, which also includes your own previous tests. Even the not-so-good Webroot blocked 100%. I remember that in 2020/2021 MD had an issue in one of AV-Comparatives' tests where something went wrong. It was a bug, so in the end MD's result was not published. So it is not uncommon to have an odd bug, either on the AV's side or the test's.
Some popular vendors in the past even had issues with a lab's testing method, which were sorted out later between them. The AMTSO standard is accepted by all; I don't know how it is for avlab.pl.
Were all the AV vendors made aware of the test before or after it was done?
Also, using free AVs, paid AVs, and business AVs all in the same test is not ideal. Do something about that if possible. But we certainly appreciate your desire to be more transparent than other testing labs.
We will try something in the coming months. However, Edge and Chrome share the same engine problem, which we could not resolve before. That is why we decided to use Firefox, which is still a great browser, even if it is no longer at the top of the mainstream.

Andy Ful said:
Testing some AVs in an automated scenario is very difficult, so we should give credit to AVLab's efforts.
The testing methodology has been significantly changed, and some time is required to solve all possible issues.
The AVLab tests are not included on the AMTSO webpage:
https://www.amtso.org/tests/

At the beginning of February 2023 we decided to join AMTSO. The joining procedure will take some time, so please be patient. I hope this is a step forward for us and the community. I would be happy to tell you more about how AMTSO membership will benefit everyone and how it works behind the scenes. Transparency first, trust later. Maybe an AMA with AVLab in the future?

Andy Ful said:
But the current testing methodology seems OK for most AVs.

I could not find information about which samples were tested. In previous tests, almost all samples were *.exe files, which make up only about 1/3 of malware in the wild. This can be a problem for real-world protection tests.
You can find the SHAs by clicking "DOWNLOAD COMPARISON TABLE" on the Recent Results webpage.

Andy Ful said:
There is also an open question related to morphed samples. If morphed samples are allowed, the statistical variations are bigger, and many more samples are required to make the results statistically significant.
Even when morphed samples are not allowed, over 10000 samples are required to see a real difference between Microsoft Defender free and Kaspersky (KIS) with a confidence level of 95%.
Samples of this kind conflict with the AMTSO standard, I suppose. They are not recommended for inclusion in the test if they were created by manual obfuscation. At the moment we use our own malware sources (the sources are described transparently in STEP 3 of the methodology page: Methodology » AVLab Cybersecurity Foundation), but by joining AMTSO we will additionally gain access to the RTTL database. This should increase the number of malware samples available for the test. We have to wait a few more months until we have fully joined AMTSO.
 

Andy Ful

From Hard_Configurator Tools
It would be helpful if the "Threat landscape" statistics could include information about the file types used as the initial attack vector in the test.
For example (a toy classification sketch follows the list):
  • Malicious EXE files (.exe, .com, .scr).
  • All other files (including attack vectors via scripts, shortcuts, DLLs, documents, legitimate but vulnerable EXE files, LOLBins, etc.).
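If AVLab wanted to automate such a breakdown, samples could be bucketed by extension. The extension lists here are illustrative assumptions, not AVLab's categories:

Python:
from pathlib import Path

# Toy classification of initial attack vectors into the two buckets
# suggested above (extension lists are illustrative, not exhaustive).
EXECUTABLE = {".exe", ".com", ".scr"}

def bucket(sample: str) -> str:
    """Return the reporting bucket for a sample's file name."""
    ext = Path(sample).suffix.lower()
    return "malicious EXE" if ext in EXECUTABLE else "other vector"

print(bucket("invoice.scr"))  # -> malicious EXE
print(bucket("payload.js"))   # -> other vector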
 

cruelsister

Andy Ful said:
I could not find information about which samples were tested. In previous tests, almost all samples were *.exe files, which make up only about 1/3 of malware in the wild. This can be a problem for real-world protection tests.
There is also an open question related to morphed samples. If morphed samples are allowed, the statistical variations are bigger, and many more samples are required to make the results statistically significant.
Even when morphed samples are not allowed, over 10000 samples are required to see a real difference between Microsoft Defender free and Kaspersky (KIS) with a confidence level of 95%.
Indeed, malware coming in forms other than EXEs is underutilized in many tests. This is an issue, as a number of AM applications either ignore malicious scripts or lack the ability to detect them properly.

Also, perhaps less emphasis should be placed on sample quantity, to cut down on pointless detections of morphed samples, and more on the uniqueness of mechanism.
 

Adrian Ścibor

From AVLab.pl
From the point of view of static and dynamic analysis, a polymorphic file is still the same file: a copy of the previous one. So if you take, for example, 1000 original samples and add a few bytes to each with some tool to change the SHA256, they are still the same files, and you simply have 2000 files in total. They do nothing malicious beyond what they were originally programmed to do. Therefore, I don't think adding copies of the same files is a good recommendation.

For example, say:

You have the file A.exe. Add a few bytes, add obfuscation, and you have file B.exe with a different checksum. You can upload it to VirusTotal, see that the detection rate was 40/60 before and is now 1/60, and say: hey, this antivirus sucks!
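A minimal sketch of that effect: appending a single byte produces a completely different SHA256 while the original program logic is untouched (the file contents below are placeholders, not a real sample):

Python:
import hashlib

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"MZ...original A.exe bytes..."  # stand-in for A.exe content
morphed = original + b"\x00"                # "B.exe": one padding byte appended

print(sha256(original))  # one hash
print(sha256(morphed))   # entirely different hash, same program behavior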

What you don't actually know is that the engines on VirusTotal are usually not the same as the ones you have installed on your home or company PC:


Quote from the VirusTotal webpage (source: https://support.virustotal.com/hc/e...de-statistics-comparing-antivirus-performance):

VirusTotal's antivirus engines are command line versions, so depending on the product, they will not behave exactly the same as the desktop versions: for instance, desktop solutions may use techniques based on behavioral analysis and count with personal firewalls that may decrease entry points and mitigate propagation, etc.

If you want to read more about the use of VT engines, please read my analysis (sorry, in Polish only; please use machine translation): Dlaczego VirusTotal pokazuje wyniki skanowania inne niż w rzeczywistości? » AVLab.pl (in English: "Why does VirusTotal show scan results different from reality?")
 

Andy Ful

From Hard_Configurator Tools
We agree that including morphed samples in the test is not a good idea, for several reasons. So there is a question: how are these files (polymorphic, metamorphic, and oligomorphic) managed in the automated scenario?
Did Bitdefender miss 13 samples because of morphed malware, or were there 13 different malware samples?
 

ForgottenSeer 98186

People use the Firefox browser on Windows, so testing that configuration is completely legitimate, and the results are legitimate as well.

It is difficult to see the objection, as Microsoft Defender missed only 2 samples out of 447. In no way can the Microsoft Defender results be considered poor; the result is excellent. There should be no objection to the results, nor any recommendation not to test Microsoft Defender with such a test system configuration.

Changing the test methodology so that Microsoft Defender produces better results is considered "gaming the system" by AMTSO. They have discussed the type of objection raised in this thread many times over the years, and it is their official stance that it is against their recommendations.
 
