App Review: The Shadowra's Battlefield - Antivirus 2021

It is advised to take all reviews with a grain of salt. In extreme cases some reviews use dramatization for entertainment purposes.
Content created by
Shadowra

Shadowra

Level 36
Thread author
Verified
Top Poster
Content Creator
Malware Tester
Well-known
Sep 2, 2021
2,586
Hello and welcome to the antivirus tournament!
This is the last video of 2021; after this one, I will be off with my family.

You have chosen your 10 favorite antivirus programs, and these same programs are going to face each other in a merciless duel!
Who will be elected number one?

Attention: the ranking was already decided during editing.



Antivirus tested: Avast, Avira, Microsoft Defender, WiseVector, Bitdefender, Kaspersky, ESET, F-Secure, Trend Micro, Norton.

Emsisoft could not participate because its trial version had already been used... impossible to activate it.

Next video: January 2022 ;) (after the holidays)

Shooting time: 10 hours
Editing time: 1 hour
Upload: 6 minutes

Edit: The settings are at their defaults! Only detection of PUPs and other potentially unwanted threats has been enabled.
 

Shadowra

Level 36
Thread author
Verified
Top Poster
Content Creator
Malware Tester
Well-known
Sep 2, 2021
2,586
@Shadowra What happened to your channel? Many of your videos do not appear anymore.

I invite you to go and see this topic


I had to create a new channel, and for security reasons (and because of the Battlefield series) I did not re-upload everything.
 

harlan4096

Super Moderator
Verified
Staff Member
Malware Hunter
Well-known
Apr 28, 2015
8,910
Some comments on what I see in the malware pack:
  • As in most video tests, many samples can't even run (and probably are not even malware).
  • Other samples are probably legit (some even with a valid digital signature), or are considered legit by some AV firms but not by others. These are probably apps bundled with adware/PUP/PUA, where installation in many cases depends on the user's selection, which is why some AV firms don't include them in their detections.
Also, it would be interesting to differentiate in the final Second Opinion results:
  1. How many of the detections of the Second Opinion Scanners are actually resident and running (rather than just inactive leftovers/remnants, see point 2) -> System Infected
  2. How many of the detections of the Second Opinion Scanners are just inactive (not detected in the original sample folder) -> System Not Clean
In the Hub, before running the SOS, I delete all the original samples (those that are not actively running, of course). In the final Second Opinion Scanner results I don't count original samples that remained but are not actively running, nor original samples that probably remained but whose payload was blocked/detected dynamically. A rough sketch of this active/inactive split is shown below.
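A minimal sketch of that split, assuming the second-opinion scanner's flagged file paths have been collected into a list (the paths below are hypothetical). It needs the third-party psutil package and only catches detections that back a running process, so it is a starting point, not a full triage:

```python
import psutil  # third-party: pip install psutil

def classify_detections(flagged_paths):
    """Split scanner detections into 'active' (backing a running process)
    and 'inactive' (leftovers/remnants sitting on disk only)."""
    flagged = {p.lower() for p in flagged_paths}
    running = set()
    for proc in psutil.process_iter(["exe"]):
        exe = proc.info.get("exe")
        if exe and exe.lower() in flagged:
            running.add(exe.lower())
    return sorted(running), sorted(flagged - running)

active, inactive = classify_detections([
    r"C:\Users\test\AppData\Local\Temp\dropper.exe",  # hypothetical detection
    r"C:\ProgramData\leftover.dll",                   # hypothetical detection
])
print("System Infected" if active
      else "System Not Clean" if inactive
      else "System Clean")
```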

Different method, different criteria he he :)
 

Gandalf_The_Grey

Level 83
Verified
Honorary Member
Top Poster
Content Creator
Well-known
Apr 24, 2016
7,256
Of course :)

Kaspersky 10
Wisevector 9
Trend Micro 8
Avast 7
Norton 6
Bitdefender 5
Microsoft Defender 4
Eset 3
Avira 2
F-Secure 1 😉
@Shadowra I enjoyed watching your video and appreciate all the work that has gone into it.

Microsoft Defender did great; its sole weakness in this test was phishing URLs. Clean system.
Eset did a fantastic job, no weaknesses in this test. Clean system.
Avira was the surprise of this test; it did a respectable job, but did not leave a completely clean system.
F-Secure did a fantastic job; like Eset, no weaknesses and a clean system.

Why did you rank them like you did?

It looks to me that Eset and F-Secure should share first place and Microsoft should take third, since those were the only ones with a clean system, with Avira in fourth place.
 

Shadowra

Level 36
Thread author
Verified
Top Poster
Content Creator
Malware Tester
Well-known
Sep 2, 2021
2,586
@Gandalf_The_Grey said:

Why did you rank them like you did?

It looks to me that Eset and F-Secure should share first place and Microsoft should take third, since those were the only ones with a clean system, with Avira in fourth place.

Thanks for your comment :)

Microsoft is 4th because Eset, Avira and F-Secure have better anti-phishing handling.
I also look at the remaining malware, and especially at the cleanup.

Eset handled phishing and the pack test better than Defender, including the cleanup (Defender always runs into that bug during the scan).

For Avira, its strong point is APC, which saved it several times; the same goes for F-Secure.

I hope I answered your question :)
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,510
I do not think that using a few on-demand scanning tools for scoring the infections is a good idea.
  1. The anti-phishing test is inconclusive. If a phishing website is not blocked, that can simply mean it was no longer malicious (which is not checked in the test). For example, it could have been compromised only for a while and initially undetected by any AV. Later, some AVs could blacklist the URL after a real infection, while others could skip it after checking that it was already clean (a system protected by that AV was never hit). In that case the test results are backwards: the AVs that were compromised in the wild get good test results, and the AVs that were never hit get bad results (a sketch of a URL re-check that would flag such cases follows below).
  2. Kaspersky left 3 samples on the Desktop, a few leftovers in the Registry (probably PUA), one unpacked executable in the TEMP folder, and one in a custom folder. The file leftovers seemed to be blocked by Kaspersky, because there were no active processes in KillSwitch.
I am not sure whether Kaspersky allowed a real infection in the test - this would require deeper inspection. So we cannot truly say that the last AV in the test is not as good as the first.:unsure:
Generally, modern AVs work in a complex way, and it is often not easy to be sure about an infection, especially when the AV fights the malware at the post-execution level.
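A minimal sketch of such a re-check, assuming the tester kept the list of phishing URLs used in the run (the URL below is a placeholder). It uses the third-party requests package and only tests reachability; a page that still answers could have been cleaned in the meantime, so a "live" result means the case can be scored, not that the page is still malicious:

```python
import requests  # third-party: pip install requests

def url_still_live(url, timeout=10):
    """Return True only if the URL still answers HTTP 200 with a body."""
    try:
        r = requests.get(url, timeout=timeout, allow_redirects=True)
        return r.status_code == 200 and len(r.content) > 0
    except requests.RequestException:
        return False  # DNS failure, refused connection, timeout: site is gone

for url in ["http://example.com/phish"]:  # placeholder, not a real sample URL
    verdict = "score normally" if url_still_live(url) else "inconclusive"
    print(url, "->", verdict)
```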

The AV testing labs use a special environment that can monitor and log suspicious actions, so they can reliably recognize Indicators of Compromise (IoCs). It is sometimes very hard to find IoCs without such an environment.
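Without a lab environment, one cheap IoC check a home tester can still do is diffing autorun locations against a clean baseline. A minimal sketch (Windows-only, standard-library winreg) that covers just the two classic Run keys; real persistence hunting would need many more locations, such as everything Sysinternals Autoruns enumerates:

```python
import winreg  # Windows-only, standard library

RUN_KEYS = [
    (winreg.HKEY_CURRENT_USER,  r"Software\Microsoft\Windows\CurrentVersion\Run"),
    (winreg.HKEY_LOCAL_MACHINE, r"Software\Microsoft\Windows\CurrentVersion\Run"),
]

def autorun_entries():
    """Collect name -> command entries from the classic Run keys."""
    entries = {}
    for hive, path in RUN_KEYS:
        try:
            with winreg.OpenKey(hive, path) as key:
                i = 0
                while True:
                    try:
                        name, value, _ = winreg.EnumValue(key, i)
                        entries[f"{path}\\{name}"] = value
                        i += 1
                    except OSError:
                        break  # no more values in this key
        except OSError:
            continue  # key absent or not readable
    return entries

baseline = autorun_entries()  # snapshot taken on the clean system
# ... run the malware pack, then compare:
added = {k: v for k, v in autorun_entries().items() if k not in baseline}
print("New autorun entries (possible IoCs):", added or "none")
```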

But congrats on taking up the challenge. Some people can probably learn something about the tested AVs and about testing problems.:)(y)
 

WiseVector

From WiseVector
Verified
Top Poster
Developer
Well-known
Dec 14, 2018
643
Thanks for the test. This reminds us that we need to improve WVSX to fight phishing URLs.

Malware testing is never an easy task, especially with a large volume of samples, as it involves two important questions:

1. Are these samples actually malicious?
2. Are these malicious samples still functional?

The criteria for determining whether a piece of software is malicious are not quite the same for each vendor. ESET eventually found 3 PUPs after testing WVSX. The first PUP (Slimware DriverUpdate) we had analyzed manually and decided not to detect. The second PUP is incodesolutions RemoveIT, which appears to be abandoned; we could not find a download link on their official website. For the third PUP, all the information we can find on the Internet is six years old; if you can send us the sample, we will analyze it and draw conclusions. Also, WVSX does not detect corrupted files. I noticed in the video that Windows Defender detects corrupted files when you execute them; I can't understand this, because corrupted files won't run and therefore cannot harm the system.
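One way a tester could weed such files out of a pack is a basic structural sanity check before execution. A minimal sketch, assuming the samples are Windows PE executables (the path is hypothetical); it only validates the MZ magic and the PE signature, so a file that passes can still be broken in subtler ways:

```python
import struct

def looks_like_valid_pe(path):
    """Rough check: 'MZ' magic, plausible e_lfanew, then the PE signature."""
    try:
        with open(path, "rb") as f:
            header = f.read(0x40)                      # DOS header is 64 bytes
            if len(header) < 0x40 or header[:2] != b"MZ":
                return False
            (e_lfanew,) = struct.unpack_from("<I", header, 0x3C)
            f.seek(e_lfanew)
            return f.read(4) == b"PE\x00\x00"
    except OSError:
        return False

print(looks_like_valid_pe(r"C:\samples\sample.exe"))  # hypothetical path
```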

Malware often has a short life cycle; many samples will no longer produce malicious behavior after their C&C servers die, and C&C servers are usually short-lived, so it is better for testers to use fresh samples, the fresher the better. During testing you can use a variety of tools to determine whether the malware is still functional, such as Process Hacker, TCPView, IDA, Autoruns, etc. For example, if a sample always sits at 0% CPU usage, it may be dormant because certain conditions are not met, such as geographic location or the software installed on the system. If the malware's network connections get stuck in the SYN_SENT state, its C&C servers are probably down. In our daily work we also execute fresh malware samples to test the effectiveness of WVSX; if a sample refuses to perform its malicious behavior, we reverse engineer it to determine what conditions are needed to trigger it.
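Those two heuristics (idle CPU, connections stuck in SYN_SENT) can be scripted instead of eyeballed in Process Hacker/TCPView. A minimal sketch using the third-party psutil package; the PID is hypothetical, and a quiet process may simply be stalling on purpose, so this gives a hint, not a verdict:

```python
import psutil  # third-party: pip install psutil

def sample_activity(pid):
    """Heuristic liveness verdict for a running sample's process."""
    proc = psutil.Process(pid)
    cpu = proc.cpu_percent(interval=1.0)      # measure CPU over one second
    conns = proc.connections(kind="inet")
    if any(c.status == psutil.CONN_SYN_SENT for c in conns):
        return "C&C likely down (connection stuck in SYN_SENT)"
    if cpu == 0.0 and not conns:
        return "possibly dormant (0% CPU, no network activity)"
    return "active"

# print(sample_activity(1234))  # hypothetical PID of an executed sample
```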

Based on this test video, I think there are a lot of corrupted, gray, and even legit files in the pack, so "attack all" isn't really a good approach.
 

harlan4096

Super Moderator
Verified
Staff Member
Malware Hunter
Well-known
Apr 28, 2015
8,910
Also, WVSX does not detect corrupted files. I noticed in the video that Windows Defender detects corrupted files when you execute them; I can't understand this, because corrupted files won't run and therefore cannot harm the system.

This is typical behavior we can see very often in VirusTotal detections. For years I have been sending Kaspersky malware detected by BD (the engine and its clones) and by other AV firms, and the Kaspersky analysts' verdict was that those files were corrupted 🤷‍♂️
 
