App Review Windows Defender vs Malware in 2021 (The PC Security Channel)

It is advised to take all reviews with a grain of salt. In extreme cases, some reviews use dramatization for entertainment purposes.

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,510
Y'all complain Leo isn't doing it correctly, while claiming independent labs, which do the exact same thing as him, are somehow "correct".
It is not so hard to see the difference, for example:

Making professional tests requires several people, resources, and time.
Shrinking the people, resources, and time = Leo's video.
 
Last edited:

EndangeredPootis

Level 10
Verified
Well-known
Sep 8, 2019
461
It is not so hard to see the difference, for example:

Making professional tests requires several people, resources, and time.
Shrinking the people, resources, and time = Leo's video.
Yet I see no difference in testing methods between Leo and independent labs; all that's different is that, as they are companies, they have access to more malware samples.
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,510
EndangeredPootis,

There are some common differences:
  1. Several AVs are tested against the same malware samples.
  2. Each AV is tested on a different machine because the AVs are tested at the same time.
  3. The system is reverted to the initial state before testing another sample.
  4. The test for any AV and any sample takes several minutes to allow the malware infection or allow the AV to remediate the attack.
  5. During each test, the system is monitored. After each test, the system logs and the logs of some additional monitoring tools are examined for signs of infection.
  6. The tests are repeated periodically with the same testing methodology.
  7. The results are examined statistically to find the awarded AVs.
I kindly assume that Leo has access to the samples already recognized by some authority as malware.
The more of these points are skipped in a test, the harder the analysis of the results becomes. In the case of Leo's "tests" all of these points are skipped, so reliable analysis is impossible.
These points are important, and that is why Leo's "tests" are only presentations.
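As a rough illustration of points 1-5 above, here is a minimal Python sketch of what an automated lab-style harness could look like. The function names (revert_snapshot, execute_sample, collect_logs) and the machine/sample names are hypothetical placeholders, not any real lab's tooling:

import time

# Hypothetical stubs for lab infrastructure; a real lab would talk to a
# hypervisor API and to log-collection agents instead of printing.
def revert_snapshot(machine):
    print(f"[{machine}] reverting to clean snapshot")

def execute_sample(machine, sample):
    print(f"[{machine}] executing {sample}")

def collect_logs(machine):
    # Would gather the system logs plus logs from extra monitoring tools.
    return {"machine": machine, "infected": False}

AV_MACHINES = ["vm-defender", "vm-kaspersky", "vm-bitdefender"]  # one VM per AV
SAMPLES = ["sample_001.exe", "sample_002.exe"]                   # pre-verified malware
DWELL_SECONDS = 5 * 60  # several minutes per sample; shorten when dry-running this sketch

results = []
for sample in SAMPLES:                 # every AV sees the same sample set (point 1)
    for machine in AV_MACHINES:        # one machine per AV, run in parallel in a real lab (point 2)
        revert_snapshot(machine)       # clean state before each sample (point 3)
        execute_sample(machine, sample)
        time.sleep(DWELL_SECONDS)      # time for infection or remediation (point 4)
        results.append(collect_logs(machine))  # inspect logs for signs of infection (point 5)

# Points 6-7: repeat the whole run periodically and evaluate the results statistically.
print(results)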
 
Last edited:

koloveli

Level 4
Well-known
Sep 13, 2012
191
When malware is already on the system, we don't talk about "prevention" anymore. Prevention of malware happens before the infection occurs.
Yes and no. 😁
Detecting malware is prevention, but not as complete or comprehensive a prevention as in the case of Comodo Internet Security, LMT antimalware... for example...
 

Kongo

Level 36
Thread author
Verified
Top Poster
Well-known
Feb 25, 2017
2,585
Yes and no. 😁
Detecting malware is prevention, but not as complete or comprehensive a prevention as in the case of Comodo Internet Security, LMT antimalware... for example...
Detecting and removing malware before it infects a system is prevention; after it has infected a system, it's simply not prevention anymore... But that's not the topic of this thread anyway. :)

Unbenannt.PNG
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,510
Yes and no. 😁
Detecting malware is prevention, but not as complete or comprehensive a prevention as in the case of Comodo Internet Security, LMT antimalware... for example...
Did Microsoft take over the Comodo AV?:unsure:
Please look here:
Believe me, most people on this thread know a lot about prevention, keyloggers, and the protection of Comodo, Keyscrambler, Zemana AntiLogger, and Spyshelter. If you are interested in a discussion, then you should open a separate thread.:)(y)
 
Last edited:

EndangeredPootis

Level 10
Verified
Well-known
Sep 8, 2019
461
EndangeredPootis,

There are some common differences:
  1. Several AVs are tested against the same malware samples.
  2. Each AV is tested on a different machine because the AVs are tested at the same time.
  3. The system is reverted to the initial state before testing another sample.
  4. The test for any AV and any sample takes several minutes to allow the malware infection or allow the AV to remediate the attack.
  5. During each test, the system is monitored. After each test, the system logs and the logs of some additional monitoring tools are examined for signs of infection.
  6. The tests are repeated periodically with the same testing methodology.
  7. The results are examined statistically to find the awarded AVs.
I kindly assume that Leo has access to the samples already recognized by some authority as malware.
The more of these points are skipped in a test, the harder the analysis of the results becomes. In the case of Leo's "tests" all of these points are skipped, so reliable analysis is impossible.
These points are important, and that is why Leo's "tests" are only presentations.
There's nothing to prove their tests even take place; they just say "hey, this product did (xyz) in this test" and expect us to take their word for it.

Even Eugene Kaspersky has expressed concerns about independent labs/testing sites.

Time and time again I bring this up, and I will do it again:

Let's take our favourite PUP/grayware product TotalAV. According to AV-Comparatives, it got a 90% detection ratio against malware when offline. How can this be? All it uses is the Avira engine, so it can't have access to their cloud, unless, of course, they are just making the numbers up. It even got a better score than Kaspersky, which "somehow" had worse scores ONLINE than the offline TotalAV. Doesn't that ring alarm bells for you? On top of that, it has a smaller performance impact and fewer false positives than Avira despite using their engine, hmm...

Also, according to av-test.org, AhnLab, a company I've never heard of before, somehow always gets top scores, and when looking at their site, they only have endpoint products. Are they testing business solutions against home products? Seems kinda unfair if you ask me, and upon further inspection, they seem to just be using the Bitdefender engine.
 
Last edited:
  • Like
Reactions: Kongo

ErzCrz

Level 22
Verified
Top Poster
Well-known
Aug 19, 2019
1,168
Did Microsoft take over the Comodo AV?:unsure:
Hehe.

It would be good to see MD tested with ConfigureDefender tweaks, or even with H_C, which would have prevented some of what ran in that test from executing in the first place. It was good to see that MD crashed his malware distributor script twice.

I like Comodo for what it is and I return to it occasionally, but I prefer built-in protection. I paid for this laptop, which includes the OS, and that OS should protect me as much as possible. The thing that annoys me is that, as a Win 10 Home user, I get less protection as standard compared to Pro. Thankfully tools like H_C, CD & FWH fill those gaps ;) I'd love to see Firewall Hardening improved to block new connections by default, but that's another topic...

MD does what it does pretty well and has improved a lot, particularly combined with the new Edge, your router firewall, and tools like uBO.

Keep safe - Erz
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,510
There's nothing to prove their tests even take place; they just say "hey, this product did (xyz) in this test" and expect us to take their word for it.
Good point. I believe that these tests are moderately faithful and that the possible "dirty tricks" caused by money are negligible on average over a period of one or two years. Please remember that there are many people involved in these tests. Some of them have already left the AV testing labs and work for other companies. It is improbable that so many people would not have told us the "hidden truth" after many years of testing. There are also several AV vendors whose products participate in these tests. Did you hear them saying that these tests are usually not faithful? I did not. There are some vendors (like Emsisoft, Cylance, etc.) that do not participate in tests because the testing methodology does not fit their protection model well. For example, Cylance can only detect PE files, and even if it detects 100% of PE malware, the result will not be good against fileless malware.
Even Eugene Kaspersky has expressed concerns about independent labs/testing sites.
I have similar concerns, but they can be overcome in a limited way by including many tests over a period of one or two years. These concerns are also addressed by the awards. As you can see, there are usually several AVs in the top awarded group. It means that the differences between these AVs are statistically insignificant (they should be thought of as offering similar protection).
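To illustrate why small gaps between the top-awarded AVs are usually statistically insignificant, here is a minimal Python sketch using a two-proportion z-test; the sample counts are invented for the example, not taken from any real report:

from math import sqrt

# Hypothetical example: AV "A" blocks 396 of 400 samples, AV "B" blocks 390 of 400.
blocked_a, n_a = 396, 400
blocked_b, n_b = 390, 400

p_a, p_b = blocked_a / n_a, blocked_b / n_b

# Two-proportion z-test (normal approximation) for the difference in protection rates.
p_pool = (blocked_a + blocked_b) / (n_a + n_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_a - p_b) / se

print(f"A: {p_a:.1%}, B: {p_b:.1%}, z = {z:.2f}")
# Here z is about 1.6; |z| < 1.96 means the gap is not significant at the 5% level,
# so on this sample size the two AVs should be treated as offering similar protection.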
 

Andy Ful

From Hard_Configurator Tools
Verified
Honorary Member
Top Poster
Developer
Well-known
Dec 23, 2014
8,510
Hehe.

It would be good to see MD tested with ConfigureDefender tweaks
As you wish :)
Leo's presentations:
ConfigureDefender HIGH (minus ASR rules):



ConfigureDefender MAX:



AV-Comparatives Enterprise tests (ConfigureDefender HIGH, minus ASR rules), for example:
Cumulative results:

MRG Effitas 360 Assessment & Certification Programme (ConfigureDefender MAX)
Cumulative results:

In all these tests, Defender is compared with business versions of AVs that have ATP features. It is not a top AV, but it can compete with several good commercial AVs.
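For readers unfamiliar with what such presets roughly involve, here is a minimal Python sketch (driving the built-in Set-MpPreference cmdlet, assuming an elevated prompt on Windows) showing a few of the cloud-related Defender settings that tools like ConfigureDefender adjust. It only illustrates the mechanism; it is not the exact HIGH or MAX preset:

import subprocess

# A few Defender cloud-protection settings, applied via the built-in PowerShell cmdlet.
# Illustrative subset only; ConfigureDefender presets cover more (ASR rules, etc.).
settings = [
    "Set-MpPreference -MAPSReporting Advanced",            # enable cloud protection
    "Set-MpPreference -SubmitSamplesConsent SendAllSamples",
    "Set-MpPreference -CloudBlockLevel High",               # more aggressive cloud blocking
    "Set-MpPreference -CloudExtendedTimeout 50",            # let the cloud analyze longer
    "Set-MpPreference -PUAProtection Enabled",
]

for cmd in settings:
    subprocess.run(["powershell.exe", "-NoProfile", "-Command", cmd], check=True)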
 
Last edited:

upnorth

Level 68
Verified
Top Poster
Malware Hunter
Well-known
Jul 27, 2015
5,458
True, the real way of any malware infection is never like that: running a bunch of samples one after another. But it makes a huge difference when you test only a few samples, monitor the samples' behavior, and wait at least 5 minutes before executing the next sample. YouTube testers usually never do that, because almost nobody would want to watch 30-60 minutes of full-length malware testing video. Obviously, it's useless to complain about YouTube testers, but there are always the same discussions on every new malware test video 🥱

To stay on the topic of MD, the truth is in the middle, as almost always. MD isn't really as weak as it looks in this video, but on default settings (which matter the most, because average users rarely tweak MD to improve protection) it's far from invincible... My personal tests confirm that sometimes MD, even on max protection settings, ends up infected by just one malware sample; of course, that also happens sometimes with most paid AVs 😉
Who is this guy!? Give him a beer. :emoji_beer:🥳

11-12-38.gif
But absolutely not always. I've always been more curious about how some of his tested products' UI (user interface) looks, or how some modules/features react in certain situations, but it sadly becomes a mess when it's tested like this, and I haven't seen any actual improvement since I saw his first video, probably a year ago. At least no improvement that I personally find interesting enough. On the other hand, member @silversurfer is extremely spot on, because Leo's views/clicks would drop way too much. Good thing there exists something called a "Mouse Pointer" 🦧 and I can personally always watch something else whenever I feel like it. :coffee:🍪🍸🍹

Btw, thanks @SecureKongo for the share. (y)
 

blackice

Level 39
Verified
Top Poster
Well-known
Apr 1, 2019
2,867
Since I am no malware analyst, all I need from an AV is for it to alert me to a threat. At the first real sign it's not a false positive, I'm going to reimage, since I don't have the skills necessary to verify proper remediation. Defender fits this use case most of the time, along with second-opinion scanners.
 

SeriousHoax

Level 49
Verified
Top Poster
Well-known
Mar 16, 2019
3,862
Let's take our favourite PUP/grayware product TotalAV. According to AV-Comparatives, it got a 90% detection ratio against malware when offline. How can this be? All it uses is the Avira engine, so it can't have access to their cloud, unless, of course, they are just making the numbers up. It even got a better score than Kaspersky, which "somehow" had worse scores ONLINE than the offline TotalAV
I have seen you use this line so many times, but your assumption is very wrong. Most AV enthusiast members here know that Avira has pretty good signatures; it has always been one of their strong points for many years. So Avira and TotalAV with a 90% detection rate while offline is not surprising at all. What you may wonder about is the age of the malware tested. Maybe the samples were not fresh enough, so Avira, Avast, and BD scored 90.3%, 93.4% and 96.8% respectively in the offline test. Also, don't be surprised by TotalAV's higher number than Kaspersky in the online protection test either. AVs that use Avira signatures, like TotalAV and F-Secure, can also use Avira's cloud, and Kaspersky doesn't push signature updates to the device as frequently as the three products mentioned above. Avira releases signatures about 8 times a day, BD about 5-6 times, and Avast is always downloading all types of signatures, false-positive fixes, etc. through its stream updates.
So it's very easy to understand what is what.
The topic of this thread, Microsoft Defender, is highly cloud-dependent, and you'll probably always see it having a lower offline detection rate than a suspicious greyware AV like TotalAV as long as the latter is using Avira's signatures.
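A rough back-of-the-envelope sketch in Python of why update frequency matters for offline detection; the uniform-interval assumption is a simplification, and the per-day figures are simply the ones quoted above:

# If a vendor ships N signature updates per day at roughly even intervals, a machine
# that just went offline carries signatures that are, on average, about half an
# interval old. Real release schedules are not uniform; this is only an approximation.
def average_signature_age_hours(updates_per_day):
    interval = 24 / updates_per_day
    return interval / 2

for vendor, per_day in [("Avira", 8), ("Bitdefender", 5.5), ("hypothetical 2/day vendor", 2)]:
    print(f"{vendor}: ~{average_signature_age_hours(per_day):.1f} h average signature age")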
 
