AVLab.pl Microsoft Defender - pros and cons (November 2020)

Disclaimer
  1. This test shows how an antivirus behaves with certain threats, in a specific environment and under certain conditions.
    We encourage you to compare these results with others and make informed decisions on which security products to use.
    Before buying an antivirus you should consider factors such as price, ease of use, compatibility, and support. Installing a free trial version allows an antivirus to be tested in everyday use before purchase.

Gandalf_The_Grey

Level 47
Verified
Trusted
Content Creator
Apr 24, 2016
3,621
Ver 1909, build 18363.1198, and Hyper-V's running. Tried core isolation and got an "incompatibility with your device" problem. It's probably because of Avast, so I might try again when I uninstall the AV.
If you update to one of the last two builds (2004 or 20H2, I prefer 20H2) you get a report of what driver is blocking core isolation.
With that information you can use Driver Store Explorer to delete old drivers and/or if necessary, delete the last one or two incompatible drivers yourself.
 

ForgottenSeer 89360

The argument about "Protection" is invalid in this case. Microsoft provides many layers of "protection", such as core isolation, exploit prevention, SmartScreen, parental controls and firewall. All these features are totally independent from Microsoft Defender.
Defender itself is merely an antivirus + ransomware protection.
Due to this fact it's wrong to even use "protection" as a term when talking about Defender. The right term is raw malware detection.

Defender offers great malware detection that would hardly fail for the average user, even if no configuration utilities were used. Remediation of threats includes removal of all files created by malware, as well as removal of registry entries/keys. Common settings "touched" by malware will be restored to safe defaults.
A rescue environment is also offered, allowing users to remove stubborn malware by booting into a separate minimal environment.
The engine provides great classification for people looking to find out what exactly hit them.

However, the product comes with a significant performance hit. Software launch and installation cause >40% CPU spikes on my high-end system. After uninstalling a third-party AV (pretty much any AV with the exception of Trend Micro) and falling back to Defender, day-to-day tasks feel much more sluggish.

The product has too many FPs. Upon detecting a threat, a weird behaviour can be observed. First, Defender prompts the user to take an action. But once you bring up the UI and try to exclude the file, by the time you select the action, click "Apply" and answer the UAC prompt, Defender has already made the choice for you and removed the file.

Configuration is beyond minimal unless you either use a utility provided here, or you dive into the group policy editor.
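For reference, most of those hidden settings are also scriptable: Defender exposes them through the Get-MpPreference/Set-MpPreference PowerShell cmdlets (those cmdlets and the PUAProtection parameter are real; the Python wrapper below is just a hypothetical convenience, shown with dry_run=True so it only builds the command and is safe to inspect anywhere):

```python
import subprocess

def set_defender_pref(name: str, value: str, dry_run: bool = True):
    """Build (and optionally run) a Set-MpPreference call via PowerShell.

    Actually applying a setting requires an elevated prompt on Windows;
    with dry_run=True this only returns the command list for inspection.
    """
    cmd = ["powershell", "-NoProfile", "-Command",
           f"Set-MpPreference -{name} {value}"]
    if not dry_run:
        subprocess.run(cmd, check=True)
    return cmd

# e.g. enable detection of potentially unwanted applications
print(set_defender_pref("PUAProtection", "Enabled"))
```

The same pattern works for any other Set-MpPreference parameter; utilities like the one mentioned above are essentially friendlier front-ends over these cmdlets and the corresponding Group Policy entries.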

So to put it simply: will Defender detect just as much malware as the other big names? Yes it will, and in some cases it might outperform them. Is the overall performance of Microsoft Defender better than the others? No, it is definitely not.
 

ForgottenSeer 89360

I suspect that there can be no measurable difference between most of the popular AVs, except for targeted attacks. If one takes a look at AVLab reports, all AVs have the same protection (differences are smaller than the measurement errors). This can follow from the fact that AVLab honeypots catch mostly large-scale attacks (with a little delay). The biggest AV testing labs have a much larger infrastructure and can catch more 0-day malware used in targeted attacks - so we see more missed samples in their reports.
If so, then the results of most AV tests are not relevant for home users.

I also suspect that the stellar results of Norton can come from corporate honeypots. Simply put, Norton catches the malware used in targeted attacks on Norton-protected enterprises before the AV testing labs do. So, Norton has an advantage because it has the biggest market share. This is an advantage for the test results, independently of the number of endpoints that were infected in the targeted attacks. Of course, Norton protection is probably one of the best, even if the test results are slightly distorted.
One could say that WD/MSD has an even bigger advantage, but that is not true, because the widely deployed consumer WD/MSD is not the same product as the paid MSD ATP.

Anyway, the above thoughts & speculations are far from something that could be proved.(y)
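The measurement-error point can be sketched numerically. With a few hundred samples per test (the sample count and scores below are hypothetical), a normal-approximation binomial confidence interval shows that protection scores a fraction of a percent apart are statistically indistinguishable:

```python
import math

def detection_ci(detected: int, total: int, z: float = 1.96):
    """95% normal-approximation confidence interval for a detection rate."""
    p = detected / total
    margin = z * math.sqrt(p * (1 - p) / total)
    return max(0.0, p - margin), min(1.0, p + margin)

# Hypothetical test: 380 samples, AV A detects 378 (99.5%), AV B detects 376 (98.9%)
lo_a, hi_a = detection_ci(378, 380)
lo_b, hi_b = detection_ci(376, 380)
print(f"AV A: {lo_a:.3f}-{hi_a:.3f}")
print(f"AV B: {lo_b:.3f}-{hi_b:.3f}")
# The two intervals overlap, so the 0.6-point gap is within measurement error.
```

By this logic, only repeated, consistent gaps across many test rounds (as in the cumulative tables later in this thread) carry any signal.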
The information provided about Norton here is wrong. Once upon a time, Norton really had effective marketing, was backed by the strongest company in the industry, and had great protection and a large market share. This is no longer the case. Even before the Broadcom disaster, Norton technologies didn't evolve one bit over the last few years. The product does well against executables, where it can apply reputation analysis, but protection against anything else is almost non-existent. The biggest market share now, in both the corporate and consumer fields, is held by McAfee.

They have the most effective partner network of all the companies, and their endpoint portfolio is largely preferred by IT and security admins, due to its high efficiency, lack of noise and the flexibility to deploy/remove additional components in the shape of plug-ins.
 

ForgottenSeer 89360

What is your source about AV market share?
I don't trust Statista and Opswat to deliver accurate information about security software market share. I trust information declared by the vendors themselves, according to which Norton and Panda actually have the smallest slice of the pie at the moment. My observations are also based on news that I constantly read, as well as analysis of market share, revenue and other factors. If you do a search on Google, you'll realise that both Broadcom Symantec's and NortonLifeLock's sales are declining continuously. Both companies consistently lay off staff and lose market share. This can be confirmed by looking at the price point of their products - it's obvious that the company is desperate for sales.

Statista doesn't disclose anything about their data sources and I don't find their observations accurate. The fact that they still listed the company as "Symantec", when it's now Broadcom and NortonLifeLock, and didn't split it into two separate entities, shows their lack of competence.

My statement that Norton's protection is now beyond inferior can be proven with a video if anybody doubts it. The resources Broadcom and NortonLifeLock put into threat detection are being decreased year after year. Symantec relies heavily on machine learning now, as this is the cheapest approach. This leads to very inaccurate protection with many false negatives and positives, as ML is not a universal cure. In terms of R&D, due to low profit in the last few years, Symantec is almost at rock bottom.
The fact that McAfee was looking to acquire NortonLifeLock, also proves where both companies stand.

NortonLifeLock registered $616 million in revenue in 2020.
The Symantec enterprise business generated $400 million for Broadcom, but has remained flat. This makes a total of roughly $1 billion generated by the two entities in a year, whilst McAfee generated a whopping $1.4 billion in just the first half of 2020, amid the Covid crisis.

According to NortonLifeLock's corporate responsibility report found here https://www.nortonlifelock.com/cont...y/2020-corporate-responsibility-report-en.pdf
50 million customers are using NortonLifeLock. That's a far cry from what's been declared in McAfee's factsheet found here: https://www.mcafee.com/enterprise/en-us/assets/fact-sheets/fs-mcafee-fact-sheet.pdf
 

Andy Ful

Level 69
Verified
Trusted
Content Creator
Dec 23, 2014
5,888
...
I also suspect that the stellar results of Norton can come from corporate honeypots. Simply put, Norton catches the malware used in targeted attacks on Norton-protected enterprises before the AV testing labs do. So, Norton has an advantage because it has the biggest market share.
...
Anyway, the above thoughts & speculations are far from something that could be proved.(y)
Symantec was the top vendor from October 2019 according to OPSWAT data, but not before. For example, in January 2019 Symantec was far from the top vendors (3.75%), yet it still had stellar results in AV-Test reports. So, my speculations are not supported by the market share data. :(
 

ForgottenSeer 89360

Symantec was the top vendor from October 2019 according to OPSWAT data, but not before. For example, in January 2019 Symantec was far from the top vendors (3.75%), yet it still had stellar results in AV-Test reports. So, my speculations are not supported by the market share data. :(
Let's put it simply: companies normally reinvest a certain percentage of annual profit in R&D. Broadcom sees Symantec as nothing more than a cash generator, so the resources they reinvest are kept to the absolute minimum. Even if they did have the biggest market share, this still wouldn't make any difference, as the money is just spread across various banks and not utilised for an effective core operation. Broadcom and NortonLifeLock have gone so low now that they don't even publish threat write-ups and security reports. All the security intelligence that Symantec once used to publish has been wiped away.

There might be many reasons why Norton gets fantastic reports, but they are just speculations of mine. It might be that labs collect only executables, where Norton shines, due to a technology it released back in June 2008. Newer, more sophisticated threats require technologies in which Symantec didn't invest.
 

Andy Ful

I don't trust Statista and Opswat to deliver accurate information about security software market share.
...
Understood. Anyway, by looking at the OPSWAT statistics from several months in the years 2018-2019, I found out that the stellar results of Norton in AV-Test reports cannot be supported by the market share argument.
Norton missed only one 0-day sample according to AV-Test reports in the years 2016-2020. o_O
 

ForgottenSeer 89360

Understood. Anyway, by looking at the OPSWAT statistics from several months in the years 2018-2019, I found out that the stellar results of Norton in AV-Test reports cannot be supported by the market share argument.
Norton missed only one 0-day sample according to AV-Test reports in the years 2016-2020. o_O
I'm having a hard time trusting this "1 missed 0-day sample" thing. If I install Norton now, in 30 minutes I will discover at least 5 samples not detected by any layer of Norton's protection. I don't know what exactly is going on there, but the results are misleading and inaccurate.
 

Andy Ful

I'm having a hard time trusting this "1 missed 0-day sample" thing. If I install Norton now, in 30 minutes I will discover at least 5 samples not detected by any layer of Norton's protection. I don't know what exactly is going on there, but the results are misleading and inaccurate.
So do I (as you already know).:)
 

Andy Ful

SE Labs tests April 2018 - June 2020 (July-September 2020 results added):
the overall level of protection, making no distinction between neutralized and blocked incidents.

Norton Security ..........................100 + 100 + 100 + 100 + 100 + 100 + 100 + 099 + 100 + 099 = 998
Kaspersky Internet Security ......099 + 100 + 100 + 097 + 100 + 100 + 100 + 100 + 100 + 100 = 996
Trend Micro Internet Security ...099 + 098 + 100 + 099 + 100 + 100 + 100 + 100 + 100 + 099 = 995
F-Secure Safe .............................100 + 099 + 100 + 100 + 100 + 100 + 098 + 100 + 099 + 097 = 993
ESET Internet Security................099 + 099 + 099 + 099 + 100 + 099 + 098 + 100 + ??? + 099 <= 992
Microsoft Windows Defender ...097 + 100 + 100 + 099 + 100 + 099 + 098 + 099 + 100 + 097 = 989
Avira Free Security Suite ............094 + 097 + 097 + 097 + 099 + 098 + 097 + 099 + 096 + 097 = 971
McAfee Internet Security ...........099 + 086 + 095 + 100 + 099 + 099 + 096 + 097 + 100 + 098 = 969
AVG Antivirus Free Edition ........094 + 093 + 092 + 097 + 097 + 098 + 098 + 100 + 099 + 099 = 967
Avast Free Antivirus ...................091 + 091 + 093 + 099 + 098 + 098 + 098 + 099 + 099 + 099 = 965


Other AVs did not participate in all tests, but their scores could be close to WD or worse.
Comodo Internet Security ...........??? + ??? + ??? + ??? + ??? + 100 + 100 + 099 + 097 + 099 <= 995
Sophos Home Premium ..............??? + ??? + ??? + ??? + 098 + 100 + 098 + 100 + 100 + 095 <= 991
Bitdefender Internet Security ......095 + ??? + ??? + ??? + ??? + ??? + ??? + ??? + ??? + ??? <= 995
Check Point ZoneAlarm ..............091 + ??? + ??? + ??? + 094 + 099 + 097 + 097 + 095 + ??? <= 973
eScan Internet Security Suite ......??? + ??? + ??? + 092 + 096 + ??? + ??? + ??? + ??? + ??? <= 988
G-Data Internet Security .............??? + 089 + 092 + 085 + 092 + 095 + ??? + 095 + 097 + ??? <= 945
Bullguard Internet Security .........??? + ??? + ??? + 094 + 087 + ??? + ??? + ??? + ??? + ??? <= 981
Webroot Antivirus ......................067 + 088 + 084 + ??? + ??? + 081 + 096 + 081 + 095 + 098 <= 890

So, it is clear that WD/MSD results (default settings) are better than any free AV (except maybe CIS) and better than 50% of the popular commercial AVs tested by SE Labs.
When we take the cumulative results of AV-Test, AV-Comparatives, and SE Labs then only Norton has consistently better results in all these tests. This is another argument for very similar detection of most AVs in the home environment. The Norton case is exceptional.
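The totals and "<=" upper bounds in the table can be reproduced mechanically. A minimal sketch (scores transcribed from two rows of the table; None stands for a "???" entry, each assumed to be at most 100):

```python
def cumulative(scores, max_score=100):
    """Sum known scores; treat unknown (None) entries as at most max_score.

    Returns (total, exact): exact is False if any score was missing,
    in which case total is an upper bound ("<=" in the table).
    """
    known = sum(s for s in scores if s is not None)
    missing = sum(1 for s in scores if s is None)
    return known + missing * max_score, missing == 0

# Rows transcribed from the table above
norton = [100, 100, 100, 100, 100, 100, 100, 99, 100, 99]
comodo = [None, None, None, None, None, 100, 100, 99, 97, 99]

print(cumulative(norton))  # (998, True)   -> exact total
print(cumulative(comodo))  # (995, False)  -> "<= 995"
```

Treating every missed test as a perfect 100 is the most generous assumption possible for the partial participants, which is why their upper bounds can still sit above WD's exact 989.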

Post updated.
 

ForgottenSeer 89360

SE Labs tests April 2018 - June 2020, overall level of protection, making no distinction between neutralised and blocked incidents.

Norton Security ..........................100 + 100 + 100 + 100 + 100 + 100 + 100 + 99 + 100 = 899
Kaspersky Internet Security ........99 + 100 + 100 + 97 + 100 + 100 + 100 + 100 + 100 = 896
Trend Micro Internet Security .....99 + 98 + 100 + 99 + 100 + 100 + 100 + 100 + 100 = 896
F-Secure Safe .............................100 + 99 + 100 + 100 + 100 + 100 + 98 + 100 + 98 = 895
Microsoft Windows Defender ....97 + 100 + 100 + 99 + 100 + 99 + 98 + 99 + 100 = 892
ESET Internet Security................ 99 + 99 + 99 + 100 + 99 + 98 + 100 ? < 894
Avira Free Security Suite ............94 + 97 + 97 + 97 + 99 + 98 + 97 + 99 + 96 = 874
McAfee Internet Security........... 99 + 86 + 95 + 100 + 99 + 99 + 96 + 97 + 100 = 871
AVG Antivirus Free Edition........ 94 + 93 + 92 + 97 + 97 + 98 + 98 + 100 + 99 = 868
Avast Free Antivirus................... 91 + 91 + 93 + 99 + 98 + 98 + 98 + 99 + 99 = 866


Other AVs did not participate in all tests but their scores could be close to WD or worse.
Comodo Internet Security ? ? ? ? ? 100 + 100 + 99 + 97 <= 896
Sophos Home Premium ? ? ? ? 98 + 100 + 98 + 100 + 100 <= 896
Bitdefender Internet Security 95 ? ? ? ? ? ? ? ? <= 895
Check Point ZoneAlarm 91 + ? ? ? + 94 + 99 + 97 + 97 + 95
eScan Internet Security Suite ? ? ? 92 + 96 ? ? ? ? <= 888
G-Data Internet Security ? 89 + 92 + 85 + 92 + 95 ? 95 + 97
Bullguard Internet Security ? ? ? 94 + 87 ? ? ? ?
Webroot Antivirus 67 + 88 + 84 + ? ? + 81 + 96 + 81 + 95

So, it is clear that WD/MSD results (default settings) are better than any free AV (except maybe CIS) and better than 60% of popular commercial AVs tested by SE Labs.
When we take the cumulative results of AV-Test, AV-Comparatives, and SE Labs then only Norton has consistently better results in all these tests. This is another argument for very similar detection of most AVs in the home environment. The Norton case is exceptional.
I created a piece of malware myself tonight to check this amazing 0-day performance many of them supposedly have; only Kaspersky was good enough to block it. They should approve the thread tomorrow.

Malware piece 1: [screenshot attachment]

Malware piece 2: [screenshot attachment]

My test proves a few points:
Nobody can score 100% on 0-days.
Machine learning, as smart as it is, is also stupid. Human-written generic detections based on math models are far more potent than automated algorithms.
And last, but not least, Kaspersky has the best engine around.

In the above case, Kaspersky has developed an algorithm that's able to detect the "gibberish" language that occurs once I obfuscated a few lines of PowerShell code. Everybody else, including Microsoft Defender with their long and complicated stories about machine learning and how they block threats before others, failed.
 

Andy Ful

I created a piece of malware myself tonight to check this amazing 0-day performance many of them supposedly have; only Kaspersky was good enough to block it. They should approve the thread tomorrow.

Malware piece 1: [screenshot attachment]

Malware piece 2: [screenshot attachment]

My test proves a few points:
Nobody can score 100% on 0-days.
Machine learning, as smart as it is, is also stupid. Human-written generic detections based on math models are far more potent than automated algorithms.
And last, but not least, Kaspersky has the best engine around.
You cannot count on VT detection. In many cases the samples were blocked by AVs after execution even when they were not detected on VT.
 

ForgottenSeer 89360

What do your samples do in the system?
The sample imitates an Emotet loader; in fact I got inspired by Emotet, which Microsoft claims to have blocked with a machine learning model in seconds... there was a post of that sort on their blog, if I am not mistaken. Hope the Emotet team doesn't come after me with copyright claims 😆

So I downloaded a malicious sample and uploaded it to a benign website (won't disclose all the details for security reasons). This way I bypassed web filter blacklists.
The Emotet loader simulator uses BitsTransfer (Emotet uses System.Net.WebClient) to download the malicious file and write it to the Desktop. Writing to the Desktop decreases machine learning sensitivity, as opposed to writing to the temp folder or somewhere else.
To make things a bit more interesting, I used a hex editor to slightly modify the downloaded malicious file, which bypassed any reputation technologies. Finally, PowerShell executes the sample. To decrease machine learning sensitivity throughout the whole process, on the second sample I removed attributes such as hidden window, no exit and others.
I used a tool widely available on the web to obfuscate the code, just like the Emotet creators do. It hasn't been encoded with Base64 (unlike Emotet) but has been concatenated, which makes it unreadable to humans and also bypasses signatures and heuristics. To bypass the execution policy, I ran the code as an argument, not as a script.
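The concatenation trick is easy to illustrate with a harmless sketch: a naive scanner that matches literal signature strings sees nothing in the source text, even though the fragments reassemble into the flagged string at runtime (the signature and strings here are made up for illustration, not taken from any real product):

```python
def naive_signature_scan(text: str, signatures: list) -> list:
    """Return every signature that appears literally in the text."""
    return [sig for sig in signatures if sig in text]

signatures = ["Invoke-WebRequest"]  # a hypothetical string-based signature

# The script's source only ever contains the fragments...
source_text = '$cmd = "Invoke-" + "Web" + "Request"'
# ...but at runtime the fragments concatenate into the full string.
runtime_value = "Invoke-" + "Web" + "Request"

print(naive_signature_scan(source_text, signatures))    # [] -> scan misses it
print(naive_signature_scan(runtime_value, signatures))  # ['Invoke-WebRequest']
```

This is why static signatures and simple heuristics fail here: only engines that normalise or emulate the script before matching (which is presumably what Kaspersky's detection did) see the assembled string.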
It took me less than 20 minutes to do all that, and as a result a few of them have failed already. The malicious sample was a variant of the NanoCore RAT (known for its privilege escalation) and was successfully executed in all test cases. There was not even a UAC prompt. Avast's IDP kicked in and removed NanoCore, but wasn't smart enough to correlate it to my loader. Defender and Malwarebytes did nothing. Kaspersky detected everything upfront (on VirusTotal); ESET detected the first one, probably due to attributes commonly used by malware. That's on VirusTotal again.
 

Andy Ful

The sample imitates an Emotet loader; in fact I got inspired by Emotet, which Microsoft claims to have blocked with a machine learning model in seconds... there was a post of that sort on their blog, if I am not mistaken. Hope the Emotet team doesn't come after me with copyright claims 😆

So I downloaded a malicious sample and uploaded it to a benign website (won't disclose all the details for security reasons). This way I bypassed web filter blacklists.
The Emotet loader simulator uses BitsTransfer (Emotet uses System.Net.WebClient) to download the malicious file and write it to the Desktop. Writing to the Desktop decreases machine learning sensitivity, as opposed to writing to the temp folder or somewhere else.
To make things a bit more interesting, I used a hex editor to slightly modify the downloaded malicious file, which bypassed any reputation technologies. Finally, PowerShell executes the sample. To decrease machine learning sensitivity throughout the whole process, on the second sample I removed attributes such as hidden window, no exit and others.
I used a tool widely available on the web to obfuscate the code, just like the Emotet creators do. It hasn't been encoded with Base64 (unlike Emotet) but has been concatenated, which makes it unreadable to humans and also bypasses signatures and heuristics. To bypass the execution policy, I ran the code as an argument, not as a script.
It took me less than 20 minutes to do all that, and as a result a few of them have failed already. The malicious sample was a variant of the NanoCore RAT (known for its privilege escalation) and was successfully executed in all test cases. There was not even a UAC prompt. Avast's IDP kicked in and removed NanoCore, but wasn't smart enough to correlate it to my loader. Defender and Malwarebytes did nothing. Kaspersky detected everything upfront (on VirusTotal); ESET detected the first one, probably due to attributes commonly used by malware. That's on VirusTotal again.
Such an attack will bypass many security solutions. The modified 0-day loader does not itself do anything overtly malicious, so it can be stopped by WD with some ASR rules, but not with default settings. Generally, home AV versions are not good at fighting such attacks. Although KIS (default settings) probably has better protection than most home AVs, it can fail in many cases too. There are so many possibilities that no generic algorithm can be sufficiently good.
 