Raiden

Level 13
Verified
Content Creator
The same old topic that gets spammed in this community with a new discussion at least once a week; if people used the search feature, it would spare us the frustration of saying the same thing over and over.

Fact is Windows Defender performance has improved with each Windows 10 build, but it still lags behind third party solutions.

I've been comparing Windows Defender to Kaspersky Free on laptops and desktops for months now, and the difference is easily noticeable.

But it all comes down to how you're going to use the PC; in some cases Windows Defender will be barely noticeable (but never superior to third parties in any way; equal at most, I'd say).

However, if you use the computer for games, work, etc., you'll notice the performance hit from Windows Defender with ease.

Add to this the fact that Windows Defender still has exclusion bugs (it fails to comply with exclusion rules, keeps nagging you over false positives, and deletes your files on top of that). False positives are another problem with Windows Defender: a constant frustration where I repeatedly had to submit my code to Microsoft for review so they would whitelist the files, or else I wouldn't be able to deploy and use it across companies.

Any unknown, unsigned file will trigger Windows Defender. For someone like me who writes software, scripts, etc. for work and even personal use, it gets in the way a lot (compared to Kaspersky Free, which never nagged me once on any machine).

As a bonus, when compiling code with Visual Studio, Windows Defender adds a delay of minutes, compared to seconds with Kaspersky Free on the same machine.
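The compile-time overhead described above is easy to measure for yourself. A minimal sketch of such a comparison, assuming you run the same build twice with different AVs active (the build command shown in the comment is a placeholder, not a specific project):

```python
import subprocess
import time

def time_command(cmd):
    """Run a command and return its wall-clock duration in seconds."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

# Hypothetical usage: time a clean rebuild with each AV enabled and compare.
# elapsed = time_command(["msbuild", "MySolution.sln", "/t:Rebuild"])
```

Repeating the measurement a few times and using a clean rebuild each run helps keep compiler caching from skewing the comparison.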

I'll be looking forward to what the new Windows 10 Build has to offer, however I don't expect miracles.
Very good post and I agree 100%.

I do have to admit it does get a little tiring how we seem to go around in this endless circle every time this topic comes up. I think this topic has been discussed very thoroughly on this forum, and the end result is pretty much always the same. WD has made significant improvements on both the protection front and its overall performance. Does that mean it's perfect? No, it definitely still has its issues, but then again no product is perfect. My advice will always be to try the program(s) for yourself and decide if it meets your needs or not. There is way more to a program than how it scores in a test. Everyone has different computer specs (both hardware and software), as well as differing usage, hence why it's always important to try the program for yourself. ;)

Totally agree with you @Local Host, especially this:

(y)

In fact, I need to make a note to myself to abstain from replying to threads like this! This thread is not the only example, by any means.
I have to agree and like you, I will try to be more mindful when these threads come up. :)
 

blackice

Level 10
Verified
It isn’t the worst idea to revisit the performance of a product, especially one like WD. New people may be looking for recent information; a thread on WD from two years ago may not be the most relevant anymore when performance has improved noticeably. And all forums have their circular discussions. At the same time, it is a repetitive topic and needs rest between relevant changes.
 

TairikuOkami

Level 23
Verified
Content Creator
At the same time it is a repetitive topic and needs rest between relevant changes.
Maybe it would not be a bad idea to create a pinned topic like Windows Defender 1809, then Windows Defender 1903, etc. Especially considering that discussing WD might go off track if one person is using 1709 and another 1903. E.g. WD's detection noticeably improved since 1803/1809, and the same can go for performance. Other AVs do not face such an issue, since people are always using their latest version.
 

Andy Ful

Level 48
Verified
Trusted
Content Creator
For me, after looking at several benchmarks and using WD for 4 years:
  1. It is far behind third-party AVs when copying many files and installing/uninstalling most applications. The same goes if someone tries to compile executables, compare files in folders (binary comparison), etc.
  2. It is usually very fast (one of the best) when launching most applications, downloading most files, web browsing, and gaming. For some very suspicious files it can be slower due to Cloud-delivered protection (up to 10 seconds on default settings).
  3. There is a well-known cause of slowdowns (non-optimized file caching), so everything that depends on it will be slower.
If a user experiences WD differently, it means there is some kind of incompatibility, probably due to hardware drivers. On some computers WD can be slow and the cause is not known; it is not related to processor speed or the amount of RAM.
 

shmu26

Level 83
Verified
Trusted
Content Creator
I just don't understand why WD's slowness or lack of it is a topic to discuss. People have it on their systems already. Just try it. If you find it fast, use it, and if you find it slow, don't use it. Why isn't that reasonable?
 

Andy Ful

Level 48
Verified
Trusted
Content Creator
I just don't understand why WD's slowness or lack of it is a topic to discuss. People have it on their systems already. Just try it. If you find it fast, use it, and if you find it slow, don't use it. Why isn't that reasonable?
(y)
People still have a problem with treating WD as a normal AV, which may or may not be the right solution for them.:giggle:
 

monkeylove

Level 3
These tests should be taken with a grain of salt, not just for Defender but for all security products. A couple of points about his performance tests.

First, these products are being tested in a VM, so the system is already running slower than if it were run on the actual hardware. VMs are important for malware testing, but aren't needed for performance testing, as you aren't running any malware samples IMO.

Second, performing benchmarks using a benchmarking tool is irrelevant IMO. In this day and age, these tools can measure things so finely that we humans really won't be able to notice any difference whatsoever. I honestly don't think someone would be able to accurately tell the difference between a score of 750 and 720, for example, if asked whether they notice a difference between two systems right in front of them, with no scores to show the difference.
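To put those example numbers in perspective, the gap between scores of 750 and 720 can be expressed as a relative difference (illustrative arithmetic only; the scores are the hypothetical ones from the paragraph above, not from any real test):

```python
def relative_gap(a, b):
    """Relative difference between two benchmark scores,
    as a percentage of the higher score."""
    return abs(a - b) / max(a, b) * 100

print(f"{relative_gap(750, 720):.1f}%")  # prints "4.0%"
```

A roughly 4% gap on a synthetic benchmark is well below what most people can perceive in everyday use, which is the point being made here.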

When it comes to performance testing, there are way too many variables IMO to accurately get a straight answer. The way I look at it is: if you have to run a benchmarking tool to see/measure any differences, then there isn't a performance difference, as you cannot see one without running a testing tool. ;)
FWIW, the question was raised in the thread title, and it was answered. Whether or not the numbers are relevant was actually addressed in the videos.