These tests should be taken with a grain of salt, and not just for Defender but for all security products. A couple of points about his performance tests.
First, these products are being tested in a VM, so the system is already running slower than it would on real hardware. VMs are important for malware testing, but they aren't needed for performance testing, since you aren't running any malware samples IMO.
Second, performing benchmarks using a benchmarking tool is irrelevant IMO. In this day and age, these tools can measure things so finely that we humans won't be able to notice any difference whatsoever. I honestly don't think someone could accurately tell the difference between a score of 750 and a score of 720, for example, if they were asked to spot a difference between two systems right in front of them with no scores shown.
When it comes to performance testing, there are way too many variables IMO to accurately get a straight answer. The way I look at it is: if you have to run a benchmarking tool to see or measure any difference, then there isn't a real performance difference, because you cannot notice one without running a testing tool.
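To put some rough numbers behind the "750 vs 720" point: a toy simulation (all values hypothetical, chosen only for illustration) shows how two identical systems can produce benchmark scores 30 points apart purely from run-to-run noise:

```python
import random
import statistics

random.seed(42)  # reproducible demo

# Hypothetical benchmark: both "systems" have the same true score of 735,
# with assumed run-to-run noise of +/- 25 points (background tasks,
# thermals, caching, etc.).
def run_benchmark(true_score=735, noise=25):
    return true_score + random.uniform(-noise, noise)

runs_a = [run_benchmark() for _ in range(10)]
runs_b = [run_benchmark() for _ in range(10)]

print(f"System A mean: {statistics.mean(runs_a):.0f}, "
      f"range: {min(runs_a):.0f}-{max(runs_a):.0f}")
print(f"System B mean: {statistics.mean(runs_b):.0f}, "
      f"range: {min(runs_b):.0f}-{max(runs_b):.0f}")
```

With these assumed noise levels, individual runs from the *same* machine can land anywhere from ~710 to ~760, so a single 750-vs-720 comparison between two machines tells you very little on its own.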