AV-Comparatives May 2017 test. Windows Defender did great!

Game Of Thrones

Level 5
Verified
Well-known
Jun 5, 2014
220
I don't think the Hub's goal is to prove whether a product is good or not.
The real goal is to test a product against malware you might find in your mailbox this evening, for example, and to see how it reacts, especially dynamically.
At least here there is no money factor that can taint the test with business interests.
Maybe there is no money factor, but there is a fanboyism factor! Even I could be a vendor's employee doing shady things, from submitting the Hub's samples the moment I see them to destroying other brands' names in people's minds, and so on. Once information goes public, anyone can alter it.
 

Windows Defender Shill

Level 7
Verified
Well-known
Apr 28, 2017
326
Also

If I were a G-Data user and fan, I would not fret over their exclusion,

because they are essentially just a German version of Bitdefender anyway.
 

mekelek

Level 28
Verified
Well-known
Feb 24, 2017
1,661
Ok

But I didn't just make that up: G Data - Wikipedia
They are using the Bitdefender engine; just look at the detection names on VT.
Well, I tested G Data for a day and it reported an AVA engine and a G Data engine. That wiki is probably outdated, or AVA doesn't stand for the Avira engine :D

Also, F-Secure uses BD signatures, and G Data had more signature detections from its AVA engine than F-Secure did from BD (within a five-minute window, with both updated at that moment).
 
Last edited:

Winter Soldier

Level 25
Verified
Top Poster
Well-known
Feb 13, 2017
1,486
Maybe there is no money factor, but there is a fanboyism factor! Even I could be a vendor's employee doing shady things, from submitting the Hub's samples the moment I see them to destroying other brands' names in people's minds, and so on. Once information goes public, anyone can alter it.
I get your point, but fanboys can't alter the test results, which are accessible to all of us.
The methodology can be criticized because it isn't formalized the way the official labs' is, but you don't need a degree in computer science or engineering to run a malware test.
"Test it yourself" is the best way to understand whether something works, and at the same time there is always room to improve.
Sometimes I write malware analyses, mainly of non-obfuscated .NET samples. Is my analysis universally accepted? Absolutely not, but the work demonstrates commitment and, of course, a serious awareness of my limits.
 

Nightwalker

Level 24
Verified
Honorary Member
Top Poster
Content Creator
Well-known
May 26, 2014
1,339
Well, I tested G Data for a day and it reported an AVA engine and a G Data engine. That wiki is probably outdated, or AVA doesn't stand for the Avira engine :D

Also, F-Secure uses BD signatures, and G Data had more signature detections from its AVA engine than F-Secure did from BD (within a five-minute window, with both updated at that moment).

AVA doesn't stand for the Avira engine; G-Data still uses Bitdefender along with its in-house engine (CloseGap) and a proprietary behavior blocker.
You can check the detection names on VirusTotal, and you can see Bitdefender files when you have G-Data installed.

Example:
Antivirus scan for 0d980ad166c2b204cbdae448f483aba341ceb00a15074deca5f17c6a90d91d45 at 2017-06-15 11:49:56 UTC - VirusTotal
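For anyone who wants to check those detection names without opening the report page by hand, here is a minimal sketch, assuming you have your own VirusTotal API key and use the public v3 files endpoint; the hash is the one from the link above, and the key placeholder and output formatting are just illustrative.

Code:
import requests

API_KEY = "YOUR_VT_API_KEY"  # assumption: your own VirusTotal API key
SHA256 = "0d980ad166c2b204cbdae448f483aba341ceb00a15074deca5f17c6a90d91d45"

# Fetch the file report from the VirusTotal v3 REST API.
resp = requests.get(
    f"https://www.virustotal.com/api/v3/files/{SHA256}",
    headers={"x-apikey": API_KEY},
    timeout=30,
)
resp.raise_for_status()
results = resp.json()["data"]["attributes"]["last_analysis_results"]

# Print each engine's verdict; shared signature names (e.g. BitDefender-style
# names showing up under GData or F-Secure) stand out when listed side by side.
for engine, verdict in sorted(results.items()):
    print(f"{engine:25s} {verdict['category']:12s} {verdict.get('result')}")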
 

mekelek

Level 28
Verified
Well-known
Feb 24, 2017
1,661
AVA doesn't stand for the Avira engine; G-Data still uses Bitdefender along with its in-house engine (CloseGap) and a proprietary behavior blocker.
You can check the detection names on VirusTotal, and you can see Bitdefender files when you have G-Data installed.
My bad then. It's still weird that they would name it like that and get different manual results; maybe F-Secure's own signatures made the difference.
 

Solarquest

Moderator
Verified
Staff Member
Malware Hunter
Well-known
Jul 22, 2014
2,525
These tests you claim are better are done the same way: the samples are not fresh, and false positives shouldn't be a factor since they don't change the end result.
Yes, I somewhat agree that in some tests the person who tests sooner will have a harder time than those who test later, because some samples get nulled out by signatures.
But since these official AV tests don't define exactly how the testing is done, for all you know they wait longer before testing the products that pay more, just to get better results.
The samples here are sometimes outdated or false positives, and sometimes they are not even harmful. Another problem is TIME: the products are not all tested at the same moment; someone tests one hour later, someone else a day later. Does that sound logical to you? Add to that the time the samples sat on the reference site before the thread owner downloaded them. Another problem is the sites and sources the person who posts the samples downloads them from. As I have said many times, if a simple user like us can access these samples, an AV company can access them too and add definitions for them sooner than the others, or update its cloud behavior detection for them. Another problem is the dynamic tests: the dynamic tests here are full of flaws. Some modules in AVs are connected to each other, and for some products you can hardly test the dynamic capability at all, yet you still see people testing them that way. I myself follow the Hub, but just for fun, not for referencing it, choosing a product, or recommending one to others. The Hub leans somewhat toward certain AV vendors I'm not going to name, but it's not good to talk about that. There are other problems, but I think this is enough.

Just a very quick comment from my side.

Our samples are very fresh, mostly 0-1 days old, detected by fewer than 25 engines on VirusTotal.
Tests are almost always done within 24 hours of the samples becoming available on the forum.
False positives are possible (a proportionally low number), but as said before they don't affect the final result.
If an AV continuously fails and leaves the system infected, that is a very bad sign; even worse if it still misses after some time (one day or more, in the case of special older samples uploaded to test the behavior-blocker detection of an AV).
Our tests show how well and how quickly AVs update their signatures, and how well their non-signature-based modules protect against NEW and unknown malware.
In our tests the detection rate is merciless; it is what it is. Do you really believe the real detection rate of an AV is close to 100%, as shown by these "official" test companies? Really?
AV vendors can access the samples from the Hub, as well as the same malware sources we use. Why shouldn't they? Anyway, if you check our tests regularly, you'll see that probably only one vendor does it, and that all AVs still fail, even the ones with incredibly good static detection.
Do you know how old the samples used by AV-Comparatives are? Are they so fresh? An AV that misses samples older than 24 hours is doing something wrong, certainly if they were checked on VT, since AV companies get copies of them.
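To make that "detected by fewer than 25 engines" rule concrete, here is a minimal sketch of how a hunter could pre-check a sample hash before posting it, assuming your own VirusTotal API key and the public v3 files endpoint; the threshold of 25 is the Hub rule discussed in this thread, and the function name and output are hypothetical.

Code:
import requests

API_KEY = "YOUR_VT_API_KEY"   # assumption: your own VirusTotal API key
MAX_DETECTIONS = 25           # Hub rule: don't post samples flagged by more engines

def ok_to_post(sha256: str) -> bool:
    """Return True if fewer than MAX_DETECTIONS engines flag the file as malicious."""
    resp = requests.get(
        f"https://www.virustotal.com/api/v3/files/{sha256}",
        headers={"x-apikey": API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    stats = resp.json()["data"]["attributes"]["last_analysis_stats"]
    malicious = stats.get("malicious", 0)
    print(f"{sha256[:16]}... flagged by {malicious} engines")
    return malicious < MAX_DETECTIONS

# Example: ok_to_post("0d980ad166c2b204cbdae448f483aba341ceb00a15074deca5f17c6a90d91d45")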
 

silversurfer

Level 85
Verified
Honorary Member
Top Poster
Content Creator
Malware Hunter
Well-known
Aug 17, 2014
10,154
The samples here are sometimes outdated or false positives, and sometimes they are not even harmful. Another problem is TIME: the products are not all tested at the same moment; someone tests one hour later, someone else a day later. Does that sound logical to you? Add to that the time the samples sat on the reference site before the thread owner downloaded them. Another problem is the sites and sources the person who posts the samples downloads them from. As I have said many times, if a simple user like us can access these samples, an AV company can access them too and add definitions for them sooner than the others, or update its cloud behavior detection for them. Another problem is the dynamic tests: the dynamic tests here are full of flaws. Some modules in AVs are connected to each other, and for some products you can hardly test the dynamic capability at all, yet you still see people testing them that way. I myself follow the Hub, but just for fun, not for referencing it, choosing a product, or recommending one to others. The Hub leans somewhat toward certain AV vendors I'm not going to name, but it's not good to talk about that. There are other problems, but I think this is enough.

Why don't you want to participate in the Hub? You could explain a more correct testing procedure to all the testers.

BTW, the work of all the testers here deserves more respect, because it's voluntary work done without payment!
 

RoboMan

Level 35
Verified
Top Poster
Content Creator
Well-known
Jun 24, 2016
2,400
Once again, labs guide us on what is worth testing and what is worth discarding. Once again, it depends on us, as always. I would venture to affirm that any AV tested here can suit our systems if it is configured and complemented in a decent way. If we're not downloading cracked torrents or testing malware on our live systems, any software mentioned will do its job without failing.
 

S3cur1ty 3nthu5145t

Level 6
Verified
May 22, 2017
251
The samples here are sometimes outdated or false positives, and sometimes they are not even harmful. Another problem is TIME: the products are not all tested at the same moment; someone tests one hour later, someone else a day later. Does that sound logical to you? Add to that the time the samples sat on the reference site before the thread owner downloaded them. Another problem is the sites and sources the person who posts the samples downloads them from. As I have said many times, if a simple user like us can access these samples, an AV company can access them too and add definitions for them sooner than the others, or update its cloud behavior detection for them. Another problem is the dynamic tests: the dynamic tests here are full of flaws. Some modules in AVs are connected to each other, and for some products you can hardly test the dynamic capability at all, yet you still see people testing them that way. I myself follow the Hub, but just for fun, not for referencing it, choosing a product, or recommending one to others. The Hub leans somewhat toward certain AV vendors I'm not going to name, but it's not good to talk about that. There are other problems, but I think this is enough.
The samples provided here are pulled from upload sites, where they are uploaded as they are discovered, meaning they are in the wild. Not only are they in the wild, but every sample is vetted for detection on VT, and the hunters are not allowed to post a sample if it is detected by more than 25 engines. They are also vetted as working, i.e. not corrupted.

As for the methodology, it is basic and allows users to tailor the test to the product, unlike the testing labs.

As for time, none of the samples are true zero-days, which is the only case in which timing would be a pressing issue.

These volunteers certainly deserve respect; they do not have to give up their time doing this.
 
Last edited:

Game Of Thrones

Level 5
Verified
Well-known
Jun 5, 2014
220
Thanks for the answers and clarifications. I have respect for our malware hunters, and I didn't mean to say anything offensive about the work being done there. I just gave my opinion, and if it was disrespectful to any of our members, I am sorry. You said some logical things, and it's a debate (of course, not here :) ). Of course we can solve some of the Hub's problems, for example by setting a time limit for testing (4 hours, for example), guiding people on how they should dynamically test certain products, and banning dynamic tests of products that cannot be tested that way (with guidance on the settings to use for testing).
About some vendors detecting the Hub's samples and some not: in my opinion, the quality of a sample matters. Of course a product should detect all threats, but some products focus on the samples that are actually in the wild among users. IMO, some samples found on forums are not in the wild, and detecting them is not as important as detecting a quality sample that is affecting users in the real world. Quality doesn't just mean the sample is well coded; it also means how many users are seeing it in the real world, whether it is really harmful, whether it is a PUP or not, and so on. My problem is with fanboys and with people who use the Hub as a reference (as you said, maybe one vendor is doing what I described with the Hub's samples; thanks for that point). Referencing the Hub is OK, but not for showing people what they should choose.
Just a very quick comment from my side.

Our samples are very fresh, mostly 0-1 days old, detected by fewer than 25 engines on VirusTotal.
....

Why don't you want to participate in the Hub? You could explain a more correct testing procedure to all the testers.

BTW, the work of all the testers here deserves more respect, because it's voluntary work done without payment!
 
  • Like
Reactions: mekelek

mekelek

Level 28
Verified
Well-known
Feb 24, 2017
1,661
Thanks for the answers and clarifications. I have respect for our malware hunters, and I didn't mean to say anything offensive about the work being done there. I just gave my opinion, and if it was disrespectful to any of our members, I am sorry. You said some logical things, and it's a debate (of course, not here :) ). Of course we can solve some of the Hub's problems, for example by setting a time limit for testing (4 hours, for example), guiding people on how they should dynamically test certain products, and banning dynamic tests of products that cannot be tested that way (with guidance on the settings to use for testing).
About some vendors detecting the Hub's samples and some not: in my opinion, the quality of a sample matters. Of course a product should detect all threats, but some products focus on the samples that are actually in the wild among users. IMO, some samples found on forums are not in the wild, and detecting them is not as important as detecting a quality sample that is affecting users in the real world. Quality doesn't just mean the sample is well coded; it also means how many users are seeing it in the real world, whether it is really harmful, whether it is a PUP or not, and so on. My problem is with fanboys and with people who use the Hub as a reference (as you said, maybe one vendor is doing what I described with the Hub's samples; thanks for that point). Referencing the Hub is OK, but not for showing people what they should choose.
To be fair, if you tried to state specifically what you mean by dynamic testing, you wouldn't be able to; and even if you could, most cases would be debatable.

The current way samples are tested is fine IMO, and everyone should weigh the results for particular products as they see fit.
 

EASTER

Level 4
Verified
Well-known
May 9, 2017
145
Not only has Windows improved its signature game recently,

but the Creators Update is a GAME CHANGER for security.

With the block options in SmartScreen and the "Applications from the Windows Store only" option in the Apps settings,

Windows can now be ultra secure without third-party help.

Personally, I wouldn't go quite that far just yet on "without third-party help" (although the direction is clear), but in my recent testing with the current Windows 10 CU build I will agree it is moving ever closer to looking like a true GAME CHANGER.

And I expect that is exactly where things are headed with the new WD.

Take that any way you like, but indeed it's looking better and better so far, I have to agree.
 

Faybert

Level 24
Thread author
Verified
Top Poster
Well-known
Jan 8, 2017
1,318
Lockdown darko999: AV-Comparatives' response to the absence of G Data from the tests:

In order to ensure that our test results give a complete and accurate picture of a product’s capabilities, AV-Comparatives has strict rules about which tests every product must take part in, and which tests are optional. This is to ensure that the aggregated results best reflect the actual protection the tested products provide to their customers. G DATA is currently not included in our public main-test reports because they declined to participate in some of the mandatory tests (i.e. Real-World Protection Test).
 
F

ForgottenSeer 58943

I'd love to use G Data, but I found it heavier and alert-prone.

I miss eXtendia AVK; that was basically a US-licensed version of G Data with Kaspersky and Reliable Antivirus (RAV) built in. Very effective, and it had a fantastic interface with a ton of drill-down.

I just read another article saying Windows Defender still has serious vulnerabilities. Also, the other AVC test released for performance showed Windows Defender was the heaviest product tested on systems. I'd never recommend an AV built into an OS, and some of the worst infections we find are on WD-protected machines.
 

Quassar

Level 12
Verified
Well-known
Feb 10, 2012
585
I don't trust their tests because they don't correspond to reality... Avast with full detection?... That AV is so poor that even in Poland kids don't want to install it for free, because it sucks so much.
 
