Burrito

Level 24
Yeah, they cheated the test.
Symantec tried cheating as well, but Sophos was ahead of them; they are the sponsor XD.
That's why I'm interested in more exploit tests,
and in doing my own as well.

Didn't get a response yet from Effitas about the samples I requested.

BTW, does anyone know of a good test lab that doesn't cheat or mask the products behind ABC letters?

Yeah.

What we really need is a good testing protocol.

I wonder where we could find one of those.......... maybe........ RIGHT HERE.

I think you'd have more luck getting samples by going through the steps here at MT for inclusion as a tester in The Hub.

I think the labs will probably not send them to you.

And I look forward to any testing that you do.

(y)
 

plat1098

Level 22
Verified
Again... Secure and Security are not the same.

Again? With due respect, I understand those semantics, OK? However, my point was and always will be: How could you claim and market your operating system as secure, with such a weak link (as per the comparatives) as Defender? That's simply it. Ancient history now, my bringing this up was anecdotal.

No one needs to sell me on Defender or defend it to me. It was simply within the context of placing a subjective value on these comparatives at any given time.

This should be a place to express your opinion either way, hopefully without being disrespected, provoked or "misunderstood."
 

danb

From VoodooShield
Verified
Developer
I have talked to some of these labs throughout the years and you guys are aware of the sneaky things that might or might not happen ;).

You guys might or might not be aware, but what some labs do is to offer public testing for free, which is the actual final test. But before the public test is performed, they offer a round or two of private testing that costs $40,000 or so, which is simply paying for the answers before the test. Same samples and everything.

This is not exactly unethical because one can argue that it helps the vendors improve their product... which is the entire goal anyway. I am not saying that I agree with this, I am just explaining the reasoning behind this method. Either way, it certainly does improve efficacy... I am just not sure by how much.
 

Vitali Ortzi

Level 20
Verified
I have talked to some of these labs throughout the years and you guys are aware of the sneaky things that might or might not happen ;).

You guys might or might not be aware, but what some labs do is to offer public testing for free, which is the actual final test. But before the public test is performed, they offer a round or two of private testing that costs $40,000 or so, which is simply paying for the answers before the test. Same samples and everything.

This is not exactly unethical because one can argue that it helps the vendors improve their product... which is the entire goal anyway. I am not saying that I agree with this, I am just explaining the reasoning behind this method. Either way, it certainly does improve efficacy... I am just not sure by how much.
Yeah, that's part of the cheating.
By the final test, every AV gets 99% signature detection among "zero-days",
which is absurd at best.
 

Burrito

Level 24
This is not exactly unethical...

I was told something very similar by an industry rep at a conference once -- but I didn't believe it.

So now... this is the second time I've heard it.

If true --- I consider it completely unethical and a mockery of testing.

I'm not a good 'reading the legalese fine print' guy -- but if this is occurring and not being disclosed -- those labs that engage in this are bogus.
 

Vitali Ortzi

Level 20
Verified
I was told something very similar by an industry rep at a conference once -- but I didn't believe it.

So now... this is the second time I've heard it.

If true --- I consider it completely unethical and a mockery of testing.

I'm not a good 'reading the legalese fine print' guy -- but if this is occurring and not being disclosed -- those labs that engage in this are bogus.
Almost every lab is.
If they didn't allow the cheating,
the testing results would probably be very bad,
possibly too bad to even be included in some tests.

And the truthful vendors would lose market share, which may result in compromises and in slowly losing a big part of R&D.
After a few years the company might go bankrupt, or some other company might acquire them (if they have good patents).

Or they might go back down the other road of deceptive marketing, after the management and direction changes that the low share price causes.
 

danb

From VoodooShield
Verified
Developer
I was told something very similar by an industry rep at a conference once -- but I didn't believe it.

So now... this is the second time I've heard it.

If true --- I consider it completely unethical and a mockery of testing.

I'm not a good 'reading the legalese fine print' guy -- but if this is occurring and not being disclosed -- those labs that engage in this are bogus.
Unfortunately this is true. I forgot to mention one thing... if a vendor only signs up for the free public test, the odds of the final free public test being overbooked appear to be significantly higher, and when the tests are overbooked the vendor who does not pay cannot participate in the final free public test. But in all fairness, if no one paid for the private tests, then the lab could not make any money to survive.

But I have to say, my actual experiences with the testing labs and PCMag have been overwhelmingly positive. I mention PCMag because some people like to dis on them and claim that they charge for their reviews or take kickbacks. I can assure you that they do not, at least not from my experiences. They take a deeper dive into products than pretty much anyone.

I think a lot of people rip on companies and other people who do not align with their agenda. Just like in politics, the far right and left wings are both wrong and both absolutely insane. They will stop at nothing to defeat their opponent, not realizing that it probably is not all that rewarding in the end if you had to lie, cheat and steal to do so. Science, fact, truth, integrity and honesty are all very rare these days. Especially on the internet ;).
 
Some companies have done the right thing already; they left the tests, like Dr.Web & Emsisoft.
These guys actually decided not to pay the ransom to these test labs and to spend the money on improving their product and on other things that protect their users better.

That does not mean that a company like Symantec/ESET/Kaspersky doesn't care about their users' protection. They are simply bigger players and money is not a problem for them, so they just keep the testing for extra marketing.
In fact, the only goal of these tests is marketing, not improvement; that is already in progress anyway. Every company has its own malware hunters, hackers and people like that. At least as far as I know, Dr.Web has a lot of Russian hackers inside, so Kaspersky and Symantec should have that as well. Maybe not Emsisoft, but they still have their own hunters.


Isn't it the only lab that actually believes in Webroot? Something must be wrong with that.
 

Spawn

Administrator
Verified
Staff member
Again? With due respect, I understand those semantics, OK? However, my point was and always will be: How could you claim and market your operating system as secure, with such a weak link (as per the comparatives) as Defender? That's simply it. Ancient history now, my bringing this up was anecdotal.

No one needs to sell me on Defender or defend it to me. It was simply within the context of placing a subjective value on these comparatives at any given time.

This should be a place to express your opinion either way, hopefully without being disrespected, provoked or "misunderstood."
Microsoft's marketing team exists to sell products and services; that is exactly what happened, and it was effective enough to make the majority switch.


You're right, except you perceived my response as a threat, and that tells us more about yourself and how you view my posts.
 

danb

From VoodooShield
Verified
Developer
Isn't it the only lab that actually believes in Webroot? Something must be wrong with that.
Based on other lab tests, I can certainly see your point.

This is pure speculation on my part, but my best guess (based on my experiences with PCMag) is that PCMag took the time to learn how Webroot works and to really understand what all goes on under the hood, so that they could formulate a valid testing procedure for Webroot. Again, pure speculation... then the testing labs, in an effort to level the playing field, used testing procedures that were valid for most products, but not for Webroot.

Or maybe Burrito is right about Webroot... I have no idea either way.

Either way, I am a HUGE proponent of testing, and of verifying with the vendor that the test was performed correctly for each product. The main requirement is that all products should be tested in their default configuration, unless otherwise specified by the vendor, and there had better be a VERY good reason that an alternate configuration is used in testing. For example, we ask labs to test VS on AutoPilot, simply because that mode is not deny-by-default and it is more consistent with their testing methodologies. It cracks me up when a product fails a test and the vendor claims "Our product failed because it needs to be tailored for each system". Malware does not discriminate. Most cybersecurity software can be hardened, but at some point it is not usable. This is why a dynamic security posture is vital... but that is a whole different story.

I am serious about thinking of recommendations for the testing labs... I am certain they are open to new ideas.
 

Burrito

Level 24
This is pure speculation on my part, but my best guess (based on my experiences with PCMag) is that PCMag took the time to learn how Webroot works and to really understand what all goes on under the hood, so that they can formulate a valid testing procedure for Webroot. Again, pure speculation...

And that is very kind and diplomatic speculation. Some testing labs actually bent over backwards to try to accommodate Webroot. Even though it's invalid, Webroot was part of the impetus to start the "Detected in 24 hours" category. Of course, in those 24 hours, your info has already been stolen and your bitcoin bank is emptied.

This is what PCMag says about the Labs as it relates to their reviews:

1590025675804.png


But of course... there is a problem. Nothing gets crushed like Webroot in all the labs. So it appears that PCMag has a disconnect somewhere.

I won't totally diss PCMag, as they did recognize the greatness of VooDoo. So they got that going for them.

1590025861136.png



Or maybe Burrito is right about Webroot... I have no idea either way.

Well uh.... yeah Dan. Of course Burrito is right. Who would even ponder otherwise... ;)



Either way, I am a HUGE proponent of testing, and of verifying with the vendor that the test was performed correctly for each product. The main requirement is that all products should be tested in their default configuration, unless otherwise specified by the vendor, and there had better be a VERY good reason that an alternate configuration is used in testing.

Yeah, this is it. You go Dan..



I am serious about thinking of recommendations for the testing labs... I am certain they are open to new ideas.

One would think. Especially from somebody like you... a divergent thinker with a fresh approach.

Do you think you should join AMTSO?

At the last AMTSO meeting, the first item on the agenda was:

1. Why does VooDoo Dan not like us? **


Thanks VooDoo Dan.

1590026753882.png



**Not Really
 

plat1098

Level 22
Verified
I apologize, Spawn, I was wrong.

Either way, I am a HUGE proponent of testing, and of verifying with the vendor that the test was performed correctly for each product.

This is key. Are there standards across the board? Is there a watchdog? A dirty lab makes every participant look shady when they may not be. With billions of dollars circulating, there should be regulation and more transparency. Actually, this was such a fun topic; the marketing angle was so obvious just a couple of years ago. Now, at least some, like MRG Effitas and CheckLab, have gotten very classy and elegant in their presentations. It's worth a look, for the education at least.
 

danb

From VoodooShield
Verified
Developer
And that is very kind and diplomatic speculation.
That was what I was aiming for ;).

Maybe one day we will join AMTSO, but not any time soon.

BTW, when I said "I am serious about thinking of recommendations for the testing labs... I am certain they are open to new ideas.", what I meant was MT users should think of ways to improve testing and submit their suggestions to the labs. I actually started thinking about it today when I was at the park. I have heard that there are very smart and capable people at most or all of the labs and that they know what they are doing. But who knows, someone might think of a simple suggestion that changes everything.

I think it would be cool for the labs to install a lightweight sensor on 20,000 or so random machines and all of the running items would be continually monitored by the labs. They could use something like WLC to filter out all of the obviously safe items (and installed security software), and maybe use VT to detect all of the other items. Then do some magical things from there. I think something like this is about the only way to achieve real world efficacy. Just a thought... probably not a very good one ;).

Thank you Burrito ;).

VS.png
 

Vitali Ortzi

Level 20
Verified
But I have to say, my actual experiences with the testing labs and PCMag have been overwhelmingly positive. I mention PCMag because some people like to dis on them and claim that they charge for their reviews or take kickbacks. I can assure you that they do not, at least not from my experiences. They take a deeper dive into products than pretty much anyone.
Old but gold video from cruelsister1 on the matter.
 

Andy Ful

Level 65
Verified
Trusted
Content Creator
The below is probably true:
  1. All AV vendors were/are/will be cheating in some way.
  2. There were/are/will be many tests impaired by money.
  3. Differences between many AVs in a single test are a simple illusion (due to big statistical errors).
  4. Most people misunderstand the test results and compare the tests which cannot be directly compared.
So, are the tests useless?

Yes - for most people.
No - for some people and AV vendors.

Generally, the tests are not useless as an average of many (comparable) tests. If one takes the average then the first two points are unimportant (the vendors cannot cheat and pay all the time). The third point is not valid, because the statistical errors get smaller if the number of tests increases.(y)
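The averaging argument in point 3 is easy to check numerically. A minimal sketch (all numbers are illustrative assumptions, not from any real lab): two products whose true efficacy differs by only half a percentage point are statistically indistinguishable in a single ~380-sample test, because the sampling error of one test is larger than the real gap; averaging 20 comparable tests shrinks that error by a factor of sqrt(20).

```python
import math
import random

random.seed(42)

def simulate_test(true_rate, samples=380):
    # One lab test: each of `samples` malware samples is detected
    # with probability `true_rate`; return the observed detection rate.
    hits = sum(1 for _ in range(samples) if random.random() < true_rate)
    return hits / samples

# Hypothetical products whose true efficacy differs by 0.5 percentage points.
av_a, av_b = 0.985, 0.980

one_a, one_b = simulate_test(av_a), simulate_test(av_b)

# Standard error of a single 380-sample test: sqrt(p*(1-p)/n) ~ 0.6
# percentage points -- larger than the 0.5-point real gap, so one test
# can easily rank the two products the wrong way round.
se_single = math.sqrt(av_a * (1 - av_a) / 380)

# Averaging k comparable tests shrinks the error by sqrt(k).
k = 20
avg_a = sum(simulate_test(av_a) for _ in range(k)) / k
avg_b = sum(simulate_test(av_b) for _ in range(k)) / k
se_avg = se_single / math.sqrt(k)

print(f"single test: A={one_a:.3f}  B={one_b:.3f}  (error ~{se_single:.3f})")
print(f"avg of {k}:  A={avg_a:.3f}  B={avg_b:.3f}  (error ~{se_avg:.3f})")
```

With the averaged numbers, the 0.5-point gap is now several standard errors wide, which is why rankings built from many comparable tests are meaningful even when any single test is noise.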
 

Vitali Ortzi

Level 20
Verified
The below is probably true:
  1. All AV vendors were/are/will be cheating in some way.
  2. There were/are/will be many tests impaired by money.
  3. Differences between many AVs in a single test are a simple illusion (due to big statistical errors).
  4. Most people misunderstand the test results and compare the tests which cannot be directly compared.
So, are the tests useless?

Yes - for most people.
No - for some people and AV vendors.

Generally, the tests are not useless as an average of many tests. If one takes the average then the first two points are unimportant (the vendors cannot cheat and pay all the time). The third point is not valid, because the statistical errors get smaller if the number of tests increases.(y)
I just can't stand AV companies using legal power to avoid being included in some tests,
or to force the tester to max out the AV and application control settings, under threat of not participating.
In the worst-case scenario, the AV vendor can choose the type of attack per test to show its strengths while using maxed-out settings, like in the Sophos cheating case.
Only a few good tests exist, but
unfortunately every vendor is masked by ABC letters.
 

show-Zi

Level 28
Verified
  1. All AV vendors were/are/will be cheating in some way.
  2. There were/are/will be many tests impaired by money.
  3. Differences between many AVs in a single test are a simple illusion (due to big statistical errors).
  4. Most people misunderstand the test results and compare the tests which cannot be directly compared.
This is not confined to the PC software world; other industries tend to do the same.
In the automobile industry, I have heard that the test-drive cars prepared by manufacturers are tuned very finely for test-drive events held for the mass media. There is also a well-known case of cheating on fuel-efficiency tests, which became a problem a few years ago.
I think many vendors realize that it is more convenient to use tests and reports as promotional material that boosts sales, rather than as a pure evaluation of performance.
 