show-Zi

Level 28
Verified
I think the lab tests are helpful to a degree, but only mildly informative.
It is not the detection rate that I focus on, but the rate of false positives. That's because work is often interrupted by vague warnings of "may be dangerous" rather than warnings of "danger."

In the world of automobiles, there are test-drive reports that make a big show of how fast the car can go and how quickly it accelerates. But on congested roads, it's still a struggle just to pass the cars in front of you.🥕🐎🏎
I think most of these performance-comparison tests exist mainly to feed the owner's self-satisfaction, and provide only a preliminary level of information before purchase or use.
 

Burrito

Level 24
Testing and tests are great.

They are the official report card of capabilities.

Sure, there are challenges and limitations.

I've communicated with Andreas (AV-Comparatives) offline, as well as with other lesser-known testers -- they are really bright, really smart people. And they do a great job of factoring in as many variables as possible for testing within certain constraints.

Certainly there are some limitations that I don't care for. Like the 'everybody gets a trophy' malware test sets, designed so that most AVs finish with detection rates close to 100%. But if certain AVs were shown at their real, sucky level --- they most likely wouldn't ever become a paying customer. And again... malware testing is not an easy business.

And ---- some AVs are tested, but we never know it... as they pay to have their results NOT posted. That's sorta lame.

And other stuff... It's not perfect by any means.

And certainly testing has less meaning to us here at MT. Since many here use multiple capabilities and mitigations and don't run at default settings --- the results of testing don't reflect much in terms of how the average MT system would respond.

All that stated --- testing is great. In a general sense, it reflects the baseline abilities and qualities of products.
 

plat1098

Level 22
Verified
Well, these studies have enormous, lucrative value to the industry and appeal primarily to mainstream users who probably don't know too much beyond what's on the surface. Perfect! :devilish:

But from a scientific standpoint, it's almost funny. The only way, Burrito, I'd pay attention to any findings is if an impartial data center collected telemetry from a sample of, say, one million users as a cross-section over, say, a six-month time frame. The samples would be drawn from a proportionate number of users of each brand. It would have to be a blind study, with instructions that subjects avoid purposefully going out and getting infected, tampering with the security product, or changing anything about their online behavior. Hmmm. :unsure: Not as easy as it sounds, by far.
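The sample-size intuition behind that proposal can be sketched with a quick confidence-interval calculation. A minimal sketch, assuming hypothetical telemetry: the `wilson_ci` helper and the per-brand infection counts below are illustrative numbers, not data from any real study.

```python
import math

def wilson_ci(infections, n, z=1.96):
    """95% Wilson score interval for an observed infection rate."""
    p = infections / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical per-brand telemetry: (infected machines, monitored machines)
telemetry = {"Brand A": (180, 100_000), "Brand B": (150, 100_000)}
for brand, (hits, n) in telemetry.items():
    lo, hi = wilson_ci(hits, n)
    print(f"{brand}: rate {hits/n:.4%}, 95% CI [{lo:.4%}, {hi:.4%}]")
```

With 100,000 machines per brand the intervals for these two hypothetical rates already overlap, so the observed gap could be noise; at ten times fewer users the intervals are far wider still. That's the point about needing a very large cross-section before small differences between products mean anything.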

See what I mean? I don't have to know how these labs operate behind the scenes to think that any results are at best, a relative, vague, and ballpark kind of thing. I don't do "vague" or "ballpark." I do, however, really like and respect that CheckLab.pl moved away from the glaringly obvious and shameful red/yellow/green color scheme of the bar graphs. Remember those, where 1/10 of a percent difference between Brands A and B meant a lot of extra red in the bar and had the power to change users' minds and wallets? This was supposed to be scientific back in the day. Yeah, right.
 

toto

Level 4
Verified
I completely disagree. I have a friend who worked for the intelligence community (granted, he is not the Average Joe) who, on his own personal computer, once clicked on something he thought innocuous that sliced straight through Kaspersky. It didn't get him, though, because he had an "extra special security layer" on his machine.
VoodooShield with WLC (which dramatically reduces user interaction) is easy for the Average Joe to learn and once familiarity is established, one can simply ignore browsing alerts and move on to the next site.
The additional protection VoodooShield will give to WD is comparable to the difference between an Apollo Space Rocket and a Pogo Stick.
Is your friend sure that Kaspersky wouldn't have detected the threat via its behaviour-blocking components, or stopped it from connecting to the internet after flagging it as suspicious? It's really not easy to tell a success story from a failure story in these cases, in my opinion. Obviously no one wants to test proactively how Kaspersky or any other product would react to a threat, and just because something wasn't detected by the product's signatures doesn't mean it couldn't have been rendered incapable of doing any harm. Just saying: the user who encounters a lot of threats online is the one who is going to allow suspicious files through a default-deny application, because those files will mostly be cracks and keygens -- and for the user who isn't of that sort, the threat is going to be hidden in a crack file either way.
 

toto

Level 4
Verified
Regardless of how meaningful those test results are, the testing labs themselves are useful. They force improvements in protection via competition. Without those tests, a user would be left with only marketing. The products with the best marketing would win; protection wouldn't count at all toward selling stuff. No one would be driven to improve protection much.
I guess you're right that the testing labs drive competition. But if it's true that these labs can sometimes be corrupted, and that companies that pay more get better results, then doing well in these tests would basically be a marketing purchase for a company -- and that would be sad.
 
Yes, you are speaking the truth here. Many times these testing sites even debunk themselves by releasing different tests that show different results. Also, let's not forget that for several years, websites like Top Ten Reviews were paid by Bitdefender to rank it #1 at all times, despite many times when you could argue it was not #1. For me, I do not think it is worth sacrificing 300+ MB of my RAM for protection that I will not need, because I do not torrent, download porn, or do things that would compromise my computer. An antivirus is not going to prevent DDoS attacks the way a VPN does. My strategy is to use a browser extension like Bitdefender Traffic Light, AdGuard, or a program like Web of Trust to browse the internet, and to keep a backup antimalware scanner like Malwarebytes, Emsisoft Emergency Kit, or HitmanPro to scan my files once a week or once a month to make sure I am fine.
 

Burrito

Level 24
Well, these studies have enormous, lucrative value to the industry and appeal primarily to mainstream users who probably don't know too much beyond what's on the surface. Perfect! :devilish:

But from a scientific standpoint, it's almost funny. The only way, Burrito, I'd pay attention to any findings is if an impartial data center collected telemetry from a sample of, say, one million users as a cross-section over, say, a six-month time frame. The samples would be drawn from a proportionate number of users of each brand. It would have to be a blind study, with instructions that subjects avoid purposefully going out and getting infected, tampering with the security product, or changing anything about their online behavior. Hmmm. :unsure: Not as easy as it sounds, by far.

See what I mean? I don't have to know how these labs operate behind the scenes to think that any results are at best, a relative, vague, and ballpark kind of thing. I don't do "vague" or "ballpark." I do, however, really like and respect that CheckLab.pl moved away from the glaringly obvious and shameful red/yellow/green color scheme of the bar graphs. Remember those, where 1/10 of a percent difference between Brands A and B meant a lot of extra red in the bar and had the power to change users' minds and wallets? This was supposed to be scientific back in the day. Yeah, right.

Yeah plat1098... I'm with ya. If we could get a test with those parameters, that would be like... uh... great. Just once, way back in the day, Microsoft had some rough data on computer infections and the resident AV at the time of infection. As you might imagine, many parties were not pleased -- and that data disappeared before most knew it was there. But that type of data collection has limitations too. It's kinda sorta similar to the Malwarebytes live worldwide malware-infection map --- if that thing is still alive.

But here's the thing about the testing labs and even our own Malware Hub. They tend to show similar results and trends over time. I think actual meaning can be derived from the cumulative results. That's not as good as your million endpoints over half-a-year for sure... but hey, ya work with what ya got sometimes. 🔨

I hope CheckLab.pl doesn't go the AV-Comparatives route of 'everybody gets a trophy.' It could have already started.


In the most recent test, there is only a 'speck of red.'

Compared with one of the earlier financial tests -- which was interesting.



In testing, ya gotta let losers be losers.... instead of trying to have a 'tight shot group.'

That's a reason that I like some of SE Labs testing.

They let losers be losers.

 

blackice

Level 28
Verified
Yes, it can be annoying for a day or two, but once the whitelist is built it is extremely secure and quiet. The annoying part can be avoided by not changing your security config every other day ;).
I do like your approach and I think your software is one of the best approaches to this. It’s very user friendly, as user friendly as default deny can be I suppose.
 

roger_m

Level 31
Verified
Content Creator
I don't think that tests are completely useless. But they definitely are flawed and the results can be quite misleading.

If you look at test results, you will typically see very little difference from one product to the next, which suggests that you should be well protected no matter which antivirus you use. This can be misleading. As an example, Panda usually scores very well in tests, which should mean that it's an excellent antivirus. However, more often than not, it is slow to add signatures for new malware. Combined with its poor behaviour blocking, it doesn't do well at protecting against new malware. If it were tested against newer malware, it wouldn't score nearly as well. I believe you would start to see a meaningful difference in results between products if more recent malware were tested. But maybe the tests reflect the type of malware that an average user is likely to encounter, in which case the age of the samples tested isn't much of an issue. I'd be interested to hear thoughts on this.

When it comes to performance tests, quite often my real-world experience with antiviruses is quite different from the test results. Sometimes antiviruses that I find to be quite heavy score better than antiviruses I find to be light. Of course, antivirus performance can vary greatly from one computer to the next. Quite possibly the best example of this is that some people with low-end computers are happy with Windows Defender's performance, while some with high-end computers find it to be too heavy. I'm sure that average users who see performance test results are going to expect antiviruses to perform as well for them as they did for the testers.

Having said all of that, I think there is some value in tests. Even if tests usually show very little difference between antiviruses, they do indicate that the products are not performing terribly. If an antivirus is never tested, that raises questions about the protection it provides. In my case, I think tests are fine if you don't take the results too seriously.
I'm on a computer 14 hours a day and honestly I have never encountered malware without actively going out and looking for trouble.
I spend nearly as much time online and also never encounter malware, despite making no effort to visit only well-known and trusted sites. As a result, I'm happy to use only an antivirus to protect me, as I don't feel I need anything else. While I don't use WD, in terms of protection I'd be happy to use it or any antivirus that does not perform terribly in testing. I'm not too concerned about having "the best" antivirus, as I don't come across malware or get infected, so I'm more interested in an antivirus that has the features I want and is light.
 

Back3

Level 6
I think testing labs are useless for most people who use a computing device: most people, maybe 90%, don't give a damn about testing labs.

Last week, I had three calls from friends. The first told me that he had lost everything on his laptop screen and did not know what to do. It took me a few minutes to understand that he had turned on tablet mode, and 30 seconds to get his regular screen back. I also helped him get rid of useless shortcuts and set up Mail on his computer. He didn't know he had an antivirus on his computer.
My second friend got a bill from Avast, but he does not use Avast on his laptop; I knew that he had used Avast on a previous computer. I sent him links to unsubscribe from Avast. Another friend had installed Avast three years ago. He does know what an antivirus is, but he doesn't know which antivirus is running on his computer.
The third friend called me because she had dropped her iPhone in water. She sent a lot of messages asking people what to do, and most told her to put the phone in rice for 48 hours. Finally she called me on a landline, crying. I knew she had an iPhone 8, and those phones are good for 30 minutes in water. I told her: clean your phone and don't eat the rice. I'm sure your phone is running OK. It was!
These people have good jobs and are well regarded in their communities, but they don't know about testing labs and don't care.

Testing lab results are interesting for people like me. I know that sometimes it's good or bad marketing. But I learned a lot about security just by discussing the usefulness of testing labs. Just for that, they're useful.
 

Local Host

Level 23
Verified
I believe the problem with the majority of you is that you focus on the results instead of following the whole process that led to the results (and the majority of those test labs share that process).

Which, in the end, makes for accurate results in comparison to real-world testing.

Home users are not going to run into zero-day malware or targeted attacks, and testing AVs against average malware is as accurate as it can get for home users.

Most AVs nowadays rely on Avira, Bitdefender and Kaspersky for signatures, which is why the test results are similar across the board (above 90%); the differences will be in their prediction algorithms, disinfection, and overall performance (and, unfortunately, bloatware as well).
 
Last edited:

jackuars

Level 26
Verified
It actually doesn't hurt to look at those PDF reports, understand what the testing environment was, understand the data, and make an informed decision.

The problem starts when people don't understand what was done, why it was done, and whom it was for, and end up 'claiming' the tests are rigged.

When data is turned into information, it never has any useless details.

Let me quote:
“Jasnah had once defined a fool as a person who ignored information because it disagreed with desired results.”
 
Last edited:

struppigel

Moderator
Verified
Staff member
I think testing labs are useless for most people who use a computing device: most people, maybe 90%, don't give a damn about testing labs.

Last week, I had three calls from friends. The first told me that he had lost everything on his laptop screen and did not know what to do. It took me a few minutes to understand that he had turned on tablet mode, and 30 seconds to get his regular screen back. I also helped him get rid of useless shortcuts and set up Mail on his computer. He didn't know he had an antivirus on his computer.
My second friend got a bill from Avast, but he does not use Avast on his laptop; I knew that he had used Avast on a previous computer. I sent him links to unsubscribe from Avast. Another friend had installed Avast three years ago. He does know what an antivirus is, but he doesn't know which antivirus is running on his computer.
The third friend called me because she had dropped her iPhone in water. She sent a lot of messages asking people what to do, and most told her to put the phone in rice for 48 hours. Finally she called me on a landline, crying. I knew she had an iPhone 8, and those phones are good for 30 minutes in water. I told her: clean your phone and don't eat the rice. I'm sure your phone is running OK. It was!
These people have good jobs and are well regarded in their communities, but they don't know about testing labs and don't care.

Testing lab results are interesting for people like me. I know that sometimes it's good or bad marketing. But I learned a lot about security just by discussing the usefulness of testing labs. Just for that, they're useful.

Those people might not care about the results directly, but they are still influenced by the people who do.
A lot of those testing lab results appear in well-known security and computer magazines. The magazines say it's their own test, but essentially they pay AV-TEST and similar labs to do the testing for them.

Now, the people who read those magazines will not only decide based on the magazine's results; they will also influence the people around them who are not tech-savvy and who rely on them.

We see a very significant correlation between good test lab results and more purchased licenses.
 

plat1098

Level 22
Verified
Listen, the one and only time I found a comparatives study quite useful was when Windows 10 first came out. Microsoft was making these hilarious claims that this was the most secure Windows yet, but Defender's malware detections were at the bottom of the canyon. Within a year, MS did start making considerable progress, and now seems to be competitive most of the time (if you believe what you read).

Not trying to be snotty, but what "useful" information could be gleaned here by someone outside the user mainstream? What antivirus has the "best" protection against browser-based financial malware? What to buy? Please! But for mainstream users, that's the whole point! With a vast user market fueled by news of ransomware popping up every other day, this is a gigantic money machine. Yin and yang. But these "labs" needed to clean up their act and be transparent about how they portray their results. Who is overseeing them and the methodologies some of you find interesting? If one tried to argue this point, one would be talking to vapor. That's how... dubious... things were. And most people in the mainstream would find reading the methods quite boring. Hence the small print, sometimes.

If you find comparatives useful, fine. If not, fine. Should there be a poll? "Do you allow comparatives to influence your purchase/use of a given product? Yes / No"
 

Spawn

Administrator
Verified
Staff member
Microsoft was making these hilarious claims that this was the most secure Windows yet, but Defender's malware detections were in the bottom of the canyons.
Again... Secure and Security are not the same.

Windows 10 is the most secure Windows OS to date.

For the sake of 'security', the RTM version of Windows 10 is no longer the most secure; each newer version takes the crown upon general release. Next year Windows 10X might take that crown; we just have to wait and find out.
 

Vitali Ortzi

Level 20
Verified
Sophos did really well in that test Vitali, read below to see why, taken from page 4 of test:

View attachment 240402

Another reason to put little faith in these tests, other than for entertainment purposes.
Yeah, they cheated the test.
Symantec tried cheating as well, but Sophos was ahead of them -- they are the sponsor. XD
That's why I'm more interested in exploit tests, and in doing my own as well.

Didn't get a response yet from Effitas about the samples I requested.

BTW, does anyone know of a good test lab that doesn't cheat and doesn't anonymize products behind letters like A, B, C?
 
Last edited: