
illumination

If my extensive testing is any indication, you can max out Webroot's protections and it is still going to perform poorly. I'd bet it would rank about the same in the tests if you maxed out each product's protections and re-tested them. Of course someone will scream "Foul, foul!" - that's speculation. Nope. It's not. I am making an informed judgment based upon my own set of experiences - whether one considers those subjective or not - with most of the products with fully maxed-out protections.
If we placed true zero-days in there, all of them would sink, but this is beside the point. I have already made clear my intent in posting: regardless of said outcomes, the products are still not tested as designed, which, in my experience, means the testing is inaccurate.
It is my opinion, although I am not a professional.
 

Andy Ful

Level 52
Verified
Trusted
Content Creator
Any test that can show the big differences between competitors is interesting. One should not take seriously scoring differences of a few percent or less, because of statistical error. That is why SE Labs finally put all competitors into 2 categories. Furthermore, only Kaspersky, Eset, Norton, and Trend Micro were placed in the first category over a one-year period.
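To make the statistical-error point concrete, here is a minimal sketch of the confidence interval around a measured detection rate. The sample counts below are made up for illustration; the labs' actual per-test sample sizes are not given in this thread:

```python
import math

def detection_ci(detected: int, total: int, z: float = 1.96):
    """Approximate 95% confidence interval for a detection rate,
    using the normal approximation to the binomial distribution."""
    p = detected / total
    se = math.sqrt(p * (1 - p) / total)  # standard error of the proportion
    return (max(0.0, p - z * se), min(1.0, p + z * se))

# With ~100 samples, a product scoring 96% and one scoring 98% have
# overlapping intervals, so the 2% gap is within statistical noise.
print(detection_ci(96, 100))   # ≈ (0.922, 0.998)
print(detection_ci(98, 100))   # ≈ (0.953, 1.0)
```

A gap like 24% vs 98%, on the other hand, lies far outside any such interval, which is why only the big differences between competitors are meaningful.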
The only surprise for many people would be the Windows Defender score in the last test. There are reasons for that; the 'Block at First Sight' feature was extended:
"In Windows 10, version 1803, the Block at First Sight feature can now block non-portable executable files (such as JS, VBS, or macros) as well as executable files."
The lack of proper script protection in many AVs has been discussed many times on this forum. Using Avast Free + SysHardener shows how important it can be (see the Malware Hub tests on the Malwaretips forum).
Anyway, one good Windows Defender score in an SE Labs test proves nothing. We have to wait for some other tests.
 
Last edited:

509322

Any test that can show the big differences between competitors is interesting. One should not take seriously scoring differences of a few percent or less, because of statistical error. That is why SE Labs finally put all competitors into 2 categories. Furthermore, only Kaspersky, Eset, Norton, and Trend Micro were placed in the first category over a one-year period.
The only surprise for many people would be the Windows Defender score in the last test. There are reasons for that; the 'Block at First Sight' feature was extended:
"In Windows 10, version 1803, the Block at First Sight feature can now block non-portable executable files (such as JS, VBS, or macros) as well as executable files."
The lack of proper script protection in many AVs has been discussed many times on this forum. Using Avast Free + SysHardener shows how important it can be (see the Malware Hub tests on the Malwaretips forum).
Anyway, one good Windows Defender score in an SE Labs test proves nothing. We have to wait for some other tests.
I can design another test where, with absolute certainty, Webroot will place first and Windows Defender will place dead last - every time - no matter what samples are used... whether they are true "zero day" malware or not.

Test results are a function of what and how things are measured - just as @illumination said.

And it should come as no shocker that Microsoft would just love to put every single 3rd-party security software out of business or otherwise kill it - both paid and free.
 

Andy Ful

Level 52
Verified
Trusted
Content Creator
I can design another test where, with absolute certainty, Webroot will place first and Windows Defender will place dead last - every time - no matter what samples are used... whether they are true "zero day" malware or not.
...
I know that you could. ;)
But not with random samples. If the samples are random, then over many tests you get more reliable results. That is why any single test proves little or nothing. There is an objective issue with Webroot: poor anti-script protection (Script Shield is in beta tests) and anti-exploit protection (getting ready for beta tests).
Of course, no one can be sure how the samples are prepared by the testing lab. The more commercial it is, the bigger the chance of the samples being adjusted. (y)
Furthermore, I do not think that there is a single test around that could measure the real protection against targeted attacks. It is much better to look at AV exploit prices on the darknet to gauge that.

Edit.
That was only a comment on the post, not directly aimed at @Lockdown; he already knows the above. :giggle:
 
Last edited:

cruelsister

Level 36
Verified
Trusted
Content Creator
God Help Me, but I never thought such a test could hit new lows...

SE Labs gets the samples that they use in these tests from AMTSO, specifically from their Real Time Threat List. If anyone wasted their time and tried to delve into the exact methodology used by SE Labs, one would notice that at no point are the malware samples broken down by type or age. Although we will always be clueless as to the type of malware used, the fact remains that the average detection rate (throwing out the High and the Low) is over 90%. This makes clear that even if the malware may have been fresh when initially acquired, by the time the tests were conducted this was no longer the case (see my last video).
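The "throwing out the High and the Low" figure is just a trimmed mean. A minimal sketch, with illustrative rates rather than the report's actual numbers:

```python
def trimmed_mean(rates):
    """Average after discarding the single highest and single lowest value."""
    if len(rates) < 3:
        raise ValueError("need at least 3 values to trim both extremes")
    trimmed = sorted(rates)[1:-1]
    return sum(trimmed) / len(trimmed)

# Illustrative protection rates (percent) - not the lab's actual figures:
rates = [24, 92, 94, 95, 96, 97, 98, 98, 99, 100]
print(trimmed_mean(rates))  # 96.125 - the 24% outlier no longer drags it down
```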

But that's not what I came here to write about- the actual issue is the poor detection rate of Webroot in contrast to the others tested. But before I begin my rant, please understand 2 things:

1). I am NOT saying that this test was set up to be a whack job on Webroot, and
2). I am not now, nor ever was, especially partial to Webroot (but sometimes one must call a Spade a Spade).

That being said, I noticed that Webroot had a Protection rate of 24%, whereas McAfee had a Protection rate of 98%. As a thought experiment I tried to think of any way I could recreate a test that would parallel these results (and please note that I've done a few tests in the past, so have a bit of experience), and after some thought I came up with only 2 possibilities:

1). I was to use hundreds of clones of the EXACT SAME MALWARE file that McAfee detected and Webroot did not; or
2). I was to just pull the results out of my Ass and not waste time actually doing the test.

As either one of these two possibilities would negate any useful information spewing out of this test, the conclusion I reached at this point was that instead of SE Labs testing being the WORD OF GOD, it was actually a flawed test done by a Bunch of Chumps who do not know their Ass from their Elbow.

The biggest rank amateur would be ashamed to put out such drivel.

(ps- I hope I didn't beat around the bush)
 

Andy Ful

Level 52
Verified
Trusted
Content Creator
...after some thought I came up with only 2 possibilities:

1). I was to use hundreds of clones of the EXACT SAME MALWARE file that McAfee detected and Webroot did not; or
2). I was to just pull the results out of my Ass and not waste time actually doing the test.
...
That can be true in some way. My guess is that SE Labs used many scripts and exploits, which are actually the weak point of Webroot. Webroot admits this and is working on improvements. It is probable that Webroot is a victim of the lab's testing methodology.
 

509322

That can be true in some way. My guess is that SE Labs used many scripts and exploits, which are actually the weak point of Webroot. Webroot admits this and is working on improvements. It is probable that Webroot is a victim of the lab's testing methodology.
Webroot has been performing poorly in tests from other testing organizations, so its low score here is nothing new. The whole point of the product is rollback. But the premise of rollback is that Webroot lets the system get infected in the first place. Some labs automatically deduct points for that.
 

509322

God Help Me, but I never thought such a test could hit new lows...

SE Labs gets the samples that they use in these tests from AMTSO, specifically from their Real Time Threat List. If anyone wasted their time and tried to delve into the exact methodology used by SE Labs, one would notice that at no point are the malware samples broken down by type or age. Although we will always be clueless as to the type of malware used, the fact remains that the average detection rate (throwing out the High and the Low) is over 90%. This makes clear that even if the malware may have been fresh when initially acquired, by the time the tests were conducted this was no longer the case (see my last video).

But that's not what I came here to write about- the actual issue is the poor detection rate of Webroot in contrast to the others tested. But before I begin my rant, please understand 2 things:

1). I am NOT saying that this test was set up to be a whack job on Webroot, and
2). I am not now, nor ever was, especially partial to Webroot (but sometimes one must call a Spade a Spade).

That being said, I noticed that Webroot had a Protection rate of 24%, whereas McAfee had a Protection rate of 98%. As a thought experiment I tried to think of any way I could recreate a test that would parallel these results (and please note that I've done a few tests in the past, so have a bit of experience), and after some thought I came up with only 2 possibilities:

1). I was to use hundreds of clones of the EXACT SAME MALWARE file that McAfee detected and Webroot did not; or
2). I was to just pull the results out of my Ass and not waste time actually doing the test.

As either one of these two possibilities would negate any useful information spewing out of this test, the conclusion I reached at this point was that instead of SE Labs testing being the WORD OF GOD, it was actually a flawed test done by a Bunch of Chumps who do not know their Ass from their Elbow.

The biggest rank amateur would be ashamed to put out such drivel.

(ps- I hope I didn't beat around the bush)
It's pretty easy to smash Webroot.
 

Nightwalker

Level 18
Verified
Trusted
Content Creator
The only thing that Webroot is actually good at is making excuses.

In 2010, when Webroot acquired Prevx, it was a good, novel protection concept and the client itself was very light by all metrics; nowadays other vendors have matched and surpassed its cloud protection technology, and most antiviruses are light enough.

Webroot is lacking in many areas: its rollback technology doesn't work like it should, it doesn't have a proper emulator, it doesn't do much against polymorphic malware, and it can be very heavy on resources because of some bugs.

In the past I was a Prevx/Webroot enthusiast, but it has disappointed me for long years.
 

Azure

Level 25
Verified
Content Creator
The only thing that Webroot is actually good at is making excuses.

In 2010, when Webroot acquired Prevx, it was a good, novel protection concept and the client itself was very light by all metrics; nowadays other vendors have matched and surpassed its cloud protection technology, and most antiviruses are light enough.

Webroot is lacking in many areas: its rollback technology doesn't work like it should, it doesn't have a proper emulator, it doesn't do much against polymorphic malware, and it can be very heavy on resources because of some bugs.

In the past I was a Prevx/Webroot enthusiast, but it has disappointed me for long years.
Also according to @Umbra it's bad for SSDs.
SECURE: Complete - Umbra Lockdown Security 2018 (3 systems)
 

Burrito

Level 22
Verified


This is like a bad joke. Webroot has "the most powerful" propaganda. And that's about it.




Oh, Yeaahh indeed.

Go to the Webroot forums and look at what they say about all their bad results with all the testing labs... much of it is the same recycled BS for over 8 years...

For our friends who may not know the "Drink the Kool-Aid" reference:

Drinking the Kool-Aid - Wikipedia
 
Last edited:

Andy Ful

Level 52
Verified
Trusted
Content Creator
Webroot has been performing poorly in tests from other testing organizations. Webroot's low performance here is nothing new. The whole point of the product is rollback. But the basis of rollback is that Webroot lets the system get infected in the first place. Some labs automatically deduct points for that.
Good point.(y)
 

cruelsister

Level 36
Verified
Trusted
Content Creator
It's pretty easy to smash Webroot
But not to this extent, especially with respect to the other products tested. Someone mentioned the use of scripts as the culprit, but this cannot be the case, as a number of the other products tested are, to my knowledge, virtually oblivious to this class, yet they scored in the 90s. Something is wrong here (and please, please note that I am as far from a Webroot apologist as you can get!). I don't mean to harp on this, but I am honestly confused.

Also, I suppose that any discussion of this particular test is an exercise in futility, as SE Labs only speaks to their methodology in the vaguest way possible: they do the test over 3 months (a yearly quarter), they get their malware from AMTSO (which any subscriber can also get), and they run it against various products. Do they collect the malware for 30 days, run it all at once monthly, and do this 3 times for the 3-month span? Do they collect the malware for 3 months and then run the test? We just don't know, because they don't tell us.

The one thing that we can be certain of is that this test is either not done daily and/or the malware they use is not D+1 or newer (the actual things a user will come across, since this stuff is what is actively being pushed out). Fresh malware (a really Real World scenario) would never yield such superlative results for the vast majority of products tested.

For me, a True Real World test would be:

1). We got these 10 samples from a honeypot, all undetectable 6 hours ago.
2). We made sure that they were malicious and all are different
3). We ran them against all of the products tested SIMULTANEOUSLY within the D+1 timeframe.
4). These are the results...
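The four steps above can be sketched as code. This is only an illustration of the proposed protocol; `scan(product, sample)` is a hypothetical callback standing in for whatever detection harness a tester would actually run:

```python
from datetime import datetime, timedelta

def run_real_world_test(samples, products, acquired_at, scan):
    """Run every product against the same fresh sample set, and refuse
    to proceed once the D+1 (24-hour) window has closed."""
    deadline = acquired_at + timedelta(hours=24)
    if datetime.utcnow() > deadline:
        raise RuntimeError("samples are stale; collect fresh ones")
    # Protection rate per product, all products tested against identical samples
    return {
        product: 100.0 * sum(1 for s in samples if scan(product, s)) / len(samples)
        for product in products
    }

# Usage with a dummy scanner that only "Product A" passes:
results = run_real_world_test(
    samples=["s1", "s2", "s3"],
    products=["Product A", "Product B"],
    acquired_at=datetime.utcnow(),
    scan=lambda product, sample: product == "Product A",
)
print(results)  # {'Product A': 100.0, 'Product B': 0.0}
```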

But it seems that the Pro testing sites would rather use older malware so that the overall results for the bulk of the products tested are over 90%. This may make the users of these products feel good, but they are also being put at risk by such shoddy methodology.

Please just remember one very important thing: the malware being actively pushed out by the Blackhats is NOT OLD STUFF, yet this seems to be the malware used by the Pro Sites.
 

Andy Ful

Level 52
Verified
Trusted
Content Creator
Yes, even allowing for scripts, exploits, and the rollback feature, the results still look strange. I read their test methodology and it is unclear to me. It is possible that they used many samples of very similar scripts/exploits. I have seen this with Malware Hub samples: many malicious documents in fact use macros based on one and the same PowerShell download/drop method.
 

cruelsister

Level 36
Verified
Trusted
Content Creator
Remember, PowerShell is not malicious in itself - it acts as a trigger for the true payload, and it's not as if WR will allow any payload downloaded/installed by PS without subsequent checks. The use of scripts/PS malware would not explain the results.

Yeah, I could construct a test that would trash WR without much of a problem - the issue would be using the same malware files and getting the superlative results seen in some of the others. It's not that WR sucked in this test - it's that it sucked that much compared to the others. And the pathetic lack of info on the exact methodology (as you point out) really should make one question the legitimacy of this test, and of SE Labs itself.
 

Deleted Member 3a5v73x

It's not that WR sucked in this test - it's that it sucked that much compared to the others. And the pathetic lack of info on the exact methodology (as you point out) really should make one question the legitimacy of this test, and of SE Labs itself.
I remember in the past when Webroot was put to another AV test and got smashed; Webroot later came up with the claim that the AV testing lab had installed Webroot incorrectly, or that Webroot couldn't properly reach its cloud servers for all protection mechanisms to work. There have also been times when Webroot was included in an AV test report and scored badly, and was later deleted from that report.
 
Last edited by a moderator: