App Review: ESET Internet Security 2019 Review

It is advised to take all reviews with a grain of salt. In extreme cases some reviews use dramatization for entertainment purposes.

Nightwalker

Level 24
Verified
Honorary Member
Top Poster
Content Creator
Well-known
May 26, 2014
1,339
Quoting Marcos from ESET:

Below find my personal comments, which may not represent an official response from the company on this test.
1. It's not a real-world test, and it appears that some protection layers were bypassed (e.g. web protection with more aggressive detection and URL blocking), i.e. the results might not reflect how ESET would protect users in real life. There is also the question of whether the missed samples were actual or synthetic threats. Since we didn't get the missed samples for verification, we don't know how prevalent they are in the wild.
2. A false-positive test was not part of the test. It's easy to detect 100% of malware if clean files are detected as well.
3. The author works for Emsisoft. Despite the claims of independence, it's hard to believe that this did not affect the test in any way. It's also interesting that Bitdefender got the best results, and Emsisoft uses its engine as well.
Employees of AV companies should not perform tests that they proclaim to be independent and unbiased. Only prestigious and respected AV testing organizations, where independence is ensured, should do that. It would not be too difficult to make a test in which an AV scoring 100% in other tests would get 0% if the "right" samples were picked for the test set.
4. "If a sample successfully makes it to memory and begins execution, it is considered a miss." This is a flawed methodology: a file has to be unpacked in memory before it is executed, and Advanced Memory Scanner triggers a scan only after a file has been executed and unpacked in memory.

I strongly recommend taking tests from YouTube, or those performed by other non-professional testers, with a pinch of salt. One must consider and understand all aspects of how a test was performed in order to take the results seriously.

Very poor test result
 

MikeV

Level 19
Verified
Top Poster
Well-known
Sep 9, 2013
925
I would be very curious to see whether other AVs would detect this ransomware sample. The fact that Eset failed this test doesn't mean that Eset sucks.
Eset failed this particular test, but it is still one of the best on the market today.
I've seen other vendors fail in other tests as well, for example:
RoxasDev
 

Cortex

Level 26
Verified
Top Poster
Well-known
Aug 4, 2016
1,465
ESET does protect against ransomware by default, and it is actually very good at doing so.

When was the last time that you used ESET? Because the user can't tweak the ransomware module (Ransomware Shield) ...

Erm, right now, on this PC. If you read my post again, I wasn't really being specific; I was talking about tweaking ESET in general. This thread was originally about a test done by Leo, where ESET failed against a ransomware sample. The thread has since wandered to the need to tweak ESET to really get the best out of it (for it to function well).

My point is (though it got lost): I install ESET (among other AVs) on a fair few PCs, often for ladies I work with, and they expect an AV to just work; they aren't going to learn to tweak it, and I don't have the time. I tweak everything on this PC, but most AV users of any kind don't come to forums like this and never will. If ESET doesn't function well on defaults, it really ought to. It failed on ransomware in Leo's test, unless my contact lenses are in the wrong eyes. I actually like ESET and use it, but IMHO the best out of the box is Norton.

Any AV failing on ransomware (yes, I know lots do, but this was about ESET) is a massive no-no, as if you don't have multiple backups all is lost (I do, BTW).

What's with the screenshot of Kaspersky?
 

ForgottenSeer 72227

PS: I am not shilling for ESET; I just don't like to see misinformation and subjective opinions posing as facts.

This is my stance as well. Some may see me as an Eset fanboy, but I am not; I couldn't care less, actually. I would do the same for any product. If someone did a YouTube video showing that Kaspersky failed (for example) and people started saying OMG it's a piece of crap, I would say what I have been saying for Eset, for Kaspersky, or any other reputable product for that matter.

I've read the thread on the Eset forum regarding the test, and I suggest people read what Marcos has been saying, as I agree with him. People may see him as making excuses for Eset, but he's not; he has many valid points, and these all need to be considered when someone tests any product. In the case of Leo's test, Eset failed. Would I have liked to see a better result? Absolutely, but hopefully Eset will take this and try to make some improvements to their product. I don't think Marcos is defending the fact that Eset failed, but rather pointing out that Eset didn't get a chance to validate the test results. Leo didn't submit any of the missed samples to Eset; there's no law saying he has to, but IMHO it's good practice to do so, especially when all we have is his word regarding the samples. I am NOT saying that Leo rigged the test, but he should have given Eset a chance to look at it.

@RoboMan brought up some good points earlier in the thread about testing in general as well. Malware nowadays is so complicated that testing isn't as simple as it's made out to be; it requires very good insight and knowledge to see what is really happening. Saying that on-demand scanners didn't pick up anything on the system is not a good reason to declare it 100% clean. They are useful tools, but how do we know it's not just a false negative and the on-demand scanners simply missed the sample? I've also seen many tests (including Leo's) where he runs a sample, nothing visible happens, and the automatic assumption is that the sample doesn't work. How do we really know nothing happened to the system?

So many people proclaim themselves experts when all they have done is disable protection to test signatures (not a real-world scenario, and not giving the product its full capabilities, i.e. the web scanner), then run the missed samples with protection enabled and run on-demand scanners. This method can work, but again it's not a true representation of how people use their computers or how a product truly performs. All this is is bench racing: testing only specific components of the product, not its full capabilities. To be very clear, I would say this in defence of any product tested in this manner, not just Eset.

My point is that every test should be taken with a grain of salt. While it's fun to test, it isn't as simple as it's made out to be, as malware is very complicated nowadays. IMHO anyone doing a test should at least submit the missed samples to the vendor, to not only give them the chance to protect their users but also give them a chance to validate the samples. Again, I am not here to defend Eset, but rather to push back on the mentality that after one test (where we really don't know all the ins and outs, and neither does the vendor) we should just start labelling the product, or any product, as a piece of crap.
 

ForgottenSeer 72227

Any AV failing on ransomware (yes, I know lots do, but this was about ESET) is a massive no-no, as if you don't have multiple backups all is lost (I do, BTW).

To be fair though, this is a very unrealistic expectation. You cannot blame a product for someone not having proper backups. This is the problem with security software IMO: people want the software to do all the thinking for them and ignore basic habits. Having proper backups is essential, as anything could happen (e.g. ransomware, a dying HDD). If your hard drive dies tomorrow and you don't have proper backups, what happens? "It's not my fault for not having backups; my hard drive should have lived forever; WD, Seagate, or whatever brand you have is a POS because it died and I lost everything." No one likes to take responsibility for their own mistakes, even though concepts like keeping proper backups have been repeated over and over, like 10,000,000 times before.

Sorry @Music4Ever, I am not picking on you, just the general mentality that people have when it comes to these situations :cool:
 

RoboMan

Level 35
Verified
Top Poster
Content Creator
Well-known
Jun 24, 2016
2,400
This is my stance as well. Some may see me as an Eset fanboy, but I am not; I couldn't care less, actually. I would do the same for any product. If someone did a YouTube video showing that Kaspersky failed (for example) and people started saying OMG it's a piece of crap, I would say what I have been saying for Eset, for Kaspersky, or any other reputable product for that matter.

(...)
Amazing text; it pretty much sums up the general opinion. We've been growing fans of internet tests lately. I tried to do some tests for YouTube but dropped it a week ago for a while, since I'm trying to internalize a bit more of what I'm doing. After @askalan told me a big truth about malware being aware it's running in a VM, I realised that testing the way I was testing was useless, so I'm practicing a bit more on isolation and VM obfuscation before trying to test again.
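
To make the VM-awareness point concrete: many commodity samples probe the guest for hypervisor artifacts and simply stay dormant if they find any, so an unhardened VM can silently produce "nothing happened" results. Here is a rough sketch of that kind of check; the paths are well-known VirtualBox/VMware guest files, and real samples check far more (registry keys, MAC prefixes, CPUID results).

Code:
import os

# Well-known guest-additions files that betray a VirtualBox or VMware guest.
# Real malware uses many more tricks than this simple file probe.
VM_ARTIFACTS = [
    r"C:\Windows\System32\drivers\VBoxMouse.sys",   # VirtualBox
    r"C:\Windows\System32\drivers\VBoxGuest.sys",   # VirtualBox
    r"C:\Windows\System32\drivers\vmhgfs.sys",      # VMware shared folders
    r"C:\Windows\System32\drivers\vmmouse.sys",     # VMware
]

def looks_like_vm() -> bool:
    """Return True if any common hypervisor guest file is present."""
    return any(os.path.exists(path) for path in VM_ARTIFACTS)

if looks_like_vm():
    print("VM detected: a VM-aware sample would stay dormant, skewing the test")
else:
    print("no obvious VM artifacts: the sample would detonate normally")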
 

beavisviruses

Level 3
Verified
Oct 8, 2018
127
I agree, it's a very capable product; it needs a little tweaking, but if you look at all its features as a whole, it pretty much covers everything.
I do not agree with that: the test clearly shows how the product does not protect.
In this test ESET is very poor and useless.
 

ForgottenSeer 58943

I don't have time to read this entire thread, but ESET does have a point that this test doesn't factor in other important aspects of a real-world scenario, which is something I stress often.

However, it does test raw effectiveness to some extent, and ESET really didn't do well. As has been discussed around these parts (and elsewhere), ESET hasn't been faring all that well on many tests lately and has largely slipped to second-tier status. Does anyone really disagree with that?

I see in the comments for his video that a few people are asking him to test Gdata with its new DeepRay technology. I'd be curious to see that test without a doubt, since GD pretty much cleaned up when he tested it last year. I like ESET in Gryphon though; it seems quite effective and I like it as a secondary layer. But I prefer it there, rather than on endpoints.

I still miss the old modular interface of the original, popular NOD32.
 

RoboMan

Level 35
Verified
Top Poster
Content Creator
Well-known
Jun 24, 2016
2,400
All antivirus software has its ups and downs on tests like these. Countless factors make the results unpredictable: Windows updates, coding bugs, sample ages, newly discovered vulnerabilities. The only product that could catch all the zero-day attacks exploiting problems nobody but black hats yet knows about would be an AI product, and we all know how those went.
 

Mahesh Sudula

Level 17
Verified
Top Poster
Well-known
Sep 3, 2017
818
OK, fine if the vendor has ups and downs!
But the vendor should be open enough to accept its flaws and work to overcome them.
I visited their forum regarding the above, and the guys there give some nasty excuses: PowerPoint macros, memory scanner, exploits, blah blah!!
But none are open enough to admit that, yes, they lack a behaviour-based mechanism apart from signatures, which is an open secret.
Secondly, I really hate the lies they spread around whenever there is an outbreak, and their unnecessary features (tune-up, UEFI scanner) are next to bloat; meanwhile they could at least work on developing a dynamic engine, or at least borrow one.
I really hope ESET develops some alternative mechanism besides signatures, which is vital.
 

Azure

Level 28
Verified
Top Poster
Content Creator
Oct 23, 2014
1,712
OK, fine if the vendor has ups and downs!

(...)
Their DNA signatures are probably the behavior detection mechanism.
 

509322

people want the software to do all the thinking for them and ignore basic habits.

This is a prevailing unrealistic expectation.

The product has a HIPS. The HIPS still works, and using it is not difficult with a little effort to learn.

AI/ML, behavioral detection, etc.: it will all fail with the right test.

You know it would be different if these debates actually had some merit, but they don't. People talk about products, but they don't know how they work. And they certainly don't know about malware.

I've read the thread on the Eset forum regarding the test, and I suggest people read what Marcos has been saying, as I agree with him. (...)

Either you protect the system or you don't. In this case ESET failed, but the reaction to the test is blown way out of proportion. Typical emotion-laden reactions. The points Marcos brings up are a tactic to deflect from and discredit the only fact that matters: that ESET failed to protect the system. So what? ESET failed. It happens. Big deal. The reaction the test has received is disproportionate to the issue.

However, I can understand where Marcos is coming from to a certain extent. "Bypass" videos have been made of AppGuard, and when we take a close look at them, they're actually not a bypass, or the video only shows a portion of the entire sequence of events - to suit the video maker's purpose.

One thing Marcos talks about that rings true in that thread is default deny and the fact that it works. Meaning: configure the ESET HIPS, learn how to use it, and the nonsense is over and done with.
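
For anyone unfamiliar with the term, default deny means anything not explicitly allowed is blocked, so even a brand-new sample never gets to run, signatures or not. A conceptual sketch of that logic follows; the paths are hypothetical, and this is not ESET's actual rule format (the real HIPS policy is built through the product's settings).

Code:
# Conceptual default-deny sketch; hypothetical allowlist, not ESET's rule engine.
ALLOWED_EXECUTABLES = {
    r"C:\Windows\explorer.exe",
    r"C:\Windows\System32\notepad.exe",
    r"C:\Program Files\Mozilla Firefox\firefox.exe",
}

def hips_verdict(executable: str) -> str:
    """Allow only what is explicitly listed; block everything else."""
    return "allow" if executable in ALLOWED_EXECUTABLES else "block"

# A freshly downloaded ransomware binary is unknown, so it is simply blocked,
# with no signature or behavioral detection required.
print(hips_verdict(r"C:\Users\me\Downloads\invoice_2019.exe"))  # -> block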
 

Nightwalker

Level 24
Verified
Honorary Member
Top Poster
Content Creator
Well-known
May 26, 2014
1,339
This is my stance as well. Some may see me as an Eset fanboy, but I am not; I couldn't care less, actually. I would do the same for any product. If someone did a YouTube video showing that Kaspersky failed (for example) and people started saying OMG it's a piece of crap, I would say what I have been saying for Eset, for Kaspersky, or any other reputable product for that matter.

(...)

Excellent post, as always.
 

ForgottenSeer 72227

This is a prevailing unrealistic expectation. (...)


I agree 100% with you! This is why I posted what I did. I don't actually care that Eset failed the test; any product can fail a test, but everyone has the unrealistic expectation that every product should be 100% perfect. Like you said, the reaction is being blown out of proportion, but again, this happens whenever tests come out (professional or YouTube).

Either you protect the system or you don't. In this case ESET failed, but the reaction to the test is blown way out of proportion. (...)

Very good points.

It's natural for anyone to defend their work one way or another. I do agree that the reaction has been blown out of proportion and is the typical emotional behavior that always surfaces when these tests are done. People see a failed test and it automatically becomes the end of the world, hence why (especially on security forums) you have this constant game of musical chairs with security software. People overreact and unrealistically expect perfection every time.

If people take the time to read the fine print on some of the "professional" tests, you will sometimes see comments like "x product disputed this sample/result..." So I guess every vendor is just making excuses? To be perfectly clear, I have seen this comment on multiple professional tests, from many vendors including Kaspersky, Symantec, etc. I am not trying to sling mud in anyone's face, but to point out that what Marcos is saying does have some merit.

Like you, I agree that Marcos has a point to a degree. His points are valid and shouldn't be brushed off entirely as excuse-making.

Maybe I'm crazy, but I am also not a huge fan of the mentality that the "tester" is always right and the vendor is always wrong. IMHO both the vendor and the tester can always improve. As I've said, malware testing isn't as simple as it's made out to be, and thank you @Lockdown for echoing this statement. Your experience with AppGuard and testing is a good example of why it's not that simple.

Like I've said, people like to claim to be experts in this field because they were able to run some malware samples in a VM. I hate to break it to you, but it's not that simple, and you're not an expert.

I will try to bring this to a close, because I don't want this thread to keep going in circles; many good points have been brought up on both sides.

At the end of the day, Eset failed this particular test. Should it have done better? Yes/maybe. I say maybe because, despite the result, we really don't know all the details, and it's still not a realistic scenario; this is not how people would use their product/computer in the real world. I like that these tests are done, as they can give the vendor some information on areas for improvement. That said, everyone should still take these tests with a grain of salt. Just because a product failed a test doesn't mean it automatically becomes a POS. There's definitely a very unrealistic expectation that every product should score 100% every time. As I and others have said numerous times, every product will fail at some point, hence why it's important to practice safe habits regardless of the program/setup you are using.

We also cannot assume these tests are 100% flawless; just like vendors, testers can improve too. We cannot always operate on the notion that the tester is always right and the vendor is always wrong. Many vendors employ security experts who have been in the field far longer and know far more about this stuff than many of us do, including YouTube testers. IMO testing is not as simple as it's made out to be, and people shouldn't make emotional/financial decisions (buying more and more products) on these tests alone. Every major vendor does its job fairly well, despite what some of these tests may say. Every product has its pros, cons, ups, and downs, so it's very important to find a product that fits your needs. There are many great choices out there; pick the one that works for you. (y)
 

509322

I agree 100% with you! This is why I posted what I did. I don't actually care that Eset failed the test; any product can fail a test, but everyone has the unrealistic expectation that every product should be 100% perfect. (...)

1. I looked quickly at the test and it appeared legit to me.

2. All-around reaction to the test:

kermitflail.gif
 
