ESET Internet Security 2019 Review
[QUOTE="ForgottenSeer 72227, post: 780683"]
This is my stance as well. Some may see me as an Eset fanboy, but I am not; I couldn't care less, actually. I would do the same for any product. If someone did a YouTube video showing that Kaspersky failed (for example) and people started calling it a piece of crap, I would say the same things I have been saying for Eset, for Kaspersky, or for any other reputable product.

I've read the thread on the Eset forum regarding the test, and I suggest people read what Marcos has been saying, as I agree with him. People may see him as making excuses for Eset, but he's not; he raises many valid points, and they all need to be considered whenever someone tests any product. In the case of Leo's test, Eset failed. Would I have liked to see a better result? Absolutely, but hopefully Eset will take this and try to make some improvements to its product. I don't think Marcos is defending the fact that Eset failed, but rather objecting to the fact that Eset didn't get a chance to validate the test results. Leo didn't submit any of the missed samples to Eset. There's no law saying he has to, but IMHO it's good practice to do so, especially when all we have is his word regarding the samples. I am NOT saying that Leo rigged the test, but he should have given Eset a chance to look at it.

[USER=53544]@RoboMan[/USER] brought up some good points earlier in the thread about testing in general as well. Malware nowadays is so complicated that testing isn't as simple as it's made out to be; it takes real insight and knowledge to see what is actually happening. Saying that on-demand scanners didn't pick up anything on the system is not a good reason to declare it 100% clean. They are useful tools, but how do we know it isn't a false negative and the on-demand scanners simply missed the sample? I've also seen many tests (including Leo's) where the tester runs a sample, nothing visible happens, and the automatic assumption is that the sample doesn't work. How do we really know nothing happened to the system?

So many people proclaim themselves experts when all they have done is disable protection to test signatures (not a real-world scenario, and one that denies the product its full capabilities, e.g. the web scanner), then run the missed samples with protection enabled and finish with on-demand scanners. This method can work, but it's not a true representation of how people use their computers or how a product truly performs. It is bench racing: testing only specific components of the product, not its full capabilities. To be very clear, I would say this in defence of any product tested in this manner, not just Eset.

My point is that every test should be taken with a grain of salt. While it's fun to test, it isn't as simple as it's made out to be, because malware is very complicated nowadays. IMHO anyone doing a test should at least submit the missed samples to the vendor, not only to give them the chance to protect their users, but also to let them validate the samples. Again, I am not here to defend Eset, but rather to push back on the mentality that after one test (where we really don't know all the ins and outs, and neither does the vendor) we should start labelling the product, or any product, as a piece of crap.
[/QUOTE]
[QUOTE="ForgottenSeer 72227, post: 780683"] This is my stance as well. Some my see me as an Eset fanboy, but I am not, I could care less actually. I would do the same for any product. If someone did a YouTube video and showed that Kaspersky failed (for example) and people started saying OMG it's a piece of crap, I would say what I have been saying for Eset, for Kaspersky, or any other reputable product for that matter. I've read the thread on the Eset forum regarding the test and I suggest people read what Marcos has been saying, as I agree with him. People may see him as making up excuses for Eset, but he's not, he has many valid points and these all need to be considered when someone tests any product. In the case of Leo's test, Eset failed, would I have liked to see a better result, absolutely, but hopefully Eset will take this and try to make some improvements to their product. If you read what Marcos has said, he does bring up many good points about the test. I don't think he's defending the fact that Eset failed, but rather the fact that Eset didn't get a chance to validate the test results. Leo didn't submit any of the missed samples to Eset, there's no law saying he has to, but IMHO it's only good practice to do so, especially when all we have is your word regarding the samples. I am NOT saying that Leo rigged the test, but he should have given Eset a chance to look at it. [USER=53544]@RoboMan[/USER] brought up some good points earlier in the thread about testing in general as well. Malware now a days is so complicated that testing isn't as simple as it's made out to be. It does require some very good insight and knowledge to see what is really happening. Saying that on-demand scanners didn't pick up anything in the system is not a good reason to say well it's 100% clean. They are useful tools, but how do we know that it's not just a false negative and that the on-demand scanners just missed the sample? Also I've seen many tests (including Leo's) where by he runs a sample nothing happens and the automatic assumption is the sample doesn't work...how do we really know nothing happened to the system instead? So many people proclaim themselves as experts when all they have done is disable protection to test signatures (not a real world scenario, nor giving the product it's full capabilities (ie: web scanner)), then run the missed samples with protection enabled and run on-demand scanners. This method can work, but again it's not a true representation of how people use their computers or how a product truly performs. All this is, is bench racing, and testing only specific components of the product, not it's full capabilities. To be very clear, I would say this in defence for any product, not just Eset tested in this manner. My point is every test should be taken with a grain of salt. While it's fun to test, it isn't as simple as it's made out to be, as malware is very complicated now a days. IMHO anyone doing a test, should at least submit the missed samples to the vendor to not only allow them the chance to protect their users from them, but also give them a chance to validate them. Again, I am not here to defend Eset, but rather the mentality that after one test (where we really don't know all the in's and out about the test, including the vendor) we should just start labelling the product and or any product, as a piece of crap. [/QUOTE]