ESET Internet Security 2019 Review
[QUOTE="ForgottenSeer 72227, post: 780777"]
I agree 100% with you! This is why I posted what I did. I don't actually care that ESET failed the test; any product can fail a test, but everyone has the unrealistic expectation that every product should be 100% perfect. As you said, the reaction is being blown out of proportion, but again, this happens whenever tests come out (professional or YouTube).

Very good points.

It's natural for anyone to defend their work one way or another. I agree that the reaction has been blown out of proportion; it's the typical emotional behavior that follows these tests. People see a failed test and it automatically becomes the end of the world, which is why (especially on security forums) you get this constant game of musical chairs with security software. People overreact and unrealistically expect perfection every time.

If people take the time to read the fine print on some of the "professional" tests, they will sometimes see comments like "X product disputed this sample/result..." So I guess every vendor is just making excuses. To be perfectly clear, I have seen this comment on multiple professional tests, from many vendors including Kaspersky, Symantec, etc. I am not trying to sling mud, but to point out that what Marcos is saying does have some merit.

Like you, I agree that Marcos has a point to a degree. His points are valid and shouldn't be brushed off entirely as excuses.

Maybe I'm crazy, but I am also not a fan of the mentality that the tester is always right and the vendor is always wrong. IMHO both the vendor and the tester can always improve. As I've said, malware testing isn't as simple as it's made out to be, and thank you [USER=56349]@Lockdown[/USER] for echoing this. Your experience with AppGuard and testing is a good example of why it's not that simple.

As I've said, people like to claim to be experts in this field because they were able to run some malware samples in a VM. I hate to break it to you, but it's not that simple, and you're not an expert.

I will try to bring this to a close because I don't want this thread to keep going in circles; many good points have been raised on both sides.

At the end of the day, ESET failed this particular test. Should it have done better? Yes/maybe. I say maybe because, despite the result, we don't fully know all the details, and it's still not a realistic scenario; this is not how people would use their product or computer in the real world. I like that these tests are done, as they can give the vendor some information on areas for improvement. That said, everyone should still take these tests with a grain of salt. Just because a product failed a test doesn't mean it automatically becomes a POS. There's definitely an unrealistic expectation that every product should score 100% every time. As I and others have said numerous times, every product will fail at some point, which is why it's important to practice safe habits regardless of the program or setup you are using.

We also cannot assume these tests are 100% flawless; just like vendors, testers can improve too. We cannot always hold the notion that the tester is right and the vendor is wrong. Many vendors employ security experts who have been in the field far longer and know far more about this than many of us do, YouTube testers included. IMO testing is not as simple as it's made out to be, and people shouldn't make emotional or financial decisions (buying more and more products) based on these tests alone. Every major vendor does its job fairly well, despite what some of these tests may say. Every product has its pros and cons, ups and downs, so it's important to find a product that fits your needs. There are many great choices out there; pick the one that works for you. (y)
[/QUOTE]
[QUOTE="ForgottenSeer 72227, post: 780777"] I agree 100% with you! This is why I was posting what I did. I don't actually care that Eset failed the test, any product can fail a test, but everyone makes the unrealistic expectation that every product should be 100% perfect. Like you said the reaction is being blown out of proportion, but again, this happens when tests come out (professional or YouTube). Very good points. It's natural for anyone to defend their work one way or another. I do agree that the reaction has been blown out of proportion and is typical emotional behavior that always happens when these tests are done. People see a failed test and it automatically becomes the end of the world, hence why (especially on security forums) you have this constant musical chairs of security software. People over react and unrealistically expect perfection every time. If people take the time to read the fine print on some of the "professional" tests you will sometimes see comments like "x product disputed this sample/result....." So I guess every vendor is just making excuses. To be perfectly clear, I have seen this comment on multiple professional tests, from many vendors including Kaspersky, Symantec, etc... I am not trying to sling mud into people's faces, but point out the fact that what Marcos is saying does have some merit. Like you I do agree that Marcos does have a point to a degree. His points are valid and shouldn't be brushed off entirely as making excuses. Maybe I'm crazy, but I am also not a huge fan of the mentality that the "tester" is always right and that the vendor is always wrong. IMHO both the vendor and the tester can always improve. As I've said malware testing isn't as simple as it's made out to be and thank you [USER=56349]@Lockdown[/USER] for echoing this statement. Your experience with Appguard and testing is a good example of why its not that simple. Like I've said, people like to claim to be experts in this field because they were able to run some malware samples in a VM. I hate to break it to you, but it's not that simple and your not an expert. I will try to bring this to a close because I don't want this thread to continue to go in circles, as many good points have been brought up on both sides. At the end of the day Eset failed this particular test. Should it have done better, yes/maybe. I say maybe because despite the result, we really don't know all the details fully and it's still not a realistic scenario. This is not how people would use their product/computer in the real world. I like the fact that these tests are done, as it can give the vendor some info on area's for improvement. In saying this, everyone should still take these tests with a grain of salt. Just because a product may have failed a test, doesn't mean it automatically becomes a POS. There's definitely a very unrealistic expectation that every product should be 100% every time. As myself and others have said numerous times, every product will fail at some point, hence why it's important to practice safe habits regardless of the program/setup you are doing. We also cannot assume that these tests are 100% flawless either, just like vendors, testers can improve too. We cannot always have the notion that the tester is always right and the vendor is always wrong. Many vendors have security experts working for them that have been in the field far longer and know far more about this stuff that many of us do, this includes Youtube testers. 
IMO testing is not as simple as it's made out to be and people shouldn't make emotional/financial decisions (buying more and more products) on these tests alone. Every major vendor does their job fairly well, despite what some of these tests may say. Every product has their pro's/con's up's and down's so it's very important to find a product that fits your needs. There are many great choices out there to choose from, pick which one works for you. (y) [/QUOTE]
Insert quotes…
Verification
Post reply
Top