Stupid default settings of some Antiviruses
[QUOTE="ForgottenSeer 72227, post: 804628"] I agree with what others have said in regards to performance impact and performance tests. The problem with trying to assess performance is that it's not as simple as it's made out to be. There are way too many variables to properly assess the performance impact of an AV/Suite (ie: hardware/software configs, how one uses the computer, etc...). These tests try to make it cut and dry, when in fact it's not. All of these tests should be taken with a huge grain of salt, especially the YouTube ones because they tend to perform these tests in a VM. I'm sorry, trying to compare a system that runs directly on the hardware to a VM is just silliness. The VM will never, ever be as fast a running it directly on the hardware, so right there the VM is already introducing a performance impact of it's own. Another thing that is often overlooked is how does the product impact the system over a longer period of time and not just the first day or 2 a test is ran? Many of these programs will develop large cache files and stuff over time, who's not to say that maybe 8 month's later it's not slower than it was before due to things like large cache files? IMO another thing that cannot be overlooked is the simple fact that, with the software and hardware available today, we are able to measure things (like performance) that otherwise would not be noticeable to us humans. For example, a lot of people were performing tests of various products using Novabench. Lets say product x get a score of 700 and product y gets a score of 760, do you honestly believe you can actually notice that difference in the read world, my guess is it will be a no. You cannot forget how perception can play an impact as well. If people keep saying that x product is slow as molasses and it may be true, but is someone goes into it with a preconceived notion that it's "slow as molasses", chances are they will go , oh ya it is slow. Now the questions we have to answer are, how much of that is true, or how much of that is simply due to the fact that it really only took 1-2 second longer, but we perceived it to by way slower than it really was, simply due to the fact we went into it with a perceived notion that it was slow?:unsure: To echo what other's have said, don't loose sleep over it.:) [/QUOTE]