A Bitdefender Internet Security test
<blockquote data-quote="Fabian Wosar" data-source="post: 572767" data-attributes="member: 24327"><p>Personally, I dislike the 100% label as well. Nothing is 100%. I guess people just get tired of writing "100% of the samples tested". And yes, every AV vendor pays to participate in these tests. Apparently, they all pay the same. Whether that is entirely true, only AV Comparatives will know. Knowing how much they ask for, I don't see why they would require "extra money", though. They make money hand over fist already.</p><p></p><p>One aspect that is often forgotten is how these tests are performed. Cruelsister's test ignores a whole bunch of security layers. Just to name a few:</p><ul> <li data-xf-list-type="ul">All URL blocking is avoided; samples just magically appear on the system (there are some very aggressive URL filters out there, many of which are responsible for those 100% results; just look at Trend Micro, for example)</li> <li data-xf-list-type="ul">Unlike files downloaded from the internet, all those files most likely lack the zone identifier (a lot of behaviour blockers give extra weight to files that come from the internet, using the zone identifier that is present whenever you download a file in a browser but is never there if you just get the files from a sample pack)</li> <li data-xf-list-type="ul">The JavaScript downloaders usually arrive in archives, which can make a difference (certain security tools, for example, harden applications and limit what they can do; Word or WinRAR executing wscript.exe would be a huge no-no)</li> </ul><p>AV Comparatives is a bit more well-rounded and tries to emulate the usual infection vectors as closely as possible. They essentially take a malicious URL and visit it with a vulnerable system that has the product installed. They then try everything to get the system infected, like clicking Run if a download window pops up or clicking all the links. The whole process is highly automated, including control runs and video recordings of all systems. It's actually pretty neat from a technical point of view. However, if a product relies heavily on URL reputation and blocking, for example, it would fare very well in an AV Comparatives test but very poorly in a cruelsister test.</p><p></p><p>It does have its drawbacks, though. Certain infection vectors are underrepresented, in my opinion: mostly anything related to file sharing and trojanised warez, probably for legal reasons. They also tend to ignore email as an infection vector. I guess for most users it won't matter, since I would imagine people mostly use webmail anyway, which turns email into a normal download infection vector that is well covered in the AV Comparatives testing, but I do feel they can improve in that regard. AV Test, for example, does test with malicious emails and a normal email client as well.</p></blockquote><p></p>
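For readers unfamiliar with the zone identifier Fabian mentions: on NTFS it is an alternate data stream named `Zone.Identifier` (the "mark of the web") that browsers attach to downloaded files, and its payload is a small INI-style fragment with a `ZoneId` value. Below is a minimal sketch of reading that payload; the stream text and the helper name are illustrative, but the `[ZoneTransfer]`/`ZoneId` format and the zone numbering follow Windows' documented URL security zones.

```python
# Sketch: parsing the "Zone.Identifier" mark-of-the-web payload that a
# behaviour blocker might consult. On Windows the stream can be opened as
# open(path + ":Zone.Identifier"); the parsing itself is portable.
import configparser

# URL security zone IDs as defined by Windows; 3 (Internet) is what
# browsers stamp onto files downloaded from the web.
ZONE_NAMES = {
    0: "Local Machine",
    1: "Local Intranet",
    2: "Trusted Sites",
    3: "Internet",
    4: "Restricted Sites",
}

def parse_zone_identifier(stream_text: str):
    """Return the ZoneId from a Zone.Identifier stream, or None if absent."""
    parser = configparser.ConfigParser()
    parser.read_string(stream_text)
    if parser.has_option("ZoneTransfer", "ZoneId"):
        return parser.getint("ZoneTransfer", "ZoneId")
    return None

# Hypothetical stream content, as a browser download would carry it. A file
# copied in from a sample pack has no such stream at all -- which is exactly
# the missing "came from the internet" signal described above.
downloaded = "[ZoneTransfer]\nZoneId=3\nHostUrl=https://example.test/a.exe\n"
zone = parse_zone_identifier(downloaded)
print(zone, ZONE_NAMES.get(zone))  # → 3 Internet
```

This is why a sample-pack test and a download test can exercise different code paths in the same product: with no stream present, a zone-aware layer never sees the file as internet-sourced.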