[Updated 29/12/2018] Browser extension comparison: Malware and Phishing
[QUOTE="Decopi, post: 725754, member: 67091"] [USER=51905]@Evjl's Rain[/USER] ... I tried to emulate your test (as much as possible). I tried to use your hosts for UBlock, and your phishing/malware sources (I took from your first comment in this post, describing your test). I used 10 samples for each source. I mixed the samples, taking a few from newest on the list, other from the middle, and others from the end of the list (on same day). But again [USER=51905]@Evjl's Rain[/USER] , this was a very sloppy quick informal test from my side. I need to run something more formal, perhaps on next week. At first sight, I have some unproven conclusions: 1) Test methodology need to be improved (at least from my side). 2) It does not exist the perfect solution, or the perfect add-on etc. A combination of tools seems to be better. 3) Tools/Add-ons must be tested considering their strengths. For example, Netcraft x Safebrowsing for phishing, Norton x Bitdefender for malwares etc. I really don't see much sense in comparing everything with everything. 4) Also, tools/add-ons must be tested considering "zero-day" samples x 1 day old samples... because test results are dramatically different even in same source 5) Browsers must be updated, and also add-ons must be with data-bases updated. This is not so simple to be done. An add-on data-base not updated, will show very different test results. 6) Etc [/QUOTE]