ClearURLs
<blockquote data-quote="Decopi" data-source="post: 867247" data-attributes="member: 67091"><p>With all due respect, ClearURLs is not what you're trying to promote in this post.</p><p></p><p>ClearURLs' main function, as its name says, is to clean URLs, i.e. to strip URL tracking parameters. In simple words, the main function is to remove part of a URL. That's all. Period!</p><p>Now, please run a simple test in ClearURLs: keep only the URL-cleaning function enabled (turn off everything else: "hyperlink auditing", "ETag filtering", etc.) and you will see that ClearURLs blocks almost nothing. In simple words, you're using a bazooka to kill mosquitoes: every webpage you load has to pass through hundreds of RegExps just to filter out a handful of tracking parameters. It's a waste of resources that hurts browser performance.</p><p></p><p>Currently, 90% of what ClearURLs blocks is hyperlink auditing and ETags. However, both can be blocked without any add-on at all!</p><p>Using ClearURLs just to block hyperlink auditing and ETags is ridiculous; it's a distortion of what is supposed to be the add-on's main function: simply removing parts of URLs.</p><p></p><p>On the other hand, removing URL tracking parameters doesn't require long, frequently updated JS scripts, nor hundreds of RegExps.</p><p>The small bit of JS that removes parts of a URL is universal: the same code will keep working for years, and it is very small, lightweight and efficient.</p><p>As for the RegExps (the URL tracking parameters), average users will be fine with just 20 generic RegExps. If someone needs to block specific parameters, again, the efficient way is to customize those blocking rules instead of running hundreds of unknown RegExps. The same logic applies to blocking lists: 90% of any list always goes unused, yet it consumes a lot of resources.</p><p></p><p>I was one of the first ClearURLs testers/reviewers.
Long ago, I was also the person who introduced ClearURLs to ghacksuserjs so the add-on would be included in the "recommended" category.</p><p>But sadly, ClearURLs got worse over time. It is the typical example of an add-on that started out slim and focused, but over time became a bloated pachyderm: unfocused, doing badly and inefficiently what many other add-ons do perfectly well using just 10% of the resources.</p></blockquote><p></p>
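The "small universal JS script" the post refers to can be sketched roughly as follows. This is an illustrative example, not ClearURLs' actual code, and the parameter list is a small hand-picked sample of the kind of "generic RegExps" the post mentions; it uses only the standard URL/URLSearchParams API.

```javascript
// Hypothetical sketch: strip common tracking parameters from a URL
// using the standard URL API. The rules below are illustrative only.
const TRACKING_PARAMS = [
  /^utm_/,    // utm_source, utm_medium, utm_campaign, ...
  /^fbclid$/, // Facebook click identifier
  /^gclid$/,  // Google Ads click identifier
  /^mc_eid$/, // Mailchimp email identifier
];

function cleanUrl(input) {
  const url = new URL(input);
  // Copy the keys first: deleting while iterating searchParams skips entries.
  const keys = [...url.searchParams.keys()];
  for (const key of keys) {
    if (TRACKING_PARAMS.some((re) => re.test(key))) {
      url.searchParams.delete(key);
    }
  }
  return url.toString();
}

// Tracking parameters are removed, functional ones are kept:
console.log(cleanUrl("https://example.com/page?id=42&utm_source=news&fbclid=abc"));
// → https://example.com/page?id=42
```

As the post argues, this logic itself never needs updating; only the list of parameter rules does, and a short user-maintained list covers the common trackers.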