Serious Discussion: Cloudflare Gateway Free Plan

I switched to your script and will update you after a few scheduled Hagezi Pro blocklist updates.
Here is the update. I have scheduled updates at 4-hour intervals. The update results are the same as with mrrfv's script, but your script is approximately 100% faster (nearly half of mrrfv's run time) for me. Your script also creates Cloudflare "Locations." I didn't try locations with mrrfv's script.
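For context on those "Locations": creating a Gateway DNS location through the Cloudflare API generally looks like the sketch below. This is not code from either script; the endpoint and payload follow Cloudflare's public Zero Trust API, and the environment variable names are placeholders.

```python
import os
import requests

# Placeholder environment variables for the account ID and API token.
ACCOUNT_ID = os.environ["CF_ACCOUNT_ID"]
API_TOKEN = os.environ["CF_API_TOKEN"]

def create_gateway_location(name: str, source_ip: str) -> dict:
    """Create a Gateway DNS location tied to one source IPv4 address (sketch)."""
    url = f"https://api.cloudflare.com/client/v4/accounts/{ACCOUNT_ID}/gateway/locations"
    payload = {
        "name": name,
        "client_default": False,                       # don't make it the default location
        "networks": [{"network": f"{source_ip}/32"}],  # source network allowed to use it
    }
    resp = requests.post(url, json=payload,
                         headers={"Authorization": f"Bearer {API_TOKEN}"},
                         timeout=30)
    resp.raise_for_status()
    return resp.json()["result"]
```

If I recall correctly, the API response includes, among other things, the DoH subdomain assigned to that location, which is what you point your router or browser at.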

adb.png
 
@Marko :) @rashmi @SeriousHoax (members more knowledgeable on this topic)

Would this option be useful to add for security?

View attachment 294406
I only use Cloudforce One, but I'm not sure whether it has blocked anything so far.
Here is the update. I have scheduled updates at 4-hour intervals. The update results are the same as with mrrfv's script, but your script is approximately 100% faster for me. Your script also creates Cloudflare "Locations." I didn't try locations with mrrfv's script.

View attachment 294408
What are the locations exactly?
 
@Marko :) @rashmi @SeriousHoax (members more knowledgeable on this topic)

Would this option be useful to add for security?

View attachment 294406
Cloudflare provides "standard" protection with its security and other categories. The "Indicator Feeds" are like "extended" protection; they seem vetted and shouldn't affect usability. You can enable them. Also enable the Cloudflare "block" page, which provides a link to the blocked domain, so you can check whether an "Indicator Feed" blocked it. You don't need "or" logic; you can select both the UK and Cloudforce feeds in "Value."

Thanks for sharing it with us. Would a similar script importing Hagezi TIF increase security?
(I can hardly imagine that an individual would have access to more malware feeds than a large company.)
The Hagezi TIF full version is much bigger than the list limit (total domains) of the Cloudflare free plan.
 
@SeriousHoax, I noticed that your script consistently downloads nearly 115 domains, and mrrfv's downloads about 75 fewer than the total listed at the Hagezi Pro link.

The Cloudflare Token has "Zero Trust" READ and EDIT permissions. Are these permissions all your script needs?
 
Cloudflare provides "standard" protection with its security and other categories. The "Indicator Feeds" are like "extended" protection; they seem vetted and shouldn't affect usability. You can enable them. Also enable the Cloudflare "block" page, which provides a link to the blocked domain, so you can check whether an "Indicator Feed" blocked it. You don't need "or" logic; you can select both the UK and Cloudforce feeds in "Value."
Okay thanks, I will change that. (y)
 
I wanted to increase privacy a little, so I reduced the logs to blocked requests only and enabled removing sensitive information (the free plan has a fixed retention period of 24 hours). (y)

Because a meeting was cancelled, I filled my time with the absolute pinnacle of useless activity by changing ...
..... the look of the Cloudflare block page (matching it to the Google Safe Browsing block page). :ROFLMAO::ROFLMAO::ROFLMAO:

View attachment 294466
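For anyone who prefers scripting it over clicking through the dashboard, a minimal sketch of changing the block page via the Gateway configuration endpoint is below; the exact field names and the environment variables are my assumptions, so check them against the API docs before use.

```python
import os
import requests

ACCOUNT_ID = os.environ["CF_ACCOUNT_ID"]   # placeholder env vars
API_TOKEN = os.environ["CF_API_TOKEN"]
URL = f"https://api.cloudflare.com/client/v4/accounts/{ACCOUNT_ID}/gateway/configuration"
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}

def restyle_block_page() -> None:
    """Enable and restyle the Gateway block page (field names assumed)."""
    # Note: PUT replaces the whole Gateway configuration, so in practice you
    # would GET the current settings first and merge the block_page part in.
    settings = {
        "settings": {
            "block_page": {
                "enabled": True,
                "name": "Site blocked",
                "header_text": "This site was blocked by Gateway.",
                "footer_text": "Contact the admin if you think this is a mistake.",
                "background_color": "#ffffff",
            }
        }
    }
    resp = requests.put(URL, json=settings, headers=HEADERS, timeout=30)
    resp.raise_for_status()
```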
 
@LinuxFan58, I enabled "Exclude personally identifiable information (PII) from logs." Are you referring to this setting? I changed the Cloudflare block page slightly.

cbs.png
 
I have also created another version which stores the version information in the policy description in Cloudflare Gateway, as you wanted, instead of saving it to the JSON file. I haven't put this on my GitHub yet. I will have to test it more to check whether it always works reliably. This will become the default.
Also, remove the requirements.txt dependency; this will make your script standalone. DeepSeek AI gave me a version of your script without the requirements.txt file, but I didn't test it.
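For the version-in-the-policy-description idea quoted above, the general shape could be something like this sketch. It is not the actual implementation; the Gateway rules endpoint is real, but the description format, helper names, and environment variables are made up for illustration.

```python
import os
import requests

API = "https://api.cloudflare.com/client/v4"
ACCOUNT_ID = os.environ["CF_ACCOUNT_ID"]   # placeholder env vars
HEADERS = {"Authorization": f"Bearer {os.environ['CF_API_TOKEN']}"}

def stored_version(rule: dict) -> str:
    """Read the blocklist version previously stashed in the policy description."""
    # e.g. description "Hagezi Pro 2024.123.456" -> "2024.123.456" (invented format)
    return rule.get("description", "").removeprefix("Hagezi Pro ").strip()

def write_version(rule: dict, new_version: str) -> None:
    """Rewrite the policy so its description carries the new blocklist version."""
    # The PUT body may need to be limited to the writable fields
    # (name, action, enabled, filters, traffic, description).
    body = {**rule, "description": f"Hagezi Pro {new_version}"}
    resp = requests.put(f"{API}/accounts/{ACCOUNT_ID}/gateway/rules/{rule['id']}",
                        headers=HEADERS, json=body, timeout=30)
    resp.raise_for_status()

# Flow: fetch the existing block policy, compare stored_version() with the
# version of the freshly downloaded filter, and only rebuild the Cloudflare
# lists when the two differ. No local JSON state file is needed.
```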
 
Here is the update. I have scheduled updates at 4-hour intervals. The update results are the same as with mrrfv's script, but your script is approximately 100% faster (nearly half of mrrfv's run time) for me. Your script also creates Cloudflare "Locations." I didn't try locations with mrrfv's script.

View attachment 294408
Here is another update: impressive speed and stability!

adb.png
 
@Marko :) @rashmi @SeriousHoax (members more knowledgeable on this topic)

Would this option be useful to add for security?

View attachment 294406
I think Cloudflare's security categories already use those feeds. I'm not 100% sure, but since those are offered as values, it is highly likely.
Thanks for sharing it with us. Would a similar script importing Hagezi TIF increase security?
(I can hardly imagine that an individual would have access to more malware feeds than a large company.)
It may block a few more things. TIF itself uses other publicly available sources. You can see them here:
But are you using Cloudflare Zero Trust free or paid? On the free plan, the full TIF exceeds the 300K domain limit.
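Even below that limit, each Gateway list holds at most 1,000 entries, so a script has to split the filter into chunks and create one list per chunk, roughly like the sketch below. This is the general pattern against the Gateway lists API, not either script's actual code; the list-naming scheme and environment variables are invented.

```python
import os
import requests

API = "https://api.cloudflare.com/client/v4"
ACCOUNT_ID = os.environ["CF_ACCOUNT_ID"]   # placeholder env vars
HEADERS = {"Authorization": f"Bearer {os.environ['CF_API_TOKEN']}"}
CHUNK = 1000  # maximum entries per Gateway list

def upload_blocklist(domains: list[str]) -> list[str]:
    """Split the blocklist into 1,000-entry Gateway lists and return their IDs."""
    list_ids = []
    for i in range(0, len(domains), CHUNK):
        body = {
            "name": f"hagezi-pro-{i // CHUNK:03d}",   # invented naming scheme
            "type": "DOMAIN",
            "items": [{"value": d} for d in domains[i:i + CHUNK]],
        }
        resp = requests.post(f"{API}/accounts/{ACCOUNT_ID}/gateway/lists",
                             headers=HEADERS, json=body, timeout=60)
        resp.raise_for_status()
        list_ids.append(resp.json()["result"]["id"])
    return list_ids

# A single DNS block policy can then reference every uploaded list in its
# traffic expression, e.g. any(dns.domains[*] in $<list_id>) joined with "or".
```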
@SeriousHoax, I noticed that your script consistently downloads nearly 115 domains, and mrrfv's downloads about 75 fewer than the total listed at the Hagezi Pro link.

The Cloudflare Token has "Zero Trust" READ and EDIT permissions. Are these permissions all your script needs?
Did you mean 115 fewer?
Thanks to your comment, I noticed that the domain-parsing regex in my code had issues. It was rejecting punycode domains and domains with double hyphens from the Hagezi filters. From the latest Pro++ filter (at the time of writing), it removed 127 domains.
Then I checked what regex mrrfv uses. Testing his regex, it turns out it removed 66 domains, so it's still not perfect. I think @Marko :) first noticed this.

But I have now fixed the regex in my script, so all domains from the Hagezi filters will be put into Cloudflare's lists. (y)
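For anyone curious, a fix along those lines might look like the sketch below. This is not the actual regex from either script, just an illustration of a label-based pattern that keeps punycode ("xn--") labels and interior double hyphens.

```python
import re

# Each label: 1-63 characters, letters/digits/hyphens, no leading or trailing
# hyphen. Interior hyphens (including "--" and punycode "xn--") are allowed.
DOMAIN_RE = re.compile(
    r"^(?=.{1,253}$)"                                 # overall length limit
    r"(?:[a-z0-9](?:[a-z0-9-]{0,61}[a-z0-9])?\.)+"    # one or more labels
    r"[a-z0-9](?:[a-z0-9-]{0,61}[a-z0-9])?$",         # final label (TLD)
    re.IGNORECASE,
)

def parse_domains(lines):
    """Yield valid domains from a Hagezi-style list (one domain per line)."""
    for line in lines:
        domain = line.split("#", 1)[0].strip().lower()  # drop comments/whitespace
        if domain and DOMAIN_RE.match(domain):
            yield domain

# Both of these now pass:
#   xn--fiqs8s.example        (punycode label)
#   foo--bar.example.com      (double hyphen)
```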
Also, remove the requirements.txt dependency; this will make your script standalone. DeepSeek AI gave me a version of your script without the requirements.txt file, but I didn't test it.
The requirements.txt helps store a cache in GitHub Actions, which lets the required libraries install very quickly, so it saves some time. I will test how much time it actually saves.
Edit: Removed requirements.txt.
 
@rashmi I have removed requirements.txt now. It was for the GitHub Action only. My action yml file used it, but it's unnecessary. I have modified my action file so it still saves the cache without needing that file. Though it looks like saving the cache has little to no benefit in terms of speed for this script.
 
@rashmi I have removed requirements.txt now. It was for the GitHub Action only. My action yml file used it, but it's unnecessary. I have modified my action file so it still saves the cache without needing that file. Though it looks like saving the cache has little to no benefit in terms of speed for this script.
I updated the script and action file, adding the new commits and removing the requirements file. The Hagezi Pro download matched the total domains listed at the Hagezi Pro link. 👍 I'll give you an update after a few scheduled runs.
 