This has been one of the most requested features of the past year, but also one of the most polarizing. Some of Magic Pages' customers want to block every AI bot from touching their content. Others want to explicitly allow them, hoping for better visibility in AI-generated answers.
Both positions are valid. And until now, we couldn't give you a clean way to act on either one.
Why This Wasn't Possible Before
With Bunny.net, there was no practical way to filter traffic by bot type at the CDN level. You could edit your robots.txt to signal your preferences, but that relies on crawlers actually respecting it – and many don't. There was no enforcement layer.
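For illustration, here is what that polite signal looks like in practice, using the same crawler tokens mentioned further down. Every line of it is a request, not a rule:

```
# robots.txt: advisory only, nothing here is enforced
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```

A well-behaved crawler reads this and stays away. One that doesn't care simply fetches your content anyway.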
Magic Pages' recent move to Cloudflare changes that. Their firewall can identify and act on AI crawler traffic before it ever reaches your Ghost site. That means we can go beyond polite suggestions in robots.txt and actually enforce your choice.
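To sketch what enforcement at that layer looks like, here is an illustrative custom rule expression built from Cloudflare's documented rule fields. It's an example of the technique, not the exact rule Magic Pages will deploy, and the comments are for readability only (Cloudflare's expression editor doesn't accept them):

```
# Match bots Cloudflare has verified as AI crawlers...
(cf.verified_bot_category eq "AI Crawler")
# ...or self-identifying crawlers that aren't on the verified list.
or (http.user_agent contains "GPTBot")
or (http.user_agent contains "CCBot")

# Rule action: Block
```

A rule like this runs at Cloudflare's edge, so blocked requests never reach your Ghost site at all.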
What's Coming
We are adding a simple toggle to the customer portal that lets you choose how Magic Pages handles AI crawlers on your site:
- Block: Cloudflare's firewall will block known AI crawlers (GPTBot, Google-Extended, CCBot, and others) from accessing your content.
- Allow: AI crawlers are explicitly permitted, which may improve your visibility in AI-generated search results and answers.
No code injection, no custom firewall rules to manage yourself. One toggle, and it's done.
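Conceptually, a toggle like this maps to a single edge rule whose action flips with your choice. This is a sketch of the mechanics, not the actual implementation:

```
Block -> rule matches AI crawlers (as in the expression above), action: Block
Allow -> rule matches AI crawlers, action: Skip
         (exempting them from other bot protections that might challenge them)
```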
If you're a journalist, author, or publisher who wants to protect your original work from being ingested into training datasets without compensation – this is for you.
If you're a marketer, blogger, or business owner who sees AI search as a distribution channel and wants to make sure your content is indexed – this is also for you.
The point is: it's your content, and it should be your decision.
When to Expect It
This will roll out in Q1 2026, as part of the deeper integration of Cloudflare into Magic Pages' infrastructure.