Webflow Robots.txt Guide: Control How Google Crawls Your Site

Robots.txt is a text file that tells search engine crawlers which parts of your site they're allowed to access. It's not a ranking factor — it's an access control file. But getting it wrong can accidentally block Google from crawling pages you want indexed.

The default robots.txt Webflow generates allows all crawlers access to everything. That's the right default for most sites. The question is whether you need to modify it, and if so, how.
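An allow-all file is just an empty Disallow rule. Webflow's generated default may include extra lines (such as a sitemap reference), but the permissive core looks like this:

```txt
# Allow every crawler to access every path
User-agent: *
Disallow:
```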

When you actually need to modify robots.txt. Use it to block pages you don't want crawled — not just noindexed. There's a difference: noindex tells Google not to include a page in its index, but the page is still crawled. Robots.txt blocks the crawl entirely. Use robots.txt to save crawl budget on admin interfaces, duplicate URL variations, staging sections, or internal search results pages.
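As a sketch, rules for those cases might look like the following (the paths are hypothetical — substitute your own site's structure):

```txt
User-agent: *
Disallow: /admin/      # admin interface
Disallow: /search      # internal search results pages
Disallow: /*?sort=     # duplicate URL variations created by sort parameters
```

Note that the `*` wildcard in the last rule is supported by Google's crawler but is not part of every crawler's robots.txt handling.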

What to be careful about. Robots.txt leaves no room for error. A syntax mistake that accidentally blocks your entire site — the classic Disallow: / error — will stop Google from crawling anything. This has happened to large sites.
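The difference between a site-wide block and a scoped rule is a single path segment:

```txt
# DANGEROUS: blocks every crawler from the entire site
User-agent: *
Disallow: /

# Scoped: blocks only /admin/; everything else stays crawlable
User-agent: *
Disallow: /admin/
```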

Also: robots.txt cannot de-index a page that's already indexed. If a page is in Google's index and you block it in robots.txt, Google may keep showing it in search results because it can no longer crawl the page to find the noindex tag. To remove an already-indexed page, use noindex, not robots.txt blocking.

How to set robots.txt in Webflow. Go to Project Settings → SEO. There's a dedicated robots.txt field where you can add custom rules. These merge with Webflow's default crawl rules.

The most common custom rule on a Webflow site: block the /admin/ path if you're using Webflow's Memberships feature, to prevent crawling of member-only areas.

After modifying robots.txt, verify by going to yourdomain.com/robots.txt in your browser. Confirm the rules are correct before publishing.
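Beyond eyeballing the file, you can sanity-check custom rules locally with Python's standard `urllib.robotparser` before publishing (the rules and URLs below are examples, not your actual file):

```python
# Sketch: validate robots.txt rules locally before publishing.
# The rules string and URLs are examples -- substitute your own.
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /admin/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A member-only admin URL should be blocked...
print(rp.can_fetch("*", "https://example.com/admin/login"))  # False
# ...while normal content stays crawlable.
print(rp.can_fetch("*", "https://example.com/blog/post"))    # True
```

You can also point `RobotFileParser` at the live file with `set_url()` and `read()` after publishing, to confirm what crawlers actually see.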

For most Webflow sites, the default robots.txt is fine. The value in checking it is making sure nobody modified it during development in a way that's still live.

How to do it on Webflow?

  1. Access Project Settings: Go to your Webflow project settings.
  2. Navigate to the SEO Tab: Find the robots.txt section.
  3. Add Rules: Enter directives to allow or disallow specific paths. To learn more about rules, have a look at the deep dive from Finsweet:

https://finsweet.com/seo/article/robots-txt

How to set your Robots.txt file with Graphite app?
  1. Access Robots.txt Builder: Within Graphite app, locate the "Robots.txt Builder" feature.
  2. Configure Directives: Use the builder to specify which parts of your site search engines should or shouldn't crawl.
  3. Save and Publish
