Implementation overview
How to set your robots.txt file on Webflow
A robots.txt file controls how search engine crawlers access your website. It lets you allow or disallow crawling of specific pages or directories, which can improve SEO and help keep sensitive or irrelevant content out of search results.
How to do it in Webflow?
- Access Project Settings: Go to your Webflow project settings.
- Navigate to the SEO Tab: Find the robots.txt section.
- Add Rules: Enter directives to allow or disallow specific paths. To learn more about writing rules, see Finsweet's deep dive:
https://finsweet.com/seo/article/robots-txt
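As a rough sketch, the directives you paste into Webflow's robots.txt field might look like the following. The paths shown are hypothetical examples; replace them with directories from your own site:

```
# Apply these rules to all crawlers
User-agent: *
# Hypothetical private sections -- substitute your own paths
Disallow: /admin/
Disallow: /drafts/
# Everything else remains crawlable
Allow: /

# Optionally point crawlers at your sitemap (adjust the URL to your domain)
Sitemap: https://www.example.com/sitemap.xml
```

Note that `Disallow` only discourages crawling; it does not guarantee a page stays out of search results if other sites link to it.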
How to set your robots.txt file with the Graphite app?
- Access the Robots.txt Builder: Within the Graphite app, locate the "Robots.txt Builder" feature.
- Configure Directives: Use the builder to specify which parts of your site search engines should or shouldn't crawl.
- Save and Publish: Save your configuration and publish the site so the updated robots.txt file goes live.