Overview
Before a website goes live, it is a good idea to block it from search engine indexing with a robots.txt file, so that the unfinished site does not appear in search results. Under normal conditions most websites do want to be indexed, so that people searching for their content can find the site in the results; the block described here is only temporary.
When the website is ready to launch, the robots.txt page must be removed (see Website launching below).
How to create a robots.txt page
Navigate to the website in the Web Module:

- Click on the title of the website that needs the robots.txt file
- The page tree opens on the right side
- At the top of the page tree, click on the Add page button

- In the Add page popup:
- Title: robots.txt
- Click on the Create button
- The Page preset list will appear; do not add any preset to this page!

- Open the Page Settings for editing:
- Click on the General tab:
- Title: robots.txt
- Slug: robots.txt (the slug must be exactly "robots.txt", not robots-txt!)
- Click on the Modify button, change the text, then click on the green check to accept the change to the slug
- Hide in menus: set checkmark
- Hide in Sitemap: set checkmark
- Hide in breadcrumbs: set checkmark
- Click on Update button
- Click on the Advanced tab:
- In the side menu click on the Text option
- Outputs text: set checkmark
- Text Content (add the following 2 lines):
User-agent: *
Disallow: /
- Click on the Update button
- In the page tree, drag and drop the page to the top of the website's page tree. Keeping it at the top makes it easier to remember to remove the page before the website goes live.
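As a quick sanity check, the two Text Content lines entered above can be fed to Python's standard-library robots.txt parser to confirm they block all crawlers. This is just an offline sketch; the example.com URLs are placeholders, not part of the setup:

```python
# Verify that the robots.txt body blocks every user agent,
# using only Python's standard library.
from urllib.robotparser import RobotFileParser

# The two lines entered in the Text Content field above.
robots_body = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_body.splitlines())

# With "Disallow: /", no path on the site may be crawled.
print(parser.can_fetch("Googlebot", "https://example.com/"))       # False
print(parser.can_fetch("*", "https://example.com/any/page.html"))  # False
```

The same check can be run against the live (staging) site by pointing the parser at its real robots.txt URL with `set_url()` and `read()`.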
Website launching
When the website is ready to go live, simply delete the robots.txt page. The site is then no longer blocked, and search engines can index it again the next time their crawlers visit.
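Conceptually, once the page is deleted the site serves no robots.txt rules, which crawlers interpret as "everything allowed". A small sketch with Python's standard library illustrates this (example.com is a placeholder):

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.parse([])  # no robots.txt rules, as after deleting the page

# With no rules, crawling is permitted everywhere.
print(parser.can_fetch("Googlebot", "https://example.com/"))  # True
```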
Additional Information
- Testing tool for checking the robots.txt file: https://technicalseo.com/tools/robots-txt/
2025-12-19 to review