You must have "Full Access" or "Limited Access - Website" permissions in order to edit your site.
NOTE: A robots.txt rule does not make it impossible for a search engine to crawl a page; it is just a signpost that says "don't go there." Malicious bots may still attempt to crawl it, so it should not be used to hide sensitive information.
By default, SportsEngine makes an organization's website searchable by search engines.
If you do not want your organization's website, or certain pages of it (documents, calendar, or comment stream), to appear in search results, follow the instructions below.
Sign in and click on the HQ Home icon at the top of your screen.
Once in "SportsEngine HQ," click Website, then choose the Website Settings tab.
Choose SEO (Search Engine Optimization).
Choose the Robots Text sub-tab.
Enter one of the following to control which parts of your site robots may scan:
To prevent robots from scanning the entire site, enter:
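For example, these two lines tell all compliant crawlers to skip every page on the site:

```
User-agent: *
Disallow: /
```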
To prevent robots from scanning specific pages on the site, enter Disallow: /(page link):
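For instance, if your documents and calendar pages live at /documents and /calendar (hypothetical paths; substitute your site's actual page links), the entries would look like:

```
User-agent: *
Disallow: /documents
Disallow: /calendar
```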
To allow all robots complete access
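An empty Disallow line blocks nothing, so the following grants every robot full access:

```
User-agent: *
Disallow:
```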
(or just create an empty "/robots.txt" file, or don't use one at all)
To exclude a single robot
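Replace the user-agent name with the robot you want to block ("BadBot" here is just a placeholder):

```
User-agent: BadBot
Disallow: /
```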
To allow a single robot
User-agent: Google
Disallow:

User-agent: *
Disallow: /
To exclude all files except one
This is currently a bit awkward, as there is no "Allow" field. The easy way is to put all files to be disallowed into a separate directory, say "stuff," and leave the one file in the level above this directory:
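Following that approach, with all the disallowed files moved into a "stuff" directory and the one public file left above it, the rules would read:

```
User-agent: *
Disallow: /stuff/
```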
Alternatively, you can explicitly disallow each page you do not want crawled:
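For example, to hide two pages (hypothetical paths) while leaving everything else crawlable:

```
User-agent: *
Disallow: /private.html
Disallow: /old-schedule.html
```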