You must have "Full Access" or "Limited Access - Website" permissions to edit your site.
NOTE: A robots.txt file doesn't make it impossible for a search engine to crawl your site; it is just a signpost that says "don't go there." Malicious bots may ignore it entirely, so it should never be used to hide sensitive information.
By default, an organization's website is searchable by search engines.
If you do not want your organization's website, or certain pages of it (such as documents, the calendar, or a comment stream), to appear in search engine results, follow the instructions below.
Sign in and click on the HQ Home icon at the top of your screen.
Once in "SportsEngine HQ," click Website, then choose the Website Settings tab.
Choose SEO (Search Engine Optimization).
Choose the Robots Text sub-tab.
Enter one of the following rule sets, depending on what you want to block.
To prevent robots from scanning the entire site, enter:
User-agent: *
Disallow: /
To prevent robots from scanning specific pages on the site, enter a Disallow line with each page's path (/page-link), for example:
User-agent: *
Disallow: /documents
Disallow: /page/show/1240820-calendar
To allow all robots complete access:
User-agent: *
Disallow:
(Or simply leave the Robots Text field empty; an empty robots.txt file, or no file at all, grants the same complete access.)
To exclude a single robot (replace BadBot with the robot's actual User-agent name):
User-agent: BadBot
Disallow:
To allow a single robot (here, Google) and exclude all others:
User-agent: Google
Disallow:

User-agent: *
Disallow: /
To exclude all files except one:
This is currently a bit awkward, as the original robots.txt standard has no "Allow" field (but see the note after these examples). The easy way is to put all files to be disallowed into a separate directory, say "stuff," and leave the one file in the level above this directory:
User-agent: *
Disallow: /~joe/stuff/
Alternatively, you can explicitly disallow each page you want hidden:
User-agent: *
Disallow: /~joe/junk.html
Disallow: /~joe/foo.html
Disallow: /~joe/bar.html
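Note: although the original robots.txt standard has no "Allow" field, many major crawlers (including Googlebot and Bingbot) do honor a non-standard Allow directive, which makes the "all but one" case simpler. Support is not universal, so treat this as an optional shortcut rather than a guaranteed rule; the file name below is a placeholder:
User-agent: *
Allow: /~joe/keep-this.html
Disallow: /~joe/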
Helpful Tip! Sensitive information should only be placed on private pages, which only the appropriate members can access.
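If you want to sanity-check your rules before saving them, Python's standard-library urllib.robotparser module interprets robots.txt the same way standards-compliant crawlers do. The sketch below is only an illustration: the site address is a hypothetical placeholder, and the rules shown are the page-blocking example from above, not your actual settings.

from urllib.robotparser import RobotFileParser

# The exact rules you plan to paste into the Robots Text field.
rules = """\
User-agent: *
Disallow: /documents
Disallow: /page/show/1240820-calendar
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch() reports whether the named user agent may crawl a given URL.
site = "https://www.example.org"  # hypothetical site address
for path in ("/", "/documents", "/page/show/1240820-calendar"):
    allowed = parser.can_fetch("*", site + path)
    print(path, "allowed" if allowed else "blocked")

Running this prints that "/" remains crawlable while the two disallowed paths are blocked, which confirms the rules do what you intend before any crawler ever sees them.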