
Download coccoc vn

Robots.txt is a text file that contains instructions for web robots. It allows webmasters to control website access for web robots.

Coc Coc's robots support the robots exclusion standard. This is the same standard adopted by most search engines, though individual search engines may respond to the standard's directives in slightly different ways. This article describes how Coc Coc's robots interpret robots.txt files.

If you want to use the robots exclusion standard for your site:

1. Create a text file with the relevant directives described below.
2. Upload it to your site's root directory.
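For illustration, a minimal robots.txt file might look like the sketch below. The path '/private-area' is a placeholder, not a rule from this article:

    # Keep all robots out of one hypothetical section of the site
    User-agent: *
    Disallow: /private-area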

Coc Coc's robots request robots.txt files from sites regularly. Before requesting any other URLs from a site, the robot requests the site's robots.txt file using a GET request via either HTTP or HTTPS. Redirects up to 5 hops are supported for this request. If the robot is unable to receive any response to this request, the site is treated as not available and excluded from crawling for a period of time. If the robot receives any response other than 200 OK, it assumes that it has unrestricted access to all documents on the site. If the response is 200 OK, the robot analyzes the returned content, extracts directives from it, and uses those directives until the robot's next request to the robots.txt file.
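In summary (this table only restates the paragraph above):

    Response to the robots.txt request   Robot behavior
    -----------------------------------  ----------------------------------------
    No response at all                   Site treated as not available; excluded
                                         from crawling for a period of time
    Any response other than 200 OK       Unrestricted access to all documents
                                         on the site is assumed
    200 OK                               Directives are extracted and used until
                                         the next robots.txt request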

Directives

User-agent

Every Coc Coc robot has its own name. You can find information about all of our robots here. You can use those names in the User-agent directive to write instructions for a particular robot. Every Coc Coc robot tries to find the User-agent directive that most closely matches its name; all less specific matches are ignored. The sketch below walks through the examples.
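In the original article each of these comments sat next to example rules, but the rules themselves were lost in this copy. The directive lines below, and the use of 'coccocbot' as a catch-all name for Coc Coc's robots, are therefore plausible reconstructions rather than the article's exact examples:

    # No robots are instructed to not download any documents from '/cgi-bin'
    # (reconstructed guess: a Disallow line with no preceding User-agent line
    # applies to no robot)
    Disallow: /cgi-bin

    # All robots, including all of Coc Coc's robots, are instructed to not
    # download any documents from '/cgi-bin'
    User-agent: *
    Disallow: /cgi-bin

    # All of Coc Coc's robots are instructed to not download any documents from
    # '/cgi-bin'. All other robots are still allowed to download all documents
    # from the site.
    User-agent: coccocbot
    Disallow: /cgi-bin

    # coccocbot-web and coccocbot-image are instructed to not download any
    # documents from '/ajax'. All Coc Coc's other robots are instructed to not
    # download any documents from '/cgi-bin'.
    User-agent: coccocbot-web
    User-agent: coccocbot-image
    Disallow: /ajax

    User-agent: coccocbot
    Disallow: /cgi-bin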

Note that you can use comments in your robots.txt file. All characters from the first # in a line up to the end of the line are not analyzed by robots.

You can mention the same user agent multiple times. In this case, all instructions for that robot are used together, as in the sketch below.
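The comment is from the original article; the repeated blocks are a reconstruction of what it described:

    # All of Coc Coc's robots are instructed to not download any documents
    # from /cgi-bin and /ajax
    User-agent: coccocbot
    Disallow: /cgi-bin

    User-agent: coccocbot
    Disallow: /ajax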

Disallow

If you want to instruct robots to not access your site or certain sections of it, use the Disallow directive. For example, you can disallow access to the whole site for all robots, or disallow access to pages starting with '/cgi-bin' for coccocbot-image only; both cases appear in the sketch below.
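The comments are original; the directive lines are reconstructions:

    # Disallow access to the whole site for all robots
    User-agent: *
    Disallow: /

    # Disallow access to pages starting with '/cgi-bin' for coccocbot-image
    User-agent: coccocbot-image
    Disallow: /cgi-bin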

Allow

To allow robots to access your site or its parts, use the Allow directive. An empty Disallow directive allows robots to download all pages of the site. For example, you can disallow access of all Coc Coc's robots to all pages of the site except URLs which start with '/docs', as in the sketch below.
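As before, the comment is original and the directive lines are a reconstruction:

    # Disallow access of all Coc Coc's robots to all pages of the site except
    # URLs which start with '/docs'
    User-agent: coccocbot
    Allow: /docs
    Disallow: /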

If there are multiple directives which can be applied to a URL, the most specific directive is used. If two directives (Allow and Disallow) are equally specific, the Allow directive takes precedence. For example, you can disallow access of all Coc Coc's robots to pages starting with '/cats' but allow access to pages starting with '/cats/wild', except those pages which start with '/cats/wild/tigers'; or allow access of all Coc Coc's robots to pages starting with '/dogs/naughty' despite the presence of a Disallow directive. Both cases appear in the sketch below.
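The comments restate the original examples; the directive lines are reconstructions:

    # Disallow access of all Coc Coc's robots to pages starting with '/cats',
    # but allow access to pages starting with '/cats/wild', except those pages
    # which start with '/cats/wild/tigers' ('/cats/wild/tigers' is the most
    # specific match for those URLs, so its Disallow wins)
    User-agent: coccocbot
    Disallow: /cats
    Allow: /cats/wild
    Disallow: /cats/wild/tigers

    # Allow access of all Coc Coc's robots to pages starting with '/dogs/naughty'
    # despite the presence of the Disallow directive (the two rules are equally
    # specific, so Allow takes precedence)
    User-agent: coccocbot
    Disallow: /dogs/naughty
    Allow: /dogs/naughty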

The asterisk (*) in Allow and Disallow directives means any sequence of characters. Note that, by default, every Allow/Disallow directive implies a trailing *. To cancel this behavior, add a dollar sign ($) to the end of the rule. For example, you can disallow access to all URLs containing 'private' in their paths, or to all URLs ending with '.ajax', as in the sketch below.
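The comments are original; the directive lines are reconstructions:

    # Disallow access to all URLs containing 'private' in their paths
    # ('Disallow: /*private' implies a trailing *, so it matches any path
    # that contains 'private')
    User-agent: coccocbot
    Disallow: /*private

    # Disallow access to all URLs ending with '.ajax'
    # (the $ cancels the implicit trailing *)
    User-agent: coccocbot
    Disallow: /*.ajax$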

Sitemap

The Sitemap directive is independent of the User-agent directives. You can add the Sitemap directive to instruct our robots to use sitemap files.

Crawl-delay

If you want to lower the rate at which Coc Coc's robots visit your site, you can use the Crawl-delay directive. Coc Coc's robots interpret the Crawl-delay value as an integer number of seconds the robot must wait between two consecutive requests. Please note that our robots don't support crawl delays greater than 10 seconds.
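A sketch combining both directives; the sitemap URL and the delay value are placeholders, not values from the article:

    # Point robots at a sitemap file (hypothetical URL)
    Sitemap: https://example.com/sitemap.xml

    # Ask Coc Coc's robots to wait 5 seconds between consecutive requests
    # (values greater than 10 are not supported)
    User-agent: coccocbot
    Crawl-delay: 5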