Post by afrina022 on Nov 22, 2023 23:40:40 GMT -5
In this case robots see that the main mirror is the URL without the www prefix, and even if users enter the address with www, they will automatically be redirected to the main mirror. Keep in mind that Google and a number of other search engines do not support this directive: for them, the main mirror must instead be set in your Google Search Console account.

Crawl-delay: a handbrake for the robot

Using this directive we can limit the time between the robot's requests to the site during one session. It was designed for sites that run on very weak servers: when the indexing robot arrives, the resource starts to freeze or even returns a 5xx server response. The syntax looks like this: Crawl-delay: N, where N is the time in seconds between the bot's requests to the site. Search engines themselves recommend using this directive only in extreme cases.

Special characters

To simplify composing the file, and to speed up how robots interpret it, special characters are used. The first and most important one is * (asterisk). It means that any number of any characters can stand in its place. How can this be applied in our case? For example, we have pages like site.ua/catalog/…, site.ua/catalog/…, site.ua/catalog/…
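To see a Crawl-delay rule in action, here is a minimal sketch using Python's standard-library robots.txt parser. The site name (site.ua), the /catalog/ path, and the 10-second delay are made-up examples, not values from this post. Note one caveat: the stdlib parser treats Disallow paths as plain prefixes and does not implement the * wildcard the way Googlebot does, so the rule below uses a simple prefix.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for an imaginary site
# (the domain, path, and delay value are illustrative)
robots_txt = """\
User-agent: *
Crawl-delay: 10
Disallow: /catalog/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The polite interval between two requests by the bot, in seconds
print(rp.crawl_delay("*"))

# URLs under the disallowed prefix are blocked; everything else is allowed
print(rp.can_fetch("*", "https://site.ua/catalog/page1"))
print(rp.can_fetch("*", "https://site.ua/about"))
```

A well-behaved crawler would read crawl_delay() and sleep that many seconds between requests, which is exactly the "handbrake" effect described above.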