How to improve the indexing of a website

The concept ""index"" of the sphere of optimization and website promotion is mentioned in total with search engines. Process of indexing is a drawing up the peculiar dictionary arranged alphabetically and to numbers. Its structure allows to define importance of the page by search of a certain information.

You will need

  • a website;
  • Internet access.

Instructions

1. The pages of a resource are indexed by search engines, and each engine has its own algorithm for processing a particular website. The list of pages that appears on the Internet is produced in response to the specific query entered by the user.

2. Search robots are responsible for indexing websites, and they interpret users' search queries in their own way. To improve the indexing of a website, you need to create special instructions for these robots.

3. Improving the indexing of a website requires a number of measures. First of all, create a site map in HTML format. For this purpose it is enough to use a special automatic generator; see, for example: http://www.xml-sitemaps.com/

4. Upload the finished site map to the main directory, the root of your website. To check that it is available, type its address into the browser's address bar, replacing the site name with your own, for example: http://mysite.ru/sitemap.html. Make the site map visible to search engines: register it in the Yandex.Webmaster and Google Webmaster Tools services.

5. Manage search robots by means of the robots.txt file. Create it in an ordinary text editor; a sketch of such a file is shown below.
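
As a rough sketch, a minimal robots.txt that opens the whole site for indexing might look like this (the site map address is a placeholder to be replaced with your own):

    User-agent: *                          # applies to all search robots
    Disallow:                              # empty value: nothing is forbidden
    Sitemap: http://mysite.ru/sitemap.xml  # placeholder address of the site map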

6. You can give certain commands to all search robots. For example, Disallow is a directive that forbids indexing of individual sections of the website. Prohibit indexing of pages that have no value either for users or for search engines, for example technical ones, as in the sketch below.
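
For instance, technical sections could be closed from indexing like this (the section names here are only hypothetical examples; substitute your own):

    User-agent: *
    Disallow: /admin/   # administration panel
    Disallow: /tmp/     # temporary files
    Disallow: /search/  # internal search results pages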

7. To let the robot know about the structure of the website, add a directive that points to a map in the sitemap.xml format. For example:

    User-agent: Yandex
    Allow: /
    Sitemap: http://mysite.ru/site_structure/my_sitemap.xml
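
For reference, a minimal map in the sitemap.xml format might contain entries like these (the page addresses are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://mysite.ru/</loc>
      </url>
      <url>
        <loc>http://mysite.ru/articles/</loc>
      </url>
    </urlset>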

8. Allow the robot access to the whole website or to individual pages using the Allow directive:

    User-agent: Yandex
    Allow: /

9. Set Crawl-delay values. The robot is unlikely to visit your website constantly anyway, but this setting will help speed up the indexing of the resource:

    User-agent: Yandex
    Crawl-delay: 2 # sets a timeout of 2 seconds

10. To create a robots.txt file for Google, visit the "Google Webmaster Central" page. On the home page of the webmaster tools, select the required website.

11. In the "Site configuration" section, click "Crawler access". Then go to the tab for creating robots.txt, choose the default access settings for robots, and save the file to your computer. Go to the control panel of your website and upload the created file to the root directory.

12. Work with social bookmarking services; pay attention to these: bobrdobr.ru, memori.ru, moemesto.ru, myscoop.ru, rumarkz.ru, 100zakladok.ru, mister-wong.ru, bookmark.searchengines.ru.

13. Social bookmarks are, in effect, external links to your website. These services were created for the convenience of users: an account on such a resource lets you save links to articles and websites you liked.

14. It is like the Favorites button in a browser, except that an online resource lets users join groups based on their interests and access their bookmarks from any computer. An increase in the number of external links will make robots visit your site more often.

15. To speed up the indexing of a website, its pages need to be visited by robots more frequently. To achieve this, update the pages of the website regularly and fill them with new, unique information.

Author: «MirrorInfo» Dream Team
