The amount of information stored on the Internet is enormous, and finding anything in it by hand is impossible. Search engines were designed to automate this process: they are computer systems that systematize data and answer search queries.
1. Programs called bots run continuously on the servers of search engines. "Bot" is short for "robot", and in their behavior they really do resemble robots. Periodically visiting every site on a list stored on the server, they bring their local copies of all texts into line with the current versions of those texts on the web pages. Bots follow every link they encounter, and if they find a newly created page, they add it to the list and create a local copy of it too. The copies are never published on the Internet; they are only an internal part of the process of building the list of websites, so no copyright infringement occurs.
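The crawling process described above can be sketched as a breadth-first traversal. This is a minimal illustration, not a real bot: the `WEB` dictionary with its `*.example` addresses is an invented in-memory stand-in for pages that a real bot would fetch over HTTP.

```python
from collections import deque

# Hypothetical in-memory "web": address -> (page text, outgoing links).
# A real bot would download these pages; the dict stands in for that.
WEB = {
    "a.example": ("home page", ["b.example", "c.example"]),
    "b.example": ("about us", ["a.example"]),
    "c.example": ("news feed", ["d.example"]),
    "d.example": ("archive", []),
}

def crawl(start):
    """Visit each known page, store a local copy of its text,
    and follow every link, queuing newly discovered pages."""
    copies = {}                  # address -> local copy of the text
    queue = deque([start])
    while queue:
        url = queue.popleft()
        if url in copies:        # already visited, skip
            continue
        text, links = WEB[url]
        copies[url] = text       # keep the local copy up to date
        queue.extend(links)      # follow all links found on the page
    return copies

index = crawl("a.example")
print(sorted(index))             # all four pages are discovered
```

Starting from a single address, the bot reaches every page connected by links, which is exactly how newly created pages end up on the list.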
2. Try entering the same phrase into the same search engine several times. You will find that the results appear in the same order every time; the order changes rarely, no more than about once a day. The reason is simple: the order of the search results is determined by a rather complex algorithm. It takes into account how frequently the given words occur on each page, the number of links to that page from other websites, and several other factors.
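A deliberately simplified version of such a ranking formula might combine just the two factors the paragraph names, word frequency and inbound links. The weight of 2 for links is invented for illustration; real engines use many more factors and far more elaborate weighting.

```python
def score(page_text, inbound_links, query):
    """Toy ranking score: query-word frequency on the page
    plus a (made-up) double weight for links from other sites."""
    words = page_text.lower().split()
    tf = sum(words.count(w) for w in query.lower().split())
    return tf + 2 * inbound_links

# Hypothetical pages: address -> (text, number of inbound links).
pages = {
    "a.example": ("search engines index the web", 1),
    "b.example": ("search search search", 0),
    "c.example": ("how web search works", 5),
}

query = "web search"
ranked = sorted(pages, key=lambda u: score(*pages[u], query), reverse=True)
print(ranked)  # the well-linked page outranks the keyword-stuffed one
```

Note how `b.example`, which merely repeats a query word, still loses to `c.example`, which other sites link to; this is why inbound links matter in the ordering.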
3. Website owners, seeking to push their resources to the top of this list, optimize the texts placed on them. Such optimization may be "white" (explicitly permitted by the search engines' rules), "gray" (not permitted, but not forbidden either), or "black" (explicitly forbidden). In the latter case the website may soon disappear from the list forever. Optimization algorithms are often more complex than the algorithms that sort the search results.
4. After a keyword or phrase is entered, the program on the server looks for matches in all the local copies of the texts. The results are then sorted by the complex algorithm described above, and a content management system automatically generates the results page, which is sent to the browser. At the user's request, subsequent pages of the list can be generated as well: the second, the third, and so on.
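The query-handling steps above, matching against local copies, sorting, and returning one results page at a time, can be sketched as follows. The `copies` data and the relevance measure (plain word counts) are invented for illustration.

```python
def search(copies, query, page=1, per_page=2):
    """Find local copies containing every query word, sort them by a
    toy relevance score (word counts), and return one results page."""
    words = query.lower().split()
    hits = [url for url, text in copies.items()
            if all(w in text.lower() for w in words)]
    hits.sort(key=lambda u: sum(copies[u].lower().count(w) for w in words),
              reverse=True)
    start = (page - 1) * per_page
    return hits[start:start + per_page]   # second, third page on request

# Hypothetical local copies built earlier by the bots.
copies = {
    "a.example": "search engines automate search",
    "b.example": "manual search is impossible",
    "c.example": "cooking recipes",
}
print(search(copies, "search", page=1))
```

Asking for `page=2` with the same query returns the next slice of the sorted list, mirroring how the server generates subsequent pages only when the user requests them.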