The earliest algorithms, the crawlers of the first search engines (Wanderer and Aliweb, both dating from 1993), relied almost exclusively on metatags, but, as mentioned above, this procedure quickly proved unreliable: dishonest webmasters began padding their descriptions with fraudulent keywords. (Attracting visitors who are in fact seeking other information is a widespread fraudulent practice that search engines pursue and condemn ever more harshly; in English, this deception is known as spamming.)

Once search engines could no longer trust the information declared in a page's HTML, the question became how to improve the process: how do you measure the relevance of a website? These were key questions at the heart of the business, questions that set programmers and creatives to work. A crucial contribution came in the second half of the nineties. Two Stanford University students, Sergey Brin and Larry Page, decided to count the number of times a website was linked to from other sites; this variable, known as the inbound link, remains to this day one of the most important criteria for defining the relevance of a page.
With the inbound link, Brin and Page aimed to measure the behavior of the network, the comings and goings of users, the very flow of information: they assumed that a page about food would not recommend websites devoted to quantum mechanics. An inbound link is a kind of de facto recommendation and a way to gauge the popularity of a website. Yet the long-awaited variable did not remain infallible for long: very soon (because everything has happened very soon in the history of virtual space), dishonest webmasters managed to game the algorithm, creating farms of virtual links (link farms), spreading spam links, and so on.