
Wednesday, May 30, 2012

Relationship with search engines

Yahoo and Google offices

By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings. [22]
Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEO service providers. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), [23] was created to discuss and minimize the damaging effects of aggressive web content providers.
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, The Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. [24] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban. [25] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients. [26]
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. The major search engines provide information and guidelines to help with site optimization. [27] [28] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website, and it also provides data on Google traffic to the site. [29] Bing Webmaster Tools provides a way for site owners to submit a sitemap and web feeds, allows users to determine the crawl rate, and tracks how many of their pages the search engine has indexed.
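To make the sitemap submission concrete, here is a minimal sketch in Python that generates a sitemap following the sitemaps.org 0.9 protocol, the format these webmaster tools accept; the URLs and dates are hypothetical placeholders.

# Minimal sketch: generate a sitemap following the sitemaps.org 0.9 protocol.
# The URLs and dates below are hypothetical placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

for loc, lastmod in [
    ("https://www.example.com/", "2012-05-01"),
    ("https://www.example.com/about", "2012-04-15"),
]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc          # page address
    ET.SubElement(url, "lastmod").text = lastmod  # last modification date

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)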

History

Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit a page's address, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return the information found on the page to be indexed. [2] The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an indexer, extracts various information about the page, such as the words it contains and where they are located, as well as any weight for specific words, and all the links the page contains, which are then placed into a scheduler for crawling at a later date.
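As an illustration of the spider-and-indexer flow just described, here is a minimal sketch in Python using only the standard library. The seed URL is a placeholder, and a real crawler would add robots.txt handling, politeness delays, and error handling.

# Minimal sketch of the spider/indexer flow: fetch a page, extract its links
# (to schedule for later crawling) and its words (to index with crude weights).
import re
from collections import Counter
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkAndTextParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []   # URLs found on the page, queued for later crawling
        self.text = []    # visible text, fed to the indexer

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.text.append(data)

def crawl(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = LinkAndTextParser()
    parser.feed(html)
    # "Index" the page: count word occurrences as a crude relevance weight.
    words = re.findall(r"[a-z]+", " ".join(parser.text).lower())
    return Counter(words), parser.links

index, frontier = crawl("https://example.com/")  # hypothetical seed URL
print(index.most_common(5), frontier[:5])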


Site owners began to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. [3] The first documented use of the term was by John Audette and his company Multimedia Marketing Group, as documented by a web page from the MMG site from August 1997. [4]
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent meta data could and did cause pages to rank for irrelevant searches. [5] [unreliable source?] Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines. [6]
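To see why this signal was so easy to abuse, here is a small sketch in Python of an early-style indexer that trusts the keywords meta tag verbatim; the HTML page is a hypothetical example whose declared keywords have nothing to do with its actual content.

# Sketch: an early-style indexer that trusts the keywords meta tag verbatim.
# The page below is hypothetical; its declared keywords misrepresent the
# actual content, illustrating why this signal was easy to manipulate.
from html.parser import HTMLParser

PAGE = """<html><head>
<meta name="keywords" content="cheap flights, hotels, casino, loans">
<title>My cat photo album</title>
</head><body>Pictures of my cat.</body></html>"""

class MetaKeywordsParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "keywords":
            self.keywords = [k.strip() for k in a.get("content", "").split(",")]

parser = MetaKeywordsParser()
parser.feed(PAGE)
print(parser.keywords)  # ['cheap flights', 'hotels', 'casino', 'loans']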
By relying so heavily on factors such as keyword density, which were within a webmaster's exclusive control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure that their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine are determined by its ability to produce the most relevant results for any given search, allowing those results to be false would turn users toward other search sources. Search engines responded by developing more complex ranking algorithms that take into account additional factors that are more difficult for webmasters to manipulate. Graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. [7] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a page with a higher PageRank is more likely to be reached by the random surfer.
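A sketch of one common published form of this idea: with damping factor d (typically 0.85), N total pages, B(p) the set of pages linking to p, and L(q) the number of outbound links on page q, PageRank can be written as

PR(p) = \frac{1 - d}{N} + d \sum_{q \in B(p)} \frac{PR(q)}{L(q)}

The damping factor models the random surfer occasionally abandoning the link chain and jumping to a random page, which keeps the scores well defined even for pages with no inbound links.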
Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design. [8] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links, and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming. [9]
By 2004, search engines had incorporated a wide range of undisclosed factors into their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals. [10] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. SEO practitioners such as Rand Fishkin, Barry Schwartz, Aaron Wall, and Jill Whalen have studied different approaches to search engine optimization, and have published their opinions in online forums and blogs. [11] [12] SEO practitioners may also study patents held by various search engines to gain insight into the algorithms. [13]
In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users. [14] In 2008, Bruce Clay said that "ranking is dead" because of personalized search: it would become meaningless to discuss how a website ranks, because its rank would potentially be different for each user and each search. [15]
In 2007, Google announced a campaign against paid links that transfer PageRank. [16] On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting through use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Google Bot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting. [17] As a result of this change, the use of nofollow leads to evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript. [18]
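For concreteness, a nofollow link is simply an anchor tag carrying rel="nofollow". Here is a small sketch in Python of how a crawler that honors the attribute might separate followed links from nofollowed ones; the HTML snippet is hypothetical.

# Sketch: separate followed links from nofollowed ones, as a crawler that
# honors rel="nofollow" might. The HTML snippet is hypothetical.
from html.parser import HTMLParser

PAGE = """<body>
<a href="/about">About</a>
<a href="https://example.com/ad" rel="nofollow">Sponsored</a>
</body>"""

class NofollowParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href")
        if not href:
            return
        # rel may hold several space-separated tokens, e.g. "nofollow noopener"
        if "nofollow" in a.get("rel", "").split():
            self.nofollowed.append(href)  # no PageRank passed
        else:
            self.followed.append(href)    # eligible to pass PageRank

p = NofollowParser()
p.feed(PAGE)
print(p.followed, p.nofollowed)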
In December 2009, Google announced that it would be using the web search history of all its users in order to populate search results. [19]
Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators had spent months or even years optimizing a website to increase its search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results. [20]
In February 2011, Google announced the "Panda" update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice; however, Google implemented a new system that punishes sites whose content is not unique. [21]
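Google has not published how Panda identifies duplicated content, but a standard textbook technique for near-duplicate detection is word shingling combined with Jaccard similarity. A minimal sketch in Python, using two hypothetical snippets of text:

# Minimal sketch of near-duplicate detection via word shingles and Jaccard
# similarity. A textbook technique, not Google's actual Panda method.
def shingles(text, k=3):
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

original = "search engines reward pages that offer unique useful content"
copied   = "search engines reward pages that offer unique copied content"

score = jaccard(shingles(original), shingles(copied))
print(f"similarity: {score:.2f}")  # about 0.56 here; identical texts score 1.0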

