Google Panda is a search algorithm introduced by Google on February 24, 2011, to filter sites in search results. It was created to rank sites and weed out low-quality pages: those filled with non-unique text, abundant spelling and stylistic errors, and duplicate or unreliable information. Thanks to the filter, users are shown the sites with the most relevant content, while pages created purely for aggressive SEO promotion are excluded.
Filter development
Initially, the filter applied only to users in the USA. According to reports from SEO specialists, the innovation led to a significant reduction in traffic from the Google search engine: about 12% for large sites and almost 50% for small resources. Roughly 10% of sites with small and medium traffic fell under the filter and were banned from the system. After testing the Panda algorithm, Google rolled it out in other countries.
In Russia, for example, site filtering began on April 24, 2012, and led to similar results: the drop of some sites in search results reached 50 positions. Like Google's other search algorithms, Google Panda was developed in secrecy, and details were not disclosed to ordinary users or webmasters. At the moment only two of its developers are publicly known, Matt Cutts and Amit Singhal; other details are withheld in line with official Google policy.
Until 2016, a site could be released from the filter only after its next update, and updates came out roughly once every two months. In total, about 30 updates accumulated after the algorithm's launch.
Since January 2016, the Google Panda filter has been built into the search engine's core ranking algorithm. For webmasters this change brought clear benefits: a site is now filtered, or released from the filter, on the fly, with no need to wait for the next update.
What does the algorithm take into account?
The new filtering algorithms are designed to improve the quality of search results for the end user. Sites with low-quality content are much less likely to make it into the top 10. The main criteria are the reliability and usefulness of the information on a page: a useful site has a much higher chance of ranking at the top for its keywords. So far, several aspects have been reliably identified that the Google Panda filter responds to first of all:
· uniqueness of content (see the sketch after this list);
· link profile quality;
· behavioral factors (user behavior).
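Google has never published how these signals are computed, but the content-uniqueness criterion can be illustrated with a standard near-duplicate technique: word shingling plus Jaccard similarity. The Python sketch below is a toy model under that assumption; the sample texts and any flagging threshold are hypothetical, and this is not Google's actual method.

```python
# Toy near-duplicate check: word shingles + Jaccard similarity.
def shingles(text: str, k: int = 3) -> set:
    """Split text into overlapping k-word shingles."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity |A & B| / |A | B| between shingle sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical pages: an original text and a lightly rewritten copy.
original = "google panda filters out low quality pages from search results"
rewrite = "google panda filters out low quality sites from search results"

score = jaccard(shingles(original), shingles(rewrite))
print(f"shingle similarity: {score:.2f}")
# The closer the score is to 1.0, the less unique the second page is;
# any concrete flagging threshold would be an assumed tuning parameter.
```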
To avoid falling under the Google Panda filter, a site should contain only unique, well-structured texts with a minimum of spelling and stylistic errors. Posted links should not lead to blocked or rarely visited sites; a basic outbound-link audit is sketched below. A distinctive feature of the Google search engine is that it takes behavioral factors into account, and indirect signs lead experts to conclude that this factor is becoming more significant. Metrics such as the number of click-throughs, time spent on the site, and citations are considered.
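The advice about outbound links lends itself to a mechanical check. Here is a minimal audit sketch, assuming the third-party `requests` library and a hypothetical list of a site's outbound URLs; it only detects dead or error-returning targets and does not reproduce Google's link-quality signal.

```python
import requests

# Hypothetical outbound links collected from a site's pages.
outbound_links = [
    "https://example.com/good-page",
    "https://example.com/removed-page",
]

def audit_links(urls: list, timeout: float = 5.0) -> list:
    """Return URLs that fail to load or answer with an error status."""
    suspect = []
    for url in urls:
        try:
            resp = requests.head(url, timeout=timeout, allow_redirects=True)
            if resp.status_code >= 400:
                suspect.append(url)
        except requests.RequestException:
            suspect.append(url)  # unreachable host, timeout, etc.
    return suspect

for url in audit_links(outbound_links):
    print("review or remove:", url)
```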
How to remove a site from the filter
The restrictions imposed by the Panda filter are designed to force webmasters to pay more attention to the content of their pages. To get out from under the filter, you need to:
· post unique content - texts, images and videos;
· update content regularly;
· reduce part of the link mass by removing low-quality backlinks;
· remove irrelevant content, advertising, and links;
· optimize the interface and structure of the site for user convenience;
· improve behavioral factors (a simple way to monitor them is sketched after this list).
Resources that link back to the sources of borrowed content can operate normally without falling under the filter.
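Behavioral factors can at least be monitored on the webmaster's side. The sketch below aggregates two common proxies, average time on site and bounce rate, from a hypothetical analytics log; the record format is an assumption, and the real weighting of such signals is not disclosed.

```python
from statistics import mean

# Hypothetical analytics records: (session_id, pages_viewed, seconds_on_site).
sessions = [
    ("s1", 1, 12),
    ("s2", 4, 230),
    ("s3", 2, 95),
    ("s4", 1, 8),
]

# Average time on site across all sessions.
avg_time = mean(seconds for _, _, seconds in sessions)

# Bounce rate: share of sessions that viewed only a single page.
bounces = sum(1 for _, pages, _ in sessions if pages == 1)
bounce_rate = bounces / len(sessions)

print(f"average time on site: {avg_time:.0f} s")
print(f"bounce rate: {bounce_rate:.0%}")
# Short sessions and a high bounce rate would suggest weak behavioral
# signals in this toy model; the actual thresholds are unknown.
```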
Google representatives do not disclose the exact threshold at which link mass becomes critical, so webmasters should focus on creating unique content. According to some subjective estimates, the filter's accuracy is about 90–95%, so some resources may be lowered in search results undeservedly. In such cases you should consult technical support representatives and, if necessary, take appropriate measures.
Google did not stop improving search with Panda; further updates came with the Penguin algorithm.