
Afraid of falling under a filter? Afraid of a ban? Don't trust optimizers?
Read on and understand what you really need to fear, and what you don't.
Yandex Filters
The task of a search engine is to return, for each user query, the most relevant and genuinely interesting resources and pages.
In the struggle for clean search results, search engines have introduced a series of filters designed to weed out "dishonest" promotion methods.
Here we see a unity and struggle of opposites. On the one hand, the more links a site has from other resources, the higher Yandex values it; on the other hand, Yandex fights the purchase of links. In practice, the struggle targets only the most blatant abuses of otherwise acceptable methods. Yandex declares that it is against purchased links, but how can it verify them? And why do link exchanges operate successfully, while the many sites listed on them are not blocked? Because there is no other way to judge: if a site owner spends money on links, the site matters to him, and he will do everything to make it interesting and turn visitors into customers (conversion).
So, here is a partial list of known Yandex filters:
"Nepos" filter
This filter is applied for selling links and for link spam with unnatural links. It can be imposed on a whole site, on individual pages, or on specific links.
How it works: the influence of outgoing links from the site is nullified. How to avoid it: do not sell links, or sell them very cautiously.
Keyword density filter
This filter is applied to pages that contain too high a density of key phrases.
How it works: the pages are dropped from the Yandex index. How to avoid it: write texts for people, and keep density below 5-7%.
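Keyword density is simply the share of a page's words taken up by occurrences of the key phrase, which makes it easy to check before publishing. A minimal sketch in Python; the sample text and phrase are invented for illustration, and the 5-7% threshold is the one quoted above:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Share of the page's words taken up by occurrences of the key phrase."""
    words = re.findall(r"\w+", text.lower())
    phrase_words = re.findall(r"\w+", phrase.lower())
    if not words or not phrase_words:
        return 0.0
    n = len(phrase_words)
    # Count non-overlapping-or-overlapping exact matches of the phrase.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits * n / len(words)

text = "plastic windows are cheap, our plastic windows last for years"
print(keyword_density(text, "plastic windows"))  # 0.4, i.e. 40% -- far above 7%
```

Anything above roughly 0.05-0.07 for a single phrase is the kind of over-stuffed text this filter targets.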
Redirect filter
This filter is applied for the use of JavaScript redirects, which are typically used in doorways.
The filter catches such sites automatically and keeps them out of the index.
How it works: the site is banned.
How to avoid it: do not use JavaScript redirects.
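A doorway-style redirect is usually a `window.location` assignment or a meta refresh fired as soon as the page loads. The sketch below audits your own pages for such patterns; the regexes are simplistic heuristics of my own, not Yandex's actual detection logic, and the sample page is invented:

```python
import re

# Simplistic heuristics for common client-side redirects -- not
# the search engine's real detection logic.
REDIRECT_PATTERNS = [
    re.compile(r"window\.location(?:\.href)?\s*=", re.IGNORECASE),
    re.compile(r"""<meta[^>]+http-equiv\s*=\s*["']refresh["']""", re.IGNORECASE),
]

def looks_like_redirect(html: str) -> bool:
    """True if the page contains a common client-side redirect pattern."""
    return any(p.search(html) for p in REDIRECT_PATTERNS)

doorway = '<script>window.location.href = "http://example.com/real-page";</script>'
print(looks_like_redirect(doorway))  # True
```

Running a check like this over your templates is a cheap way to make sure no leftover redirect code invites a ban.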
Filter, "You last»
How does it: The site is not in issue at the request How to avoid: write unique " content . " Prohibit and prevent copying of their texts to other sites.
"Link text" filter
This filter is applied to pages whose content and title do not contain the search query.
How it works: Yandex shows them with the note "...found by link (link text)".
How to avoid it: include the key query in the page's title and content.
Affiliate filter
How the filter works: if the index contains a group of sites on similar topics, with the same or similar content and the same contact information (probably the same owner), then for a query to which they are all "relevant", only the single most relevant site of the group is shown.
Google Filters
The principles of promoting a site in Google do not differ greatly from those for Yandex. In fact, while working on promotion in Yandex, you promote the site in Google as well.
Based on observations of Google and our own experience optimizing websites, we can confidently describe the following typical Google filters:
"Sandbox"
The first filter that "young" sites face is the "sandbox".
It is applied to newly created sites with a history of less than six months. The sandbox filter keeps pages of young sites out of the search results for high-frequency queries; for low-frequency queries no restriction is observed. A site can stay in the sandbox for a long time, from three months or more. The exact mechanism of the filter is not known to anyone for certain. There are observations that it is not applied to all new sites; the filter's selectivity probably depends on the topic and on the site itself. A genuinely interesting site, filled with plenty of original texts, may never fall under this filter.
There are signs by which you can tell that a site is in the "sandbox":
- The site is indexed in Google and is regularly visited by the search robot
- A search for the domain gives correct results, with the correct titles, descriptions, etc.
- The site ranks fine for rare and unique phrases from the text of its pages
- However, it is not visible in the first hundred results for the queries it was optimized for.
To get out of the "sandbox", fill the site with more unique, original content. External links also influence how quickly a site leaves the "sandbox".
Filter "Bombing"
It is applied to links that all share the same anchor text. Google rightly considers this unnatural.
To avoid falling under this filter when buying links, use unique anchors.
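One way to spot this risk in your own backlink profile is to measure how concentrated your anchors are. A hypothetical sketch; the backlink list is invented for illustration:

```python
from collections import Counter

def top_anchor_share(anchors: list[str]) -> float:
    """Share of backlinks taken by the single most common anchor text.

    Values near 1.0 mean almost every link uses the same anchor,
    which is exactly the pattern this filter targets.
    """
    if not anchors:
        return 0.0
    counts = Counter(a.strip().lower() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

backlinks = ["buy windows", "buy windows", "buy windows", "window shop"]
print(top_anchor_share(backlinks))  # 0.75 -- 3 of 4 links share one anchor
```

There is no published threshold, so treat the number only as a warning sign: the closer it is to 1.0, the less natural the profile looks.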
Filter "More results"
If your pages are caught in the filter is "supplemental results", which means that they considered less worthy of attention. In this case, your pages will be displayed in the search results only if the "worthy" of pages missing. To exit from such a filter is usually quite a few inbound links.
Filter "Domain age"
This filter is similar to the "sandbox". Many SEOs believe that the older the domain, the more trust search engines place in it. There are no precise observations confirming this filter.
Filter "-30"
This filter Google applies to websites that use deceptive or black methods, such as door ways (doorway) and redirect (redirect) via JavaScript. When these mechanisms are found by search engines, issuing requests is reduced by 30 positions. In order to get out of this filter is sufficient to remove the inserts from these code pages.
Filter "duplicate content"
In the network there are plenty of sites with the same or similar content, ranging from doubles individual articles and pages to complete analogue site. Google wants to punish such sites and pages, and lowers them in the search results. Use only original content on the site. Make sure that the content is not copied to other unscrupulous site owners. If you find facts reprint information, you can complain to Google.
Filter "omitted results"
Page of the site may fall under such a filter and even actually fall out of search results for all the same reasons: a small number of incoming links, duplicate content from other sites, duplicate titles, and other "meta tags" as well as a weak or absent internal pages . So try to avoid these errors. Pay attention when you write meta tags, do not use the same descriptions and titles on different pages. It is better to abandon the use of tags: d E scription and keywords at all, than to write on all pages the same tags.
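Duplicate titles and descriptions are easy to catch before Google does: group your pages by tag value and flag any value shared by more than one URL. A minimal sketch; the page data is invented for illustration:

```python
from collections import defaultdict

def find_duplicate_tags(pages: dict[str, dict[str, str]]) -> dict[str, list[str]]:
    """Map each title/description value shared by several pages to the URLs."""
    seen = defaultdict(list)
    for url, tags in pages.items():
        for tag in ("title", "description"):
            value = tags.get(tag, "").strip().lower()
            if value:
                seen[(tag, value)].append(url)
    # Keep only values that appear on more than one page.
    return {f"{tag}: {value}": urls
            for (tag, value), urls in seen.items() if len(urls) > 1}

pages = {
    "/index.html":   {"title": "My Shop",  "description": "Best prices"},
    "/about.html":   {"title": "My Shop",  "description": "About us"},
    "/contact.html": {"title": "Contacts", "description": "Best prices"},
}
for problem, urls in find_duplicate_tags(pages).items():
    print(problem, "->", urls)
```

In this invented example the check flags the shared title on the index and about pages and the shared description on the index and contact pages; each flagged page needs its own wording.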
Filter "Links"
Often, a special page with links to "partner sites" is set up just for link exchange. Link exchange is an old way to improve positions in the search results, and search engines do not encourage it, since it does not make a site more interesting or informative. Today this method is ineffective, and it is better to abandon it.
Filter, "Co-citation"
Google follows the theme of "donors" who link to your site. For example, if there are links to a porn site or a site very distant subjects, Google lowers the weight of such a link or does not take into account. Also, when a large number of such links, it can damage your site's topics and, consequently, lower in search results on thematic inquiries. To avoid this, look for sites that buy links.
Filter "A lot of links at once"
The site falls under the filter when it is too sharply increasing number of external links. This can lead to a delay accounting reference and does not give quick results, but rather slow process. To avoid this, you should links gradually, in small portions and add new ones, as the indexing of links bought earlier.
Filter "Broken Links"
If an internal link leads to a page that does not exist on the site, search engine robot can not reach it. And could not put it in the index. It reflects badly on a result of delivery and overall performance quality of the site. Try to avoid such links, link to home page from all pages. Create and use the site map. Use the file Robot. txt , to decline to an index unnecessary to search for pages such as forums, contacts, administrative page of the site. Try to think of ways to navigate and structure of the site before you create it.
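You can catch such links before the robot does by comparing every internal href against the set of pages that actually exist. A minimal sketch using only the Python standard library; the sample pages and HTML are invented for illustration:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def broken_internal_links(html: str, existing_pages: set[str]) -> list[str]:
    """Internal hrefs that point to pages missing from the site."""
    parser = LinkCollector()
    parser.feed(html)
    return [h for h in parser.links
            if h.startswith("/") and h not in existing_pages]

site_pages = {"/index.html", "/about.html"}
html = '<a href="/about.html">About</a> <a href="/old-news.html">News</a>'
print(broken_internal_links(html, site_pages))  # ['/old-news.html']
```

Run over every page of the site, a check like this gives you the full list of dead internal links to fix or remove.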
The parameter "Loading"
This option does not apply to filters, but ignored by search engines. If the robot is a search engine for some time could not open the page of your site, it will not fall in the index and as a consequence, the results are issued. So do not overload the page information and the "scripts" to process that takes a long time. Break the "heavy" for a few pages of "light".
Common filter "degree of confidence"
Common filter includes all the above. It's called the "degree of confidence» (Google Trust Rank).The filter takes into account the following indicators:
- Age Site
- The number and credibility of incoming links
- The number of outbound links
- The quality of the internal navigation (relink)
Sites stuffed with key phrases at excessive density can fall under the over-optimization filter. So do not overdo it; everything should be in moderation.