
Three simple ways to handle dead links after a site upgrade

 

A site upgrade is an unavoidable stage in the life of every website, and how you handle the problems that come up during one is a test of any optimization specialist. Upgrading inevitably produces dead links, especially on sites with a large number of indexed pages. Dead links not only drag down the site's weight in the search engines; in large numbers they also stop the spiders from crawling the site smoothly, so that indexing and snapshots fall behind. How, then, should we properly deal with the dead links a revision generates? Today I will share three methods from my own experience.

 

One: block with robots.txt or use 301 redirects

This is by far the most widely used approach, but it runs into a problem when the site is very large. My own site had some 263,000 pages indexed before the revision; could these two methods really cope with that? Because the revision changed how the content is organized into levels, a simple robots.txt block was no longer an option, leaving only 301 redirects, and with that many indexed pages this is the most time-consuming and labor-intensive method of all. A sketch of both options follows.
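As a minimal sketch of the two options, assuming an Apache server and a hypothetical section /old-news/ that was renamed to /news/ in the revision (both paths are placeholders, not from the original article):

# robots.txt -- keep all spiders out of a section that no longer exists
User-agent: *
Disallow: /old-news/

# .htaccess -- 301-redirect each old URL to its new location (needs mod_rewrite)
RewriteEngine On
RewriteRule ^old-news/(.*)$ /news/$1 [R=301,L]

The robots.txt rule takes one line per dead section, which is why it is the quick fix; the rewrite rule preserves the traffic and weight of every old URL, which is why it is the thorough but laborious choice when the URL patterns of a large site have all changed.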

Two: point dead links straight to a 404 error page

This method is still very useful for a site with a large number of indexed pages: every dead link on the site jumps straight to the 404 error page, and the 404 page then guides the visitor into the site as it is after the revision. This reduces the traffic the revision would otherwise lose and lets users find the site again. As for the delay before the 404 page redirects, I think it should not be too short; eight to ten seconds is best, and the page should carry links that entice the visitor to click, as in the sketch below. Getting the user to click through is better than redirecting them outright.
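A minimal sketch of such a 404 page, assuming the redirect target is the home page (a placeholder choice); it should still be served with an HTTP 404 status so that spiders recognize the link as dead:

<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Page not found</title>
  <!-- redirect after 8 seconds, within the 8-10 second range suggested above -->
  <meta http-equiv="refresh" content="8;url=/">
</head>
<body>
  <h1>This page moved in our site revision</h1>
  <!-- visible links matter: a visitor who clicks reaches the new
       content at once instead of waiting out the redirect -->
  <p>You will be taken to the <a href="/">home page</a> in 8 seconds,
  or click the link to go there now.</p>
</body>
</html>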

 


Three: keep content updates steady

For a newly upgraded site, content updates matter a great deal: they not only attract the spiders to crawl again quickly after the upgrade, they also give the spiders fresh data to fetch in place of the old. Search engine spiders usually crawl breadth-first, moving through a site along its hyperlinks. So while updating content steadily we should also build a reasonable internal link structure, drawing the spiders in step by step so that the new content gets indexed and the stale entries drop away. The crawler sketch below shows the breadth-first pattern.
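To make the breadth-first claim concrete, here is a minimal sketch of such a crawler in Python, standard library only; https://example.com/ is a placeholder, not a site from the article. Every page one click from the start is fetched before any page two clicks deep, which is why new content linked from shallow, frequently crawled pages is picked up first, and why a dead link surfaces as a fetch error the moment the spider reaches it:

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    # collect the href of every <a> tag on a page
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl_bfs(start_url, max_pages=50):
    # a FIFO queue gives breadth-first order: a whole link "level"
    # is fetched before the crawler goes one click deeper
    host = urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([start_url])
    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            page = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError as err:
            print("dead link:", url, "-", err)  # 404s and timeouts land here
            continue
        parser = LinkParser()
        parser.feed(page)
        for href in parser.links:
            absolute = urljoin(url, href)
            # stay on the same host and skip anything already queued
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

if __name__ == "__main__":
    crawl_bfs("https://example.com/")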

In my view, for a site with a large number of indexed pages, the most effective and direct ways to treat the many dead links left by a revision are the second and third methods. Used together, they not only stem the traffic the revision would otherwise lose but also help the new content replace the old in the index.
