How to relaunch a website keeping rankings and SERP positions
Lately, as I recently noticed, most of the interest in the topic "how to relaunch a website" comes from Germany, though I don't think the topic is only relevant there. The fact is: every website relaunch means losses. Losses of traffic, of rankings, of SERP positions, which in the end mean losses of money. I've seen losses of 4% and of 40%, and those are by no means the upper limit. The main objective of SEO in a relaunch is to minimize these negative impacts.
In short, every website relaunch is a change of design and/or content structure. If the content structure changes, the URL structure inevitably changes with it. So there are generally two areas where the new website version can meet poor acceptance and lose traffic as a result:
- users could dislike the new design
- search engines could get picky about indexing the new URL structure, and some pages could drop out of searchers' reach.
Our main principle should be:
to win a battle, avoid it.
How to handle a design relaunch
Switching the new design on and then fighting a battle to convince visitors that the new design is better is unproductive. I recommend not confronting your visitors with a completely new design from one day to the next. Let your visitors decide which design improvements are better, and the better improvements are the ones your visitors like :) Run A/B tests with small design improvements and make the winning ones permanent (a minimal sketch of such a test follows this list). This way you profit three times over:
- you get a continually better-converting website,
- you completely avoid scaring your visitors away with a sudden new design, and
- you completely avoid the stress of a time-pressed relaunch.
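How such a small, permanent A/B test can be switched on is sketched below in plain client-side JavaScript. The cookie name, the 50/50 split and the CSS class are illustrative assumptions, not the API of any particular testing tool; in practice you would also report the assignment to your analytics and measure real conversion goals.

// Minimal A/B assignment sketch: remember the variant in a cookie and expose it as a CSS class.
// The cookie name "designVariant", the 50/50 split and the class prefix are assumptions.
(function () {
  var match = document.cookie.match(/(?:^|; )designVariant=([^;]*)/);
  var variant = match ? match[1] : (Math.random() < 0.5 ? "control" : "test");
  if (!match) {
    // Keep the assignment stable for returning visitors (30 days).
    document.cookie = "designVariant=" + variant + "; path=/; max-age=" + 60 * 60 * 24 * 30;
  }
  // The stylesheet can now switch the small design improvement on, e.g. via .variant-test rules.
  document.documentElement.className += " variant-" + variant;
  // Report the variant to your analytics tool here, e.g. as a custom dimension or event.
})();

The important point is that each test changes one small thing at a time, so a losing variant can simply be rolled back instead of forcing another relaunch.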
How to handle a relaunch of the content structure
Let's talk about the case where the URL structure of the whole website changes. I will call the site with the new URL structure "the new version":
- Every old URL must be redirected with a 301 redirect to the corresponding new one (a minimal redirect sketch follows the note further down).
- Try to change as few URLs as possible; change a URL only if there is no other way.
- Create a URL map: a spreadsheet with the old URLs and their corresponding new ones. Such a list is very helpful, e.g. when you hit 404s after the relaunch, and the developer who creates all the redirects will be thankful for it too. Identify old URLs that have no counterpart in the new version, and new URLs that have no counterpart in the old version.
- If there are old URLs without corresponding URLs in the new version, try to create content pages for them.
- If there are new URLs without a counterpart in the old version, try to get external backlinks to them.
- For old URLs that definitely remain without a new counterpart, create helpful 404 pages and make sure they return a correct 404 header response.
- Keep sitemaps free of redirects: the old site's sitemap lists only old URLs, the new site's sitemap only new URLs.
- Don't forget to update the canonical tags so they point from the old pages to the new ones, in sync with creating the 301 redirects.
- If the new version uses a different protocol than the old one (http→https) and/or different document extensions (html→php etc.), don't forget to implement these changes in the new URL structure and the redirects too.
- If the relaunching site is so big that a complete URL mapping isn't feasible, and the URL structure is so irregular that redirects can't be created with regular expressions and patterns, you must decide which pages get manually created 301 redirects and which remain without. Look into your Google Analytics data (at least the last year) and select for manual 301 redirects the pages with:
- the most social signals,
- the highest PR,
- the most conversions,
- the most entries and visits,
- the most pageviews, or
- the highest commercial value.
- Another voluminous but necessary task ahead is to re-establish the internal linking structure based on the new URLs (in addition to the new menu structure, of course).
- To keep things clean, create a new Google Analytics ID and a new Google Webmaster Tools ID for the new version and implement them just before activating the 301 redirects.
- Don't rush! Keep the old version of the URLs online and fully available to Google after the 301 redirects are activated. For how long? At least until the new URLs are indexed. After they are indexed you can begin to... no, not delete the old version! Once you have made sure the new URLs are indexed, set the corresponding old URLs to noindex and nofollow via robots.txt.
NOTE!
Using robots.txt makes sure Google understands that these URLs are devalued, but they won't disappear from the index instantly.
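How the URL map drives the 301 redirects depends entirely on your stack. The following is only a minimal sketch assuming a Node.js/Express server and the spreadsheet exported as a JSON file named url-map.json (both are assumptions); the same mapping can just as well be expressed as Apache or nginx rewrite rules.

// Minimal 301 redirect sketch driven by an old-to-new URL map (Express and file name assumed).
const express = require("express");
const app = express();

// Exported from the URL map spreadsheet, e.g. { "/old-category/page.html": "/new-category/page/" }
const redirectMap = require("./url-map.json");

app.use((req, res, next) => {
  const target = redirectMap[req.path];
  if (target) {
    // 301 (permanent) so search engines transfer the old URL's signals to the new one.
    return res.redirect(301, target);
  }
  next(); // old URLs without a counterpart fall through to the normal 404 page
});

app.listen(8080);

Whatever the implementation, each old URL should answer with exactly one 301 hop to its final new URL, not with a chain of redirects.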
- If you want to speed up Google's indexing routine... you can try, but only manually :) Go into your Google Webmaster Tools; under Crawl you will find the option "Fetch as Google". Click it, enter the new URL you want indexed into the text field, then click "Fetch" and "Index".
- If you want to check whether and how many of your new URLs are already indexed, perform a Google search for site:your-domain.tld and use this bookmarklet to extract the search results (save the following code as a bookmark in your browser and click it while on the Google search results page):
javascript:(function(){output="<html><head><title>High%20Position%20SERP%20Link%20Generator</title></head><body>";output+="";pageAnchors=document.getElementsByTagName("a");var%20linkcount=0;var%20linkLocation="";var%20linkAnchorText="";output+="<table><th>ID</th><th>Link</th><th>Anchor</th>";for(i=0;i<pageAnchors.length;i++){var%20anchorText%20=%20pageAnchors[i].textContent;var%20anchorLink%20=%20pageAnchors[i].href;var%20linkAnchor%20=%20anchorLink%20+%20"\t"+anchorText;var%20anchorID%20=%20pageAnchors[i].id;if(anchorLink!=""){if(anchorLink.match(/^((?!google\.|cache|blogger.com|\.yahoo\.|youtube\.com\/\?gl=|youtube\.com\/results|javascript:|api\.technorati\.com|botw\.org\/search|del\.icio\.us\/url\/check|digg\.com\/search|search\.twitter\.com\/search|search\.yahoo\.com\/search|siteanalytics\.compete\.com|tools\.seobook\.com\/general\/keyword\/suggestions|web\.archive\.org\/web\/|whois\.domaintools\.com|www\.alexa\.com\/data\/details\/main|www\.bloglines\.com\/search|www\.majesticseo\.com\/search\.php|www\.semrush\.com\/info\/|www\.semrush\.com\/search\.php|www\.stumbleupon\.com\/url|wikipedia.org\/wiki\/Special:Search).)*$/i)){if(anchorID.match(/^((?!hdtb_more|hdtb_tls|uh_hl|gb36).)*$/i)){linkLocation+=anchorLink+"<br%20/>";linkAnchorText+=anchorText+"<br%20/>";linkcount++;if%20(anchorText%20===%20undefined)%20anchorText%20=%20pageAnchors[i].innerText;output+="<tr>";output+="<td>"+linkcount+"</td>";output+="<td>"+pageAnchors[i].href+"</a></td>";output+="<td>"+anchorText+"</td>";output+="</tr>\n";}}}}output+="</table><br/><h2>URL%20List</h2><div>";output+=linkLocation;output+="</div><br/><h2>Anchor%20Text%20List</h2><div>";output+=linkAnchorText;output+="";with(window.open()){document.write(output);document.close();}})();
Additionally, if you want to sleep easy (and you do, right?), do it the secure way:
- lock every page of the site with the new URL structure with meta robots so that no one can peek at it prematurely (meta robots ensures that any premature indexing is completely ruled out),
- activate your 301 redirects,
- take the crawling tool of your choice, e.g. Screaming Frog,
- give Screaming Frog the list of old URLs and check that the number of URLs is correct, and
- let the crawler work through your site with the new URL structure from end to end, then review and fix errors such as missing 301 redirects, 404s, a wrong Analytics implementation, etc. (a minimal verification sketch follows this list).
- Only after this procedure should you remove the blocking meta robots.
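If you want to script this verification yourself, in addition to or instead of a crawler, here is a minimal sketch assuming Node.js 18+ with its built-in fetch and a plain text file old-urls.txt containing one absolute old URL per line (the file name and output format are assumptions for illustration).

// Minimal redirect verification sketch: every old URL should answer 301,
// and the redirect target should resolve to a 200.
const fs = require("fs");

const oldUrls = fs.readFileSync("old-urls.txt", "utf8")
  .split("\n").map(function (line) { return line.trim(); }).filter(Boolean);

(async () => {
  for (const url of oldUrls) {
    // redirect: "manual" shows the 301 itself instead of silently following it.
    const first = await fetch(url, { redirect: "manual" });
    if (first.status !== 301) {
      console.log("MISSING 301: " + url + " answered " + first.status);
      continue;
    }
    const target = first.headers.get("location");
    const final = await fetch(new URL(target, url));
    if (final.status !== 200) {
      console.log("BROKEN TARGET: " + url + " -> " + target + " answered " + final.status);
    }
  }
})();

Every line this script prints is a redirect you still have to fix before taking the blocking meta robots away.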