☠ Undead SEO ☠ 
Advanced SEO: tools, techniques, thoughts

Wednesday, June 17, 2020

Merging and advanced filtering of GSC and GA data with Search Console Helper

<tl;dr>

Search Console Helper merges GSC and GA data and applies multiple condition filters

This is not an ad!
It is a kind recommendation to all of my colleagues - the tool is outstanding; I have been waiting for something like it for as long as I have been doing SEO!

Search Console Helper allows, among other things:
  • Working with up to 5,000 rows of daily data,
  • Working with 24 months of GSC data (not just the 16 months GSC itself allows),
  • Filtering GSC data with multiple include and exclude conditions for keywords and URLs, with support for regular expressions,
  • One-click merging of GSC and Google Analytics data,
  • Exporting data in any way you need,
  • and much more unique stuff you can't imagine ;)
</tl;dr>

Let us look at what is happening with Google Search Console, previously Google Webmaster Tools, and the market around it.

Monday, June 1, 2020

Adsterra aka Adsetica Ad Network test: 100% fraud traffic, don't touch it!

I selected Adsterra, also known as Adsetica, as an ad network to test from the publisher role, because of their claim to run CPA campaigns for Chrome extensions. The campaign turned out to be 100% fraud traffic - read on to find out why.

Tuesday, October 16, 2018

Google Ads manager account (MCC) hacks your Chrome browser and shares your personal data

I suddenly stumbled upon a funny bug/feature of the current stable Chrome 69.0.3497.100 (Official Build) (64-bit) combined with a Google Ads manager account:


  1. If several people log into the MCC one after another, each from their own Chrome,
  2. If personal data syncing is ON in their Chrome,
  3. If none of them has set a sync passphrase...


then:

the user who currently logs into the MCC gets all of their personal data (bookmarks, passwords - yes, passwords!) synced into the Chrome of the user who was logged into the MCC previously.

Monday, August 27, 2018

Testing structured data of the current URL with one click

Any SEO knows about the benefits of structured data. Google offers a special Structured Data Testing Tool to check the existence and quality of the structured data at a given URL.

But what if you want to test many URLs? Copy the URL, switch tabs, go to the Google Structured Data Testing Tool, paste the URL, press Enter. And again, and again...? Annoying!

I've coded a solution for this kind of routine task.
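As a rough sketch of the idea (my reconstruction, not the original code - whether the testing tool accepts the URL in a #url= fragment like this is an assumption), a bookmarklet can send the current page to the tool in one click:

    javascript:window.open('https://search.google.com/structured-data/testing-tool#url='+encodeURIComponent(location.href));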

Monday, May 28, 2018

What's wrong with image dimensions?

Yes, what's wrong? I'll tell you - pretty much everything. I've done a little study for our firm on the state of image optimization in the times of mobile first. And you know what? I was disgusted twice:
  1. roughly 85% of images are oversized in file size - this is no dark secret, we've gotten used to it by now,
  2. roughly 20% of images are oversized in terms of dimensions: the original images are larger than the dimensions they are displayed at (you can check this yourself with the console sketch below).
Read on for the study setup, detailed results, toolchain and scripts.
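As a quick self-check (a minimal sketch of my own, not the study's toolchain), you can compare natural and displayed image sizes right in the browser console:

    // List images whose natural size exceeds their displayed size.
    [].forEach.call(document.querySelectorAll('img'), function (img) {
      if (img.naturalWidth > img.clientWidth || img.naturalHeight > img.clientHeight) {
        console.log(img.src, img.naturalWidth + 'x' + img.naturalHeight,
          'displayed at', img.clientWidth + 'x' + img.clientHeight);
      }
    });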

Friday, March 16, 2018

German Google News kicks asses

According to the German version of the Google News Help:

  • Google News doesn't exist for Germany (Germany isn't in the list of countries where Google News is available),
  • there is a country in Asia called Amsterdam, as on the screenshot:


Google, please, don't outsource to India!

Monday, March 5, 2018

Screaming Frog 9.0 exports screenshots, original and rendered HTML code

If the JavaScript rendering and HTML storing/rendering options are activated, you are able to export screenshots of rendered URLs, as well as their original and rendered HTML code:

Export screenshots and source code of crawled URLs in Screaming Frog 9.0

Screaming Frog 9.0 renders websites with Chrome 60

A new but not obvious thing in Screaming Frog 9.0 is the rendering engine. You should know that it isn't Chrome 41 (41.0.2272.118), which is used by Google's web rendering service (WRS). So Screaming Frog can't render websites 100% like Googlebot - but approximately identically ;)

It is Chrome 60 (60.0.3112.113) instead. This version was chosen by the Screaming Frog developers as more stable AND with the fewest rendering differences from Chrome 41. You can compare both versions in detail at https://caniuse.com/#compare=chrome+41,chrome+60.

According to Screaming Frog support, Chrome 60 was selected for its stability with the spider engine at scale, rather than because of rendering differences.

You can see the current Chrome version in the debug window:

Screaming Frog 9.0 uses Chrome 60 as its rendering engine. The Chrome version is shown in the debug window.

Wednesday, October 4, 2017

How to disable Excel preview in Windows Explorer on Windows 7 and Windows 10.

Previews of Excel tables, especially bigger ones, can be pretty hard on system performance. Some resources recommend using the Windows Explorer settings - but that way one can only disable ALL previews.

Every file extension has a Preview Handler associated with it. To disable only Excel previews, but keep all the others, which aren't so performance-hungry (like previews of images or PDFs), one should edit (rather than delete) one key in the Windows registry.


  • Start the registry editor: Start → Run → regedit
  • Under HKEY_CLASSES_ROOT\.xlsx\ShellEx\ find the key called {8895b1c6-b41f-4c1c-a562-0d564250836f}
  • Edit it (I don't recommend deleting it), as on the screenshot:

How to disable Excel preview in Windows Explorer under Windows 7 / Windows 10

  • Close regedit,
  • Restart Windows Explorer: (Start → Run → type cmd → Enter → type taskkill /f /im explorer.exe → Enter → type explorer.exe → Enter)
  • Enjoy (tested on Windows 7 and Windows 10)

Friday, August 11, 2017

Three perfect image optimizers for SEO

So what is a perfect image optimization tool for SEO?

For me the perfect image optimization tool must have the following features:
  • It is not some hipster online tool, but good old installable software,
  • It is free - not some shitty shareware that claims to be free but indeed isn't,
  • The user can set the optimization level,
  • The tool must do batch image compression - real SEO ninjas don't have much time,
  • It must, no - it MUST understand nested folders and
  • it MUST be able to save optimized images in the same nested folder/subfolder structure,
  • It, again, MUST be able to save optimized images without changing the file type,
  • It should be able to resize images while preserving the width/height ratio.
  • And, of course, it must do its compression job well ;)
Do I want too much? No, I believe it isn't too much.

I will not explain why image compression is substantially important for your website's rankings. Just one sentence: images are the heaviest static asset type on ANY website. You still don't want to compress them? No problem, as you wish. But why are you still here then?

For the non-ignoramuses among us: yes, such tools actually exist - they aren't just the stuff of SEO fairy tales. I know three tools that are able to accomplish every single task I listed above. Want to know more? Let's go →

Friday, August 4, 2017

HowTo guide: filter Search Console for multiple values at once

Setting multiple filters in Google Search Console is possible!


Yes, despite contrary answers from some experts in the Google product forums, like here or there. The possibility to set multiple filters is extremely useful and works in both directions - you can include or exclude multiple keywords, URLs, traffic sources - anything you want.

We will operate on the filter settings through manipulation of the URLs. For this you'll need a good code editor - good means Notepad++ - one magic plugin for it and, surely, a bit of chutzpah - yeah, we'll take on the big G, nothing less!

Saturday, June 3, 2017

Free online SEO tools for TF-IDF calculation

First of all: what is TF-IDF? In German this content metric is called WDF/IDF, but the subject is the same:

TF-IDF, term frequency-inverse document frequency, is, explained briefly, from rookie to rookie, the ratio of a term's usage frequency in the given document to the usage frequency of this term across all documents containing it.

This ratio mirrors how relevant the given document is for the given term within the set of all documents containing this term.
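To make the definition concrete, here is a minimal JavaScript sketch of the classic formula (one common variant; the tools discussed below may weight things differently):

    // tf  = occurrences of term in doc / number of terms in doc
    // idf = log(number of docs / number of docs containing the term)
    function tfIdf(term, doc, docs) {
      var tf = doc.filter(function (t) { return t === term; }).length / doc.length;
      var n = docs.filter(function (d) { return d.indexOf(term) >= 0; }).length;
      return tf * Math.log(docs.length / n); // assumes n > 0
    }
    var docs = [['seo', 'tools'], ['seo', 'content', 'seo'], ['links']];
    tfIdf('seo', docs[1], docs); // ≈ 0.27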

TF-IDF is the successor of keyword density. Some not-yet-demented SEO geriatrics may remember what keyword density means: the number of occurrences of a term in the text, divided by the number of all words in the text, multiplied by 100. This formula is in reality both the most used and wrong.

Well, I'm not your wiki, and if you're of legal age and not banned by Google, you will quickly find everything you need to know about keyword density and TF-IDF. I'll just say: nowadays TF-IDF is nearly the only numeric parameter of content quality. But this article is about something else - I've promised to share some freebies, right?

Wednesday, October 28, 2015

How to prevent negative SEO impacts caused by poor HTML quality

The question arises over and over again: whether and how HTML markup could negatively impact SEO. Googlebot is indeed a smart HTML interpreter:
  • it has a high tolerance for HTML syntax errors,
  • it doesn't force websites to comply with W3C validation rules.
Nevertheless, there are some HTML misuses that can painfully hurt SEO. To open this topic I refer to a pair of posts by two respectable Googlers, and by commenting on those posts I list the HTML issues causing negative SEO effects:

Tuesday, October 27, 2015

Solution: how to stop Google from crawling and indexing non-existing pages

Many webmasters are affected by a weird issue: Google indexing (or at least crawling) non-existing URLs. The issue doesn't depend on whether one uses WordPress or another CMS. The question of why Google is crawling and/or indexing non-existing URLs appears in all webmaster forums, Google Groups and so on, but without a clear solution.

The fact that Googlebot invents and crawls a bunch of non-existing URLs raises some questions:

  • Where do non-existing URLs come from?
  • Why is it not optimal if non-existing URLs are crawled or indexed?
  • How to minimize risks related to non-existing URLs?

Monday, October 26, 2015

Making the decision: hosting JavaScript libraries on an external CDN or locally

The benefits of hosting common JavaScript libraries externally on a JavaScript CDN are well known - I'll list them just briefly: external hosting of JavaScript libraries
  • reduces your own server's traffic,
  • makes parallel loading possible.
The first question is: where to host? Directly on the library vendor's site? Or somewhere else? There are mainly two factors that drive the decision of where to host our JavaScript libraries:
  • Speed
  • Popularity
The second question is more general: host externally at all? Or locally?
Let's look at some details that help to choose optimal public hosting for common JavaScript libraries.

tl;dr:
  • If you definitely want to host JavaScript libraries externally, host them at Google Hosted Libraries - that gives you the highest chance that your visitors already have them in cache and don't need to download them again.
  • If you are in doubt, better host your JavaScript libraries locally - visitors who already have the libraries in cache are very few, and the average visitor gets the data the fastest way (see the fallback sketch below).
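Whichever way you lean, a common belt-and-braces pattern (my illustration, not from the post; the local path is hypothetical) is to load from the CDN and fall back to a local copy:

    // In the page, right after the Google Hosted Libraries <script> tag
    // for jQuery, an inline script falls back to the local copy:
    if (!window.jQuery) {
      document.write('<script src="/js/jquery.min.js"><\/script>');
    }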

Monday, July 27, 2015

After-Panda SEO for intermediary businesses

What we definitely know about the Phantom and Panda updates:
  • Phantom and Panda updates are about onpage quality, whatever that might mean ;)
  • The fraction of duplicated content on a given page is one of the most important factors that rank a page down
  • Duplicated content can be easily measured

SERPs disintermediation - Google battles intermediaries

There is a Federal Trade Commission report about how Google misuses its dominance to kick intermediary players out of some search verticals, like comparison sites or shopping aggregators, where sites sell products from different manufacturers.

In Google's view, intermediary businesses are poachers stealing Google's money. In Google's view, the SERP is the place for users to find original manufacturers or direct service providers. The SERP should not be a place for intermediary services, because they are secondary. And the sign of being secondary is easy to measure: the presence and proportion of duplicated content.

The intermediary job of comparing and aggregating products and user voices should belong only to Google, because only Google is good and honest and, last but not least, it doesn't offer duplicated content - it's just a search engine, no?
Google is a strong rival, playing by its own rules. So, do you still want to survive this battle?

Wednesday, February 11, 2015

SEOTools for Excel: solutions for the lost installation folder and the disappeared ribbon

My installation of SEOTools for Excel on Windows 7 x64 / Excel x32 didn't want to cooperate with me from the first step. First, Excel refused to open seotools.xll properly - it always took it for a text file. Then, after trying to install the SEOTools x64 version as an add-in, it wasn't visible in the ribbon at all, but didn't want to be uninstalled either. I was forced to delete it the hard way. Then, trying to install the SEOTools x32 version, I was pretty close to success: I got the start splash screen from SEOTools, but then an error alert was raised: "The Ribbon/COM Add-in helper required by add-in SeoTools could not be registered. This is an unexpected error. Error message: Exception has been thrown by the target of an invocation." And nothing more.

After some investigation it became clear that the problem lies in the mismatched versions of the machine (x64), Win 7 (x64) and Excel 15 (x32). BTW, if you need to find out what is installed on your machine - here are all the places where you can get the necessary information about your hardware, OS and Excel.

Tuesday, January 13, 2015

SEO query string universal solution

URLs with query strings can be real poison for SEO. The main and most harmful damage untreated query-string URLs do is an incalculable growth in the number of URLs with the same content, HTTP response code 200 and no indexing management - also known as duplicated content. Another issue caused by query strings in URLs is overspending crawl budget on URLs that would better be excluded from crawling and indexing.

This way a site with untreated query-string URLs gets, on the one hand, URLs into the index that don't belong there; on the other hand, the crawl budget for good URLs can run short, because it is overspent.

There are some passive techniques to deal with query strings in URLs. I actually planned to publish the existing techniques and my solution for the SEO problems caused by query strings as part of my ultimate htaccess SEO tutorial, but the topic grew in detail, so I decided to write a separate article about query strings in URLs and SEO.

Existing ways of dealing with query strings in URLs

  • While Google claims it can deal with query strings in URLs, it recommends adjusting the bot's settings in Webmaster Tools for each existing query string.
  • URLs with query strings could be disallowed in the robots.txt with a rule like
    Disallow: /?*
    Disallow: /*?
    
  • If the header of the HTML or PHP files served under query-string URLs can be edited, it is possible to add rules for indexing management and URL canonicalization, like
    <meta name="robots" content="noindex, nofollow">
    <link href="Current URL, but without query string" rel="canonical">
    
These methods are mainly manual, require an unpredictable workload and solve the problems only partly. But the good news is: I have a universal solution that works for all URLs with query strings and gets rid of all the SEO troubles they cause.

Friday, December 19, 2014

Freebase shuts down! Your free, Google-proof way into the Knowledge Graph closes on 31.03.2015

Everybody who has tried it knows how possible and easy it is to publish an article about a business on Wikipedia, especially if the business is far from being in the Fortune 500, or Fortune 5k, or even Fortune 500k ;) But small, local businesses, and individual entrepreneurs too, have an absolutely legitimate wish and need to get their own websites into the Knowledge Graph...

The only free, public way to create a Google-proof web entity is (not for much longer) a Freebase entry. Well, smart people create at least two entries simultaneously: the first for the person, and the second for the business entity, with the person entity as an author of the business entity. I wrote an article about entity creation with a Freebase entry, which was widely shared and liked. But today is a day for bad news:

Freebase will close soon!

From 31.03.2015 on, Freebase will be available in read-only mode: no more new entries, no more edits. The Freebase database will then be integrated into Wikidata. I refer here to yesterday's post from the Freebase Google Plus account, where the closing and integration roadmap is described in detail. In mid-2015 Freebase will be shut down as a standalone project. But what does putting Freebase out of service mean for all of us who want to appear in the Knowledge Graph, but don't have enough mana to appear on Wikipedia?

Sunday, November 23, 2014

Facebook changed its ad targeting rules last Friday... and becomes the undisputed moron champion

Earlier it was possible to target an ad audience by interest alone, without being forced to select any other option. Since last Friday (just yesterday) one must select... "at least one country!!!" And an ad can be targeted at a maximum of 25 countries.

So now, if one wants to target an ad at the whole world, independently of country, by interest only - one is forced to set up 8 ads instead of one and manually select all 196 existing countries.

Now, is this not ultramoronic?

Thursday, November 20, 2014

How to hide webpage parts from Google indexing

The wish and need to hide parts of a webpage from Google indexing is common. Before I proceed, let me make one thing clear:

No tag can exclude a webpage part from Google indexing!

The silly advice about a snake oil named googleon / googleoff is repeated mantra-like across the web. Repetition won't make it work: googleon / googleoff do their job only inside the Google Search Appliance environment, as John Müller of Google said. I hope we have closed this discussion once and for all.

But don't worry! Believe it or not, I have a whopping 4 workarounds for you to hide webpage parts from Google indexing. Let's look at them one by one:

Sunday, November 2, 2014

How to SEO long URLs

There are many opinions about the SEO impact of long URLs: they are good, or bad, or have no influence. Our questions in this article are:
  • why long URLs occur,
  • how to make long URLs short.
The origins of long URLs are mainly
  • the wish to stuff URLs with keywords (both in domains and in every single URL slug),
  • the necessity (or, to be honest, the wish as well) to reproduce the site's structure in a matching URL structure.
Before we begin, let us define what we mean by a long URL: how many characters must it have to be called long? The longest URL on Google's Webmaster Blog is 95 characters, so let's call URLs longer than that long.

Sunday, October 26, 2014

How to create your own author-centered knowledge graph

I'm not a fear salesman, really! But the thought of naming this article "how to defend against abusive content removal requests" came to me while reading Google's updated report "How Google Fights Piracy". The following sentence made me suspicious:
...sites with high numbers of removal notices may appear lower in search results.
Against the background of all the known negative SEO cases, this could be the next area where an honest publisher gets punished for nothing.

There are enough known cases where content-scraping sites get better SERP positions than the original content providers. And there are enough abusive content removal requests - just read Google's report. The best defense is a good offense. So we construct our network of publishing identities, which serves as our own author-centered knowledge graph. Its purposes are:
  • removal requests at Google and DMCA takedowns that always work,
  • no effect from abusive third-party removal requests,
  • unambiguous machine-readable relations between the author's entity and the author's creative work.
Our objective is a solid, structured and chained publisher identity, which includes the author, the publishing medium and the publication itself. Let's work!

How to create nested lists in Blogspot

I struggled for some time with creating nested lists in my blog at Blogspot. Then I realized a simple workaround to get them done. To create nested lists in Blogspot, you must

How Penguin evaluates the link quality

Google's Penguin update is about links, namely the good and bad backlinks to your site. It evaluates the anchors of your backlinks and the quality of the linking sites. Further, it looks deeper into the relations between your site and the sites linking to you.

Below are some factors that will help you rethink your linking strategy and, maybe, enlighten you in purifying your link profile, so the Penguin gets no appetite for eating your site ;)

The following characteristics influence how Penguin evaluates the link profile quality of your site:

Wednesday, October 1, 2014

Solution for "stop spamming us. you're wasting your time" at Hacker News / Ycombinator

solution for "stop spamming us. You are wasting your time"
If you get this message when trying to post something at Hacker News / Ycombinator (regardless of the submission method: bookmarklet or site form), it means the site you are trying to post has been flagged as spam by an HN admin. Whether by mistake or not is another question - I know a case where YouTube was flagged as spam there. Anyway, our intention is not to argue with HN admins about whether our link is spam or not, but simply to post the link we want to post. Deleting cookies doesn't help. But the working solution is simple:

Tuesday, September 23, 2014

Pros and cons of encoding images as base64 data URIs for performance purposes

One of the common techniques of website performance optimization is reducing the number of HTTP requests. Each website asset, like an image, needs an HTTP request to be loaded. On this fact is based the idea of embedding website images as base64-encoded data URIs. Once an image is embedded directly into the HTML or CSS of the website, no additional HTTP request is needed to load it - it is no longer an external resource, but becomes part of the source code. This is the good part.
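For illustration (my own sketch, not from the article; the file name is hypothetical), this is how an image becomes a data URI with Node.js:

    // Read an image and print it as a base64 data URI, ready to be
    // pasted into an <img src="..."> or a CSS url(...).
    var fs = require('fs');
    var base64 = fs.readFileSync('logo.png').toString('base64');
    console.log('data:image/png;base64,' + base64);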

Wednesday, September 17, 2014

Faceted search, SEO and user experience: how to and why?

Even ecommerce sites with only a few product categories and a few thousand products are able to generate thousands upon thousands of useless URLs through product search, product filter and product option URLs. Sad, but true. We can't act as if this problem didn't exist. Leaving such URLs unhandled brings tons of negative SEO impact. There are only a few ways of dealing with such URLs:
  • to get rid of them completely,
  • to turn a part of the useless URLs into useful ones, and
  • to reduce the negative SEO impact of the remaining useless URLs.
Note!
There is no single magic method - none of the existing SEO techniques does the trick alone. What works is a combination of SEO techniques, which I collect in this article.
Let's look →

Thursday, September 11, 2014

H for htaccess: part 5 of HASCH, the OnPage SEO framework

.htaccess (hypertext access) is a text file, placed mostly in the root folder of a site and hidden because of the dot at the beginning of its name. It contains directives for the server, the server software, robots and browsers about the handling of files, folders and paths/URLs.

Generally there are two areas where .htaccess can be used for SEO purposes:
  • mod_alias and mod_rewrite directives (URL redirects and rewrites),
  • load time optimization.
Site security has, in my opinion, only an indirect relation to SEO, so I decided not to make it a topic of this article.

The last, fifth part of my HASCH OnPage SEO framework is about the SEO mission of .htaccess. I aim to create a multipurpose, explained and example-illustrated checklist of .htaccess usage for mod_rewrite, robots manipulation and load time optimization as advanced SEO objectives. This ".htaccess for SEO" tutorial will be helpful (for me and you) when performing site audits and building new, strictly SEO-minded sites. Read the tutorial →

Tuesday, September 2, 2014

itemid vs sameAs vs additionalType: how to distinguish and use them correctly

One of the best methods to raise the topical relevance of a text is to use entities instead of "plain" keywords. The simplest way to create an entity out of a keyword is to mark it up as a standalone type with Schema.org markup and back it up with additional trustworthy information. This is where the HTML5 attribute itemid and the Schema.org properties sameAs and additionalType come in. All of them are used to provide additional information about a type. But while itemid and sameAs are pretty much the same, additionalType serves a different purpose. Let's look at the use cases in detail to get clear about distinguishing and correctly assigning them. Accurate usage of these properties is crucial for semantic SEO, because the way an entity is created shapes the algorithm's opinion of the given text and finally results in more or less topical ranking.
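As a quick taste before the detailed use cases (a hypothetical sketch of mine - the names and URLs are made up, and productontology.org is just one common choice for an external type), all three can appear on one entity:

    <!-- itemid gives the entity its own identifier, sameAs points to a page
         about the same entity, additionalType adds a second, external type. -->
    <div itemscope itemtype="http://schema.org/Person"
         itemid="http://example.com/team#jane-doe">
      <span itemprop="name">Jane Doe</span>
      <link itemprop="sameAs" href="http://en.wikipedia.org/wiki/Jane_Doe"/>
      <link itemprop="additionalType" href="http://www.productontology.org/id/Photographer"/>
    </div>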

Saturday, August 30, 2014

Authorship is dead? No! It just has to be implemented correctly!

The latest news is all about authorship being dead. Don't misunderstand this statement! After a thoughtful reading of this and that article it is clear that what is really dead are only some actions Google performed with one very specific piece of authorship markup, namely
  1. data gathering, and
  2. the partial showing of the author's image and name as a byline in search results,
based on rel=author. "Partial" because author images and names are still shown if you are logged into your G+ account and the author is in your G+ circles.

I will not analyze the reasons for dropping the processing of rel=author - such analysis is done well in the articles linked above. I want rather to reassure the community that authorship is still alive and will remain so. The only question is how to implement authorship correctly. Let's look at the details:

Thursday, August 28, 2014

C for Content: part 4 of HASCH, the OnPage SEO framework


How to create relevant content

Recently I was asked about the definitive OnPage ranking factors. My first, short answer was that there isn't a single ranking factor, but a combination of several. Then it occurred to me that this question would be a fitting frame for the article about website content, part 4 of HASCH, my OnPage SEO framework. I don't want to rob you of your time with commonplace phrases like "content is king, must be unique, interesting, well-written, useful, fresh, catchy, provoking, outrageous" and the like. I will instead explain which tangible traits your content must have to rank well. By "rank well" I mean that the search engine algorithm will reckon your website text as topically relevant to the keywords you chose and used in the text. Let's ask:
  • what makes content relevant?
  • how to create relevant content?
To the answers

Thursday, August 21, 2014

S for Semantic: part 3 of HASCH, the OnPage SEO framework

The third part of my OnPage framework is dedicated to the semantics of a webpage. With this article I will give an overview of the benefits and usage areas of semantic markup. Use this article as a kind of cheat sheet for OnPage audits of an existing website, the SEO strategy of a website under construction, or preparing a website for semantic search before a relaunch.

Why semantics?

  • Search engines use semantics to improve search results (semantic search).
  • Using semantics OnPage makes texts better machine-readable (more exact understanding and disambiguation).
  • All the new kinds of search result output, like rich snippets, the Knowledge Graph and the OneBox, are based on and populated with semantically processed information.
  • Google has accomplished a paradigm shift "from strings to things". The "thing" here is the main object of (web) semantics: an entity. In our context an entity is just a keyword or keyword phrase, but one that is more or less interrelated with others and explicitly verifiable through other sources.
  • It's already proven that semantic markup is a ranking factor: 0.3% of all sites make use of semantic markup, but 36% of the sites in the SERPs have it.
A semantic website revamp is the key to being understood by machines. Quite enough reasons to scrutinize the subject! Let's go:

Thursday, August 7, 2014

How to redirect HTTPS to HTTP without certificate

Today we all officially learned from Google that an HTTPS-secured connection is now a ranking signal. Some of us are running to buy a certificate. Others will try to get one for free, e.g. from StartSSL. While we don't know at the moment how much weight the secured connection carries as a ranking signal, another issue exists for all website owners: what if somebody tries to reach your site over HTTPS, but you don't have any certificate installed? Any browser will raise an error, something like ERR_SSL_PROTOCOL_ERROR, and the visitor will not see your site. What to do? If you search for an answer, you will mostly be told that no redirection is possible, because the SSL handshake happens before the connection is established, so any redirect will not work. But there is a mighty little thing named .htaccess, which allows us to make our sites visible to any visitor, independently of which URL they use to reach them. The trick is

Thursday, July 31, 2014

Is text to code ratio relevant for SEO?

  • The short answer is: no. There is no such metric or signal or anything of the kind that Google would measure.
  • The long answer is: ...yes! How? By optimizing the text-to-code ratio in favor of text, we reduce the amount of code. Less code generally means less page loading time. And that is very much a measurable signal, evaluated by Google and influencing rank.
So what now? The text-to-code ratio as a measurable number isn't directly relevant for SEO at all. But it does matter as a first symptom of possible loading time issues related to dispensable code inside a web document. So, how to reduce the amount of website code?

Wednesday, July 30, 2014

40 secret keyword research tools to find the million dollar keyword

Some of the most popular keywords Google suggests around keyword tools are "keyword research", "keyword spy", "keyword generator", "keyword finder", "keyword discovery", "keywords search" and "keywords for SEO". I decided to write about some tools I use for certain aspects of keyword research. The keyword research tools I cover all have in common that they aren't as well-known as e.g. Übersuggest, but they accomplish the same tasks just as well as, if not better than, established commercial and free tools.

What are the main keyword research tasks an SEO and an SEA typically do? The tasks differ (better optimization, saving money, out-competing a competitor), but the primary goal of keyword research is always the same: to get more and cheaper targeted traffic. An SEO looks for keywords for a page to optimize, or analyzes a competitor's landing page. An SEA looks for keywords for an ad and its landing page, or analyzes a competitor's ad and landing page. Let's look at which keyword research tools can help with this (only free tools and no affiliate links, promised :)

Tuesday, July 29, 2014

How to link guest posts to get the most author trust rank

Guest blogging is a hot topic now. Much about it is unclear: some who do guest blogging get penalized, others are out spreading guest posts fearlessly and en masse. OK, life keeps moving, and incoming links must be acquired whatever happens, because they remain one of the strongest signals for site ranking. In my last article about guest blogging I wrote about guest post markup, which helps you create additional trust signals in a post. Recently, reading an article by Bill Slawski about Google's co-occurrence patent and keyword relationships, I got an idea about utilizing co-occurrence for guest blogging.

What are the main problems with guest blogging from Google's point of view?
  • the guest article has nothing to do topically with the publishing site as a whole,
  • the guest article's author is rewarded with (highly) keyword-enriched anchor text in a link from the guest article to the author's own site.
The topical relation issue is the author's business alone - only the author decides where to guest post. But what is the motivation to guest post if not the keyword-enriched anchor in the backlink? There must be some kind of connecting thread from the guest post to one's own site. Let us look at how to set that thread without fear of a penalty:

Monday, July 28, 2014

How Wikipedia moderators defend their sinecure by keeping new authors away from Wikipedia - a new trick!

For many years I have been hearing from many different people in many different countries about the Wikipedia iron curtain, built by moderators to keep new authors away. Just today I was told about a new trick: two different people from different countries, writing in different languages, each published an article on Wikipedia; both articles were translations from English into other languages. Within 24 hours both articles were deleted - the deletion reason was "the article wasn't written by a human, but was translated from another language version using an automatic translation tool". Now go and prove you are neither a robot nor an idiot.

Saturday, July 26, 2014

How to relaunch a website avoiding the main catches

How to relaunch a website keeping rankings and SERP positions

Lately, as I recently realized, the most interest in the topic "how to relaunch a website" comes from Germany, though I don't think this topic is only interesting there.

Fact is: any website relaunch means losses. Losses of traffic, of rankings, of SERP positions, which in the end mean loss of money. I've seen losses of 4% and of 40%, and those aren't the upper limits. The main objective of SEO is to minimize such negative impacts.

In short, every website relaunch is a change of design and/or content structure. If the content structure changes, a change of URL structure definitely comes with it. So we generally have two possible areas where the new website version could meet poor acceptance, followed by a loss of traffic:
  • users could dislike the new design,
  • search engines could get bitchy about indexing the new URL structure, and some pages will fall out of searchers' reach.
So how to act during a relaunch to come out a winner in the relaunch battle with our visitors and the search engines? Look:

Thursday, July 24, 2014

How to bulk check pagerank of all internal pages

Google says we should generally never worry about PageRank (PR). But a substantial part of any OnPage SEO audit is to clarify whether the site structure is built properly and spreads the link juice correctly. The sign of a correct site structure: important pages inherit PR, unimportant pages don't. That is the purpose of project-wide PR measurement. Surely nobody expects to perform such measurements manually, given the potential number of links - a tool for such a mechanical turk's task is a must! I thought... Damn, the search for this tool was one of those cases where it was demonstrated to me again that the internet is full of crap - and how I hate some SEOs!

Monday, July 21, 2014

How to create bulleted lists in Google Plus posts

Google Plus has only 3 documented formats: *bold*, _italic_ and -strikethrough-. But it's a proven fact that one of the best ways to deliver information and increase its visibility is to structure it into lists.

Good, with ordered lists there is no problem: you number each list line with bold ascending numbers, *1. * etc. But what about unordered, bulleted lists? How to make bullets in Google Plus posts? For this purpose we use the computer's own capability to type and print Unicode characters. This capability is limited, but it is fully sufficient for creating bulleted lists in Google Plus posts. Another valuable use of Unicode characters is, of course, the design of page titles and page descriptions, which are used as snippets in the SERPs, and of ad texts. Designing such text assets with special Unicode characters boosts their CTR enormously. More on that at the end of the article. Now let's create bulleted lists in Google Plus posts:
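For example, on Windows you can hold Alt and type 7 on the numeric keypad to get the bullet character • (U+2022); a post built this way looks like:

    *Three perfect image optimizers:*
    • Tool one
    • Tool two
    • Tool three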

Tuesday, July 8, 2014

How to give a face to your entities? Earn SEO profit by making Wikimedia Commons your media hosting!

We already know how to become an entity - create a Freebase topic. But it is a common issue that adding an image to a Freebase topic is no longer possible. Our goals regarding semantic SEO and our images are, however:
  • to provide semantic information about our images,
  • to share images together with their semantic information,
  • to embed semantically described images into web documents using the ImageObject class from Schema.org and, last but not least,
  • to host these images, with their semantic information, on an authoritative source.
All these tasks can be accomplished using Wikimedia Commons, the free media repository of the Wikimedia Foundation. I divide this article about using Wikimedia Commons into two sections:
  • How to upload to Wikimedia Commons (registering, logging in, creating a user page, selecting and/or finding the matching license, uploading and describing an image or other media file)
  • How to cite Wikimedia Commons (use on websites, embedding into semantic markup)

Friday, July 4, 2014

A for Architecture: part 2 of HASCH, the OnPage SEO framework

What does an SEO mean when talking about architecture? What kind of architecture matters for SEO?

An SEO must clearly distinguish two kinds of website architecture:
  • site-wide architecture,
  • page-wide architecture.
Both site-wide and page-wide architecture have their own rules to obey. I will formulate these rules flexibly and adaptably enough to be applied to any content or ecommerce project.

Wednesday, July 2, 2014

How to SEO pdf files

If your task is to optimize a PDF file for search engines, you must ensure that your PDF file is text-based and not image-based. To do so, create your PDF file with a text editor like LibreOffice / OpenOffice or the like, and never with an image editor like Photoshop.

The SEO-for-PDF procedure isn't tricky, but the optimization quality depends vitally on your general HTML and SEO knowledge:

Friday, June 27, 2014

H for Header: part 1 of the HASCH OnPage SEO framework

Every website begins with the header. There are no minimal or required values inside it: the header content fully depends on our goals and our laziness :) Another cause of poorly designed headers is the belief that Google doesn't understand meta tags or doesn't take meta tags as a ranking factor. Well, Google says it understands just a small number of the available meta tags. Here is Google's statement about its understanding of meta tags. But I say:
  • use not only the meta tags Google definitely understands,
  • use meta tags extensively,
  • be redundant in your meta tag usage.
Why? Simply because the web contains much more than just Google and you. There are a great many bots and spiders on the internet whose goal is to parse web content and build various indexes. Their parsing criteria may include a bunch of the existing meta tags, and their parsing results may in turn be parsed by Googlebot. So Google gets into its index the content of your meta tags which it claims not to understand.

Good, now we agree about the benefits of using meta tags. The main SEO point is utilizing the header to give the search bot as much information about our site as possible. Now I will list the meta tags one by one and give the correct syntax, some possible values, and my view of the practical SEO effects.

Thursday, June 26, 2014

HASCH: introducing the OnPage SEO framework

Some C++ programmers dream of or even try to create their own operating system. The creation is meant to be the best of the existing OSs and demonstrate the state of the art. Some PHP programmers dream of or even try to create their own content management system, targeting the same. Why do they want to do it? I guess such creations would include all the best-practice examples, get rid of all existing bugs and misbehaviors, and systematize all current knowledge in one place.

OnPage SEO is a sophisticated knowledge area, with a great deal of unsystematized and unvalidated knowledge from many different segments: web design, web development, server administration, linguistics, marketing, psychology. With the HASCH OnPage SEO framework I aim to systematize OnPage SEO knowledge and to get rid of its unvalidated parts.

So let's get down to the nitty-gritty:

HASCH: the OnPage SEO framework

H... is for Header
A... is for Architecture
S... is for Semantic
C... is for Content
H... is for .htaccess

PS:
I'm sure this framework will be a good help, cheat sheet and rulebook for everybody who performs SEO audits or creates SEO-minded sites. It is the nature of SEO that this framework will never be finished, so it will always be in public beta and continuously updated. I'm very happy about any additional advice you share with me!

Wednesday, June 11, 2014

Responsive site vs mobile site: how to make the right decision

Many great articles have been written about which site version is better, responsive or mobile, but not a single one answers this question in general and in few words. Why? Because in this formulation, "which is best", it can't be answered in general. I will not repeat all the pros and cons of responsive and mobile site versions. But I recommend two questions; by answering them you will reach your own decision about what your visitors want and which version will bring the most revenue.

Monday, June 2, 2014

How to add iframe to Facebook

Adding an iframe to a Facebook page seems to be a gap in the Facebook documentation. At least I haven't found an explicit answer to this question. After some trial and error I came to a solution. Funnily, a week after I found the solution, Facebook changed the procedure and I was forced to look for a new one. So here is the up-to-date solution for "how to add an iframe to a Facebook page". Use it and enjoy the summer:

Friday, May 30, 2014

Direct linking how-to for affiliates: url shortening, iframe, redirect

Affiliate marketing in short: you drive traffic to the advertiser's landing page, where the traffic converts. Direct linking is when you drive traffic from e.g. a Facebook ad campaign or Bing directly to the advertiser's landing page: you give the tracking URL of the advertiser's landing page as the target URL of the ad campaign. Direct linking makes CTR grow, because there isn't any page between your bought visitor and the advertiser's landing page - the visitor goes directly to the converting page. But first AdWords, then Facebook, then Bing and others have tried to prevent direct linking of affiliate offers. Clickbank links were affected first, then many others. Nevertheless there are ways to direct-link ads to affiliate offers. In general there are 3 ways - and you will direct-link your offers successfully if you use not a single method alone, but a combination of them.
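As a taste of the redirect method (a bare sketch of my own with a hypothetical tracking URL, not the full setup from the post), a tiny page on your own domain can forward the visitor:

    // Redirect page on your own domain: the ad points here, and this
    // forwards the visitor to the (hypothetical) affiliate tracking URL.
    window.location.replace('https://tracking.example.com/offer?aff=12345');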

Friday, May 23, 2014

How to surf with US IP address (or IP of any specific country)

It is a common problem: previewing an affiliate offer that isn't valid in your country. You get redirected from the offer you want to another offer that is valid for your country. But you must see it - because you want to decide whether to promote it or not.

The answer is clear: you must view the offer like somebody from a country where the offer is valid. The solution that comes to mind first, and which I never got to work, is using a free proxy. I have never found a working free proxy. Never. Don't know why. But there are two tools that really work, are free and are easy to set up and use. They work out of the box:

Saturday, May 17, 2014

How to iframe an affiliate offer and run Facebook ad campaign to it: benefits, feasibility and sticking points

Our goal is to run a Facebook ad campaign where visitors who click our ad remain on Facebook but see the advertiser's landing page. In other words, we arrange for the conversion to happen inside Facebook, without forcing the visitor to go away. The benefit is clear: this way we get much cheaper ad prices from Facebook and a better CTR, because visitors don't leave Facebook.
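The core of the construction is just an iframe pointing at the landing page (a bare sketch with a hypothetical offer URL; the Facebook page tab plumbing is covered in the post):

    // Inside the page loaded in the Facebook tab: embed the affiliate
    // landing page so the visitor never leaves Facebook.
    var frame = document.createElement('iframe');
    frame.src = 'https://offer.example.com/landing'; // hypothetical offer URL
    frame.style.cssText = 'width:100%;height:800px;border:0';
    document.body.appendChild(frame);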

Saturday, May 10, 2014

How to get disapproved Facebook ad approved - Violating Facebook's Ad Guidelines by advertising "work from home" etc.

Your ad wasn't approved because it violates Facebook's Ad Guidelines by advertising "work from home", MLM, get rich quick and other inaccurate money-making opportunities...
Got the same meaningful message from Facebook? Me too :) There are some workarounds we can apply. But first we must decide what we want to achieve and what we are ready to do for it: do we want our ad approved without any changes, with some small, non-substantial changes, or are we ready to completely rework our ad? Let us look at how to get a disapproved Facebook ad finally approved, if it was disapproved for the cited reason.

Friday, April 25, 2014

2 things i hate on Facebook Ad Manager and Power Editor

HATE! HATE!! HATE!!!
  1. You can't, under any circumstances, change the ad target URL! You create an ad with the website conversions objective. You create a bunch of them, a whole ad set with many different targeting options. Then you want to just copy this ad set and point it at another target URL. And you can't! You must click on EACH ad, then create a similar ad - only then are you able to change the ad target URL.
  2. If you want to hide your affiliate URLs, this works ONLY if you create each ad anew from scratch! If you create an ad with a shortened URL as the target, then create a similar ad and enter your next shortened URL as the target - you think that would work? NO, damn! Facebook is smart enough to show the original URL in the ad body. And it is funny enough to show the SHORTENED URL when you hover the mouse over the original URL. Isn't that stupid?

Saturday, April 5, 2014

Shortest URL Shortener ever!

As the title says, http://v.gd is the shortest URL shortener I've ever seen (and the shortest that can technically exist). I find this service super convenient and decided to make it even more convenient and share the results: a restartless Firefox add-on and a bookmarklet.

5.04.2014 UPDATE: my Firefox add-on "Shortest URL Shortener ever" has even been fully approved and is now available for download at https://addons.mozilla.org/de/firefox/addon/shortest-url-shortener/ Download and enjoy!

Tuesday, April 1, 2014

30 cheat sheets for successful SEO


What does an SEO need to know?

The web is flooded with infographics and cheat sheets. Somebody once decided such giveaways are good for SEO as link-building assets, and now everybody makes some, at least as copies and shares. I will not speculate about whether or how many of them bring real value - imo most of them are redundant - but my personal biggest problem with them was that THE cheat sheet was NEVER at hand when it was really needed (at least for me). Indeed, the point and the convenience of cheat sheets is that they are there just in time, at the moment one needs them. So I decided to create and share a collection of all the cheat sheets I have ever used in my SEO activities. This cheat sheet suite is evergreen knowledge, hints and tricks that will always be helpful. Surely this knowledge isn't enough to call oneself an expert, but for somebody who does SEO, especially technical SEO and OnPage SEO, these cheat sheets will render a great service. And for somebody who is learning SEO right now, they give a great summary of the things that must be learned. These cheat sheets cover all the essential knowledge segments an SEO puts into action daily. Before publishing I reviewed all the cheat sheets to find a fresher version where available - for some of them I indeed found one.

Monday, March 31, 2014

How to practice guest blogging successfully without penalty fear

These are hard times for guest bloggers and guest blogging platforms. Google's top spam officer says guest blogging is done. One of the most successful guest blogging brokers has even been penalized. There is big chaos and panic in the web marketing environment. "Guest blogging is dead" is one of the most searched phrases. But in my opinion there is a way of doing guest blogging that still allows a successful, valuable guest blog practice. Let's look at what exactly Google hates about guest blogging, what exactly drives a guest blogger into a penalty, and how common sense and semantic markup help us spread our guest articles and earn valuable backlinks without fear of a penalty.

Friday, March 28, 2014

Solution for "Your post was not shared. Please try again" and how to post to multiple communities

When I was a bloody rookie at Google+ I was hit many times by the "Your post was not shared. Please try again" error. This error alert arises from time to time when one tries to post something in a community. After some research I believe I have determined the problem's cause. The cause of this issue is, BTW, related to the wish and attempt to post to multiple communities. Updated on 1.07.2014. Updated on 8.07.2014.

Thursday, March 27, 2014

More often and deeper into the Knowledge Graph? Become a better entity at Wikidata!

Well, my dear entity owners, I realize with pleasure that you really like becoming entities! You made such a stormy run on my previous topic, about becoming an entity by creating a Freebase topic, that I decided to write a follow-up telling you in more detail about a further possibility for enriching and chaining entities. I mean: the more structured and linked data we provide publicly, the more reason we give the search algorithm to interrelate us with our creative works, products and the like. Establishing and reinforcing such interrelations firstly raises our author and trust rank; secondly, it enhances our degree of influence and our publications' authority. I see these interrelations like the relations between the left and right brain hemispheres: the more synaptic connections are established (the thicker the corpus callosum), the higher the creativity and intelligence. So let's make our entities' interrelations as thick as possible, to achieve an amount like Einstein's brain ;) Now we create our new entity at Wikidata, and then connect this new entity to our already existing one.

Tuesday, March 25, 2014

How to become an entity? Create Freebase topic!

What? Are we individuals not already a kind of entity? Sure, but... Google would say: not enough, something important is missing. Yes, if you don't forget rel="author" when you publish something, you are well on the way to becoming a real machine-readable entity. There remains just one little step to be admitted into the community of Knowledge Graph entities: yes, I mean your entry in Freebase.

Saturday, March 22, 2014

Solution for "Rich snippets not showing"

During my microdata coding I quite accidentally realized that "rich snippets not showing" doesn't always mean "rich snippets not working"! My research was about the nesting and inheritance of Schema.org classes, and, playing with Google's structured data testing tool, I found a case where my microdata markup validated correctly and worked properly, but its rich snippets weren't showing. After some tests I learned about two causes of rich snippets not being shown:

Authorship markup: how to combine correctly several kinds of it?

Yes, exactly: how to combine them correctly? And more: why is it useful and necessary to use several kinds of authorship markup, and does it produce any SEO profit?

Some SEOs argue about the differences between the author and publisher properties, other SEOs advise using only rel="author"... I say: use everything you can, simultaneously! (Needless to say, don't use anything that doesn't fit your context and could be classed as e.g. rich snippet spam.)

Monday, March 17, 2014

Optimizing of Blogger load time

Insert scripts properly into Blogger template's head

There are many how-tos for implementing third-party scripts in Blogger templates, mostly custom CSS and JavaScript. I personally use SyntaxHighlighter and Google Analytics. But most how-tos advise users to insert scripts into the template's head. This approach goes against all best practices for optimizing a site's load time. If scripts are inserted into the head, the site's content will not load until all scripts are fully loaded. This negatively influences the whole site's load time, which, as you know, is an important ranking signal. So there is a strong dependency: more scripts in the head - longer load time - poorer ranking.

My advice for you (I tested it myself without any issue): insert scripts at the bottom of the body, just before the closing tag. If something doesn't work, you can still move the scripts one by one back into the head.
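An alternative with the same effect (a generic sketch with a hypothetical script URL - adapt it to your own template) is to inject the script dynamically, so it never blocks rendering:

    // Append a third-party script at the end of <body>; async makes
    // sure it doesn't block parsing even if moved elsewhere.
    var s = document.createElement('script');
    s.src = 'https://example.com/analytics.js'; // hypothetical URL
    s.async = true;
    document.body.appendChild(s);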

How to SEO Joomla? - advanced SEO extensions and workarounds

Of all the free PHP web CMSs I like Joomla the most. Why? I can work with it very effectively and complete enterprise-level projects alone that would need about three people with e.g. Drupal or Typo3. I started to use Joomla when it was still called Mambo and still use it on some projects. Over time it has become more and more pleasant to work with - the structure has matured into MVC, the usability is fun, and the extension repository keeps growing and contains fantastic extensions. And there are also some wonderful tools and workarounds for accomplishing Joomla SEO tasks, especially OnPage. Below I will introduce some extensions that I use extensively myself. I will talk only about free or extremely low-cost extensions.

Saturday, February 1, 2014

How to minify javascript / css?

We agree that assets like HTML, CSS and JavaScript are better off minified. If not, YSlow, Google PageSpeed and similar tools will give your site fewer points and advise you to minify. Enough has been said about the importance of load optimization for SEO, so let's look at what we can do, and what we should use, to achieve the best possible result.
I tested 17 free tools for minifying JavaScript online: after minifying with the best tool, about 43% of the code remained; with the "worst" tool (not really bad) about 51% remained. Well worth minifying. Minifying CSS gives about 60% less code - also well worth it. Read on - I list all the tested online minifying tools with stats and a few server-side traffic-saving hints:
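By the way, if you'd rather script the job than paste code into an online tool, the same result is a few lines away (a sketch assuming the uglify-js v3 npm package; the file names are placeholders):

    // Node.js: npm install uglify-js, then minify a file.
    var fs = require('fs');
    var UglifyJS = require('uglify-js');
    var result = UglifyJS.minify(fs.readFileSync('input.js', 'utf8'));
    if (result.error) throw result.error;
    fs.writeFileSync('input.min.js', result.code);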

Monday, January 6, 2014

How to turn co-citation into a valuable backlink?


What does co-citation yield for SEO?

Much has been written about the role of co-citation in SEO. To explain co-citation briefly: if site A links to sites B and C, then Google assumes sites B and C are somehow related. But how does an SEO turn co-citation into valuable backlink profit?