Showing posts with the label OnPage SEO. Show all posts

Monday, May 28, 2018

What's wrong with image dimensions?

Yes, what's wrong? Quite a lot, I tell you. I completed a little study for our firm on how well images are optimized in the times of mobile first. And you know what? I was appalled twice:
  1. roughly 85% of images are oversized in file size; this is no dark secret, we have gotten used to it by now,
  2. roughly 20% of images are oversized in their dimensions: 20% of original images are larger than the dimensions they are displayed at.
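The oversizing check from the second point can be sketched roughly like this. A minimal example that reads only PNG headers; other formats and the actual CSS-rendered display size would need a real toolchain:

```python
import struct

def png_dimensions(data: bytes):
    # A PNG starts with an 8-byte signature; the IHDR chunk follows
    # with a 4-byte length, the 4-byte type "IHDR", then width and
    # height as big-endian unsigned 32-bit integers.
    if data[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("not a PNG file")
    width, height = struct.unpack(">II", data[16:24])
    return width, height

def is_oversized(intrinsic, displayed):
    # "Oversized in dimensions": the original image is larger than
    # the size it is actually displayed at.
    return intrinsic[0] > displayed[0] or intrinsic[1] > displayed[1]
```

Run over a crawl of image files plus the displayed sizes from your templates, this yields exactly the kind of percentage the study reports.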
Read further: study setup, detailed results, toolchain and scripts

Wednesday, October 28, 2015

How to prevent negative SEO impacts caused by poor HTML quality

The question arises over and over again: whether and how HTML markup can negatively impact SEO. Googlebot is indeed a smart HTML interpreter:
  • it has a high tolerance for HTML syntax errors,
  • it doesn't force websites to comply with W3C validation rules.
Nevertheless, there are some HTML misuses that can painfully hurt SEO. To open the topic, I refer to a pair of posts by two respected Googlers, and by commenting on those posts I list the HTML issues that cause negative SEO effects:
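One simple class of such issues, broken tag nesting, can be spotted with a quick script. A rough sketch, not a W3C validator; the void-element handling is simplified:

```python
from html.parser import HTMLParser

# Elements that never take a closing tag.
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Collects closing tags that don't match the innermost open tag."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append("unexpected </%s>" % tag)

def check_balance(html: str):
    checker = TagBalanceChecker()
    checker.feed(html)
    # Anything left on the stack was never closed.
    return checker.errors + ["unclosed <%s>" % t for t in checker.stack]
```

Googlebot tolerates most of what this flags, but mis-nested container tags can still cut a page's content tree in unexpected places.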

Tuesday, October 27, 2015

Solution: how to keep Google from crawling and indexing non-existing pages

Many webmasters are affected by a weird issue: Google indexes (or at least crawls) non-existing URLs. The issue doesn't depend on whether one uses WordPress or another CMS. The question of why Google crawls and/or indexes non-existing URLs appears in all webmaster forums, in Google Groups and so on, but without a clear solution.

The fact that Googlebot discovers and crawls a bunch of non-existing URLs raises some questions:

  • Where do non-existing URLs come from?
  • Why is it suboptimal if non-existing URLs are crawled or even indexed?
  • How can the risks related to non-existing URLs be minimized?
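Whatever the CMS, the core of the fix is to answer phantom URLs with a clean 404 (or 410 for deliberately removed pages) instead of a soft 200 or a blanket redirect. A minimal sketch of that decision; the path sets are hypothetical:

```python
VALID_PATHS = {"/", "/about", "/blog/seo-tips"}   # hypothetical live pages
REMOVED_PATHS = {"/old-campaign"}                 # hypothetical deleted pages

def status_for(path: str) -> int:
    # 200 for known URLs, 410 Gone for intentionally removed pages
    # (Google tends to drop 410s faster), 404 for everything else.
    if path in VALID_PATHS:
        return 200
    if path in REMOVED_PATHS:
        return 410
    return 404
```

The point is that the server, not Googlebot, decides what exists: any URL outside the known set gets an unambiguous error status.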

Monday, July 27, 2015

After-Panda SEO for intermediary businesses

What we definitely know about the Phantom and Panda updates:
  • Phantom and Panda are about on-page quality, whatever that might mean ;)
  • The fraction of duplicated content on a given page is one of the most important factors that can rank a page down.
  • Duplicated content can be measured easily.
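The third point can be illustrated with a classic word-shingle overlap, one simple way (not necessarily Google's) to put a number on duplication:

```python
def shingles(text: str, k: int = 3) -> set:
    # Break a text into overlapping k-word sequences ("shingles").
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def duplication_ratio(page_text: str, other_text: str, k: int = 3) -> float:
    # Fraction of the page's shingles that also occur in the other text:
    # 0.0 means no overlap, 1.0 means the page is fully duplicated.
    a, b = shingles(page_text, k), shingles(other_text, k)
    return len(a & b) / len(a) if a else 0.0
```

Comparing each page against the rest of the site (or against competitor pages) gives a per-page duplication score you can act on.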

SERP disintermediation: how Google battles intermediaries

There is a Federal Trade Commission report about how Google misuses its dominance to push intermediary players out of some search verticals, such as comparison sites or shopping aggregators, where sites sell products from different manufacturers.

In Google's view, intermediary businesses are poachers stealing Google's money. The SERP, Google argues, is the place for users to find original manufacturers or direct service providers; it should not be the place for intermediary services, because they are secondary. And the sign of being secondary is easy to measure: it is the presence and proportion of duplicated content.

The intermediary's job of comparing and aggregating products and user opinions would belong only to Google, because only Google is good and honest and, last but not least, offers no duplicated content itself: it's just a search engine, isn't it?
Google is a strong rival playing by its own rules. Do you still want to survive this battle?

Thursday, November 20, 2014

How to hide webpage parts from Google indexing

The wish and need to hide parts of a webpage from Google's indexing is common. Before I proceed, let me make one thing clear:

No tag can exclude a webpage part from Google's indexing!

The silly advice about a snake oil named googleon / googleoff is repeated mantra-like across the web. Repetition won't make it work: googleon / googleoff do their job only inside the Google Search Appliance environment, as John Müller of Google said. I hope we can close this discussion once and for all.

But don't worry! Believe it or not, I have a whopping four workarounds for hiding webpage parts from Google's indexing. Let's look at them one by one:

Sunday, November 2, 2014

How to SEO long URLs

There are many opinions about the SEO impact of long URLs: they are good, they are bad, or they have no influence at all. Our questions in this article are:
  • why long URLs occur,
  • how to make long URLs short.
The origins of long URLs are mainly:
  • the wish to stuff URLs with keywords (both in domains and in every single URL slug),
  • the necessity (or, to be honest, also just the wish) to reproduce the site's structure in a matching URL structure.
Before we begin, let us define what we mean by a long URL: how many characters must it have to be called long? The longest URL on Google's Webmaster Blog is 95 characters, so let's call URLs longer than that long.
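As a taste of the shortening question, here is a hedged sketch: drop stopwords from a hyphenated slug and truncate at a word boundary so the result stays under the 95-character threshold. The stopword list is illustrative, not exhaustive:

```python
STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "for", "how"}

def shorten_slug(slug: str, max_len: int = 95) -> str:
    # Drop stopwords from a hyphenated slug, then truncate at a
    # word boundary so the URL stays within max_len characters.
    words = [w for w in slug.split("-") if w not in STOPWORDS]
    out, length = [], 0
    for w in words:
        add = len(w) + (1 if out else 0)  # +1 for the joining hyphen
        if length + add > max_len:
            break
        out.append(w)
        length += add
    return "-".join(out)
```

The keyword-bearing words survive; the filler that inflates the slug does not.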

Tuesday, September 23, 2014

Pros and cons of encoding images as base64 data URIs for performance purposes

One common technique of website performance optimization is reducing the number of HTTP requests. Each website asset, such as an image, needs an HTTP request to be loaded. This is the idea behind embedding website images as base64-encoded data URIs. Once an image is embedded directly into the site's HTML or CSS, no additional HTTP request is needed to load it: it is no longer an external resource but part of the source code. That is the good part.
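The encoding itself is nearly a one-liner; note that base64 inflates the payload by roughly 33% over the raw bytes, which belongs to the "contra" side of the title:

```python
import base64

def to_data_uri(image_bytes: bytes, mime: str = "image/png") -> str:
    # Embed an asset as a data URI so it ships inside the HTML/CSS
    # instead of costing a separate HTTP request.
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return "data:%s;base64,%s" % (mime, encoded)
```

The returned string goes straight into an `src` attribute or a CSS `url(...)` value.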

Wednesday, September 17, 2014

Faceted search, SEO and user experience: how to and why?

Certain ecommerce sites with only a few product categories and some thousands of products manage to generate thousands upon thousands of useless URLs through product search, product filter and product option URLs. Sad but true. We can't act as if this problem didn't exist: leaving such URLs unhandled brings tons of negative SEO impact. There are only a few ways of dealing with such URLs:
  • to get rid of them completely,
  • to turn a part of the useless URLs into useful ones, and
  • to reduce the negative SEO impact of the remaining useless URLs.
Note!
There is no single magic method: none of the existing SEO techniques does the trick alone. What works is the combination of SEO techniques that I collect in this article.
Let's look →
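To make the third way concrete: one common building block is a per-URL robots directive that whitelists the few facet parameters worth indexing. A sketch with hypothetical parameter names:

```python
from urllib.parse import parse_qsl, urlsplit

INDEXABLE_FACETS = {"category", "brand"}  # hypothetical whitelist

def crawl_directive(url: str) -> str:
    # Keep a handful of valuable facets indexable; sorting, pagination
    # and free-text search URLs get noindex,follow so link juice still
    # flows but the URLs stay out of the index.
    params = [name for name, _ in parse_qsl(urlsplit(url).query)]
    if all(name in INDEXABLE_FACETS for name in params):
        return "index,follow"
    return "noindex,follow"
```

The returned value would populate the page's robots meta tag at render time.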

Tuesday, September 2, 2014

itemId vs sameAs vs additionalType: how to distinguish and use them correctly

One of the best methods to gain topical relevance for a text is to use entities instead of "plain" keywords. The simplest way to create an entity out of a keyword is to mark it up as a standalone type with Schema.org markup and back it with additional trustworthy information. This is where the HTML5 attribute itemId and the Schema.org properties sameAs and additionalType come into play. All of them are used to attach additional information to a type. But while itemId and sameAs are pretty much the same, additionalType serves a different purpose. Let's look at the use cases in detail to get clear about distinguishing and assigning them correctly. Accurate usage of these properties is crucial for semantic SEO, because the way an entity is created shapes the algorithm's opinion of the given text and, finally, its topical ranking.
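The same trio has a JSON-LD analogue, which can be easier to experiment with than inline microdata. Here `@id` plays the role of itemId; all URLs in this sketch are made up for illustration:

```python
import json

# A hypothetical entity: @id identifies this very thing (itemId's role),
# sameAs points to the same entity described elsewhere, and
# additionalType adds a more specific type beyond plain Schema.org.
entity = {
    "@context": "https://schema.org",
    "@type": "Book",
    "@id": "https://example.com/books/onpage-seo#entity",
    "name": "OnPage SEO",
    "sameAs": "https://en.wikipedia.org/wiki/Search_engine_optimization",
    "additionalType": "https://www.productontology.org/id/Paper_book",
}

# Wrap the entity for embedding into a page's <head>.
markup = '<script type="application/ld+json">%s</script>' % json.dumps(entity)
```

Identity properties (`@id`, `sameAs`) say "this is the same thing as that"; `additionalType` says "this thing is also a more specific kind of type".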

Saturday, August 30, 2014

Authorship is dead? No! It just has to be implemented correctly!

The latest news is all about authorship being dead. Don't misunderstand this statement! After a thoughtful reading of this and that article, it is clear that what really died are only some actions Google performed around one very specific piece of authorship markup, namely
  1. data gathering and
  2. the partial display of the author's image and name as a byline in search results,
both based on rel=author. "Partial" because author images and names are still shown if you are logged in to your G+ account and the author is in your G+ circles.

I will not analyze the causes of dropping the processing of rel=author; that analysis is done well in the articles linked above. I would rather reassure the community that authorship is still alive and will remain so. The only question is how to implement authorship correctly. Let's look at the details:

Thursday, August 28, 2014

C for Content: part 4 of HASCH, the OnPage SEO framework


How to create relevant content

Recently I was asked about the definitive OnPage ranking factors. My first, short answer was that there is no single ranking factor, only a combination of several. Then it struck me that this question could be a very fitting framing for the article about website content, part 4 of HASCH, my OnPage SEO framework. I don't want to rob you of your time with commonplace phrases about content being "king, unique, interesting, well-written, useful, fresh, catchy, provoking, outrageous" or the like. Instead, I will explain which tangible traits your content must have to rank well. By "to rank well" I mean that the search engine algorithm will reckon your website text as topically relevant to the keywords you have chosen and used in the text. Let's ask:
  • what makes content relevant?
  • how to create relevant content?
To the answers

Thursday, August 21, 2014

S for Semantic: part 3 of HASCH, the OnPage SEO framework

The third part of my OnPage framework is dedicated to the semantics of a webpage. In this article I give an overview of the benefits and usage areas of semantic markup. Use it as a kind of cheat sheet for OnPage audits of existing websites, for the SEO strategy of a website under construction, or for preparing a website for semantic search before a relaunch.

Why semantics?

  • Search engines use semantics to improve their search results (semantic search).
  • Semantic markup makes texts better machine-readable (more exact understanding and disambiguation).
  • All the fresh kinds of search result output, such as rich snippets, the Knowledge Graph and the OneBox, are based on and populated with semantically processed information.
  • Google has accomplished a paradigm shift "from strings to things". The "thing" here is the main object of (web) semantics: an entity. In our context an entity is just a keyword or keyword phrase, but one that is interrelated with, and explicitly verifiable through, other sources.
  • There is already evidence that semantic markup pays off in rankings: only 0.3% of all sites make use of semantic markup, but 36% of the sites found in SERPs do.
A semantic website revamp is the key to being understood by machines. Quite enough reasons to scrutinize the subject! Let's go:

Thursday, August 7, 2014

How to redirect HTTPS to HTTP without certificate

Today we all officially learned from Google that an HTTPS-secured connection is now a ranking signal. Some of us will run to buy a certificate; others will try to get one for free, e.g. from StartSSL. While we don't yet know how much weight the secured connection carries as a ranking signal, another issue affects all website owners: what if somebody tries to reach your site via HTTPS, but you have no certificate installed? Any browser will raise an error, something like ERR_SSL_PROTOCOL_ERROR, and the visitor will not see your site.

What to do? If you search for an answer, you will mostly be told that no redirection is possible, because the SSL handshake happens before the actual connection, so any redirect rule comes too late. But there is a mighty little thing named .htaccess that will allow us to make our sites visible to any visitor, regardless of which URL they use to reach them. The trick is

Thursday, July 31, 2014

Is text to code ratio relevant for SEO?

  • The short answer is: no. There is no such metric or signal, nothing Google would measure.
  • The long answer is: ...yes! How? By optimizing the text-to-code ratio in favor of text, we reduce the amount of code. Less code generally means less page loading time. And loading time is very much a measurable signal, evaluated by Google and influencing rank.
So what now? The text-to-code ratio as a measurable number isn't relevant for SEO in any way. But it does matter as a first symptom of possible loading time issues caused by dispensable code inside the web document. How do we reduce the amount of website code?
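For the curious, the ratio itself is trivial to compute. A crude sketch that strips tags with a regex; a real measurement would also drop script and style contents:

```python
import re

def text_to_code_ratio(html: str) -> float:
    # Visible-text length divided by total document length:
    # a low value hints at code bloat worth investigating.
    text = re.sub(r"<[^>]+>", " ", html)
    text = re.sub(r"\s+", " ", text).strip()
    return len(text) / len(html) if html else 0.0
```

Treat the number only as the symptom described above, never as a target in itself.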

Saturday, July 26, 2014

How to relaunch a website while avoiding the main catches

How to relaunch a website keeping its ranking and position in the SERPs

Lately, as I recently noticed, the most interest in the topic "how to relaunch a website" comes from Germany, though I don't think the topic is interesting only there.

Fact is: any website relaunch means losses. Losses of traffic, of ranking, of position in the SERPs, which in the end are losses of money. I've seen losses of 4% and of 40%, and those aren't the final numbers. The main objective of SEO here is to minimize such negative impacts.

In short, each website relaunch is a change of design and/or content structure. If the content structure changes, the URL structure definitely changes with it. So there are generally two fields where the new website version could meet poor acceptance, followed by a loss of traffic:
  • users could dislike the new design,
  • search engines could get fussy about indexing the new URL structure, leaving some pages out of the searchers' reach.
So how should we act on a relaunch to come out of the battle for our visitors and search engines as winners? Look:

Thursday, July 24, 2014

How to bulk-check the PageRank of all internal pages

Google says we should generally never mind about PageRank (PR). But a substantial part of any OnPage SEO audit is clarifying whether the site structure is built properly and spreads link juice correctly. The sign of a correct site structure is: important pages inherit PR, unimportant pages don't. That is the purpose of project-wide PR measuring. Surely nobody expects to perform such measurements manually, given the sheer amount of links: a tool for such a mechanical turk's task is a must! So I thought... Damn, the search for this tool was one of those cases where I got shown yet again that the internet is full of crap, and how much I hate some SEOs!

Monday, July 21, 2014

How to create bulleted lists in Google Plus posts

Google Plus has only three documented formats: *bold*, _italic_ and -strikethrough-. But it's a proven fact that one of the best ways to deliver information and to increase its visibility is to structure it into lists.

Fine, with ordered lists there is no problem: you number each list line with bold ascending numbers, *1. * and so on. But what about unordered, bulleted lists? How do you make bullets in Google Plus posts? For this purpose we utilize the computer's own capability of typing and printing Unicode characters. This capability is limited, but it is fully sufficient for creating bulleted lists in Google Plus posts. Another valuable use of Unicode characters is, of course, the design of page titles and page descriptions, which are used as snippets in the SERPs, and of ad texts. Dressing up such text assets with Unicode special characters boosts their CTR enormously. More on that at the end of the article. Now let's create bulleted lists in Google Plus posts:
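The trick boils down to prefixing each line with a printable Unicode bullet character; a tiny helper (the character choices are just examples):

```python
# A few Unicode characters that render nicely as list bullets.
BULLETS = {
    "disc": "\u25CF",    # black circle
    "circle": "\u25CB",  # white circle
    "square": "\u25AA",  # small black square
    "arrow": "\u27A4",   # black rightwards arrowhead
}

def bulleted(lines, bullet="disc"):
    # Prefix every line with the chosen bullet, ready to paste
    # into a Google Plus post.
    mark = BULLETS[bullet]
    return "\n".join("%s %s" % (mark, line) for line in lines)
```

Paste the output directly into the post editor; the characters survive because they are plain text, not formatting.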

Friday, July 4, 2014

A for Architecture: part 2 of HASCH, the OnPage SEO framework

What does an SEO mean when talking about architecture? What kind of architecture matters for SEO?

An SEO must clearly distinguish two kinds of website architecture:
  • site-wide architecture,
  • page-wide architecture.
Both site-wide and page-wide architecture have their own rules to obey. I will formulate these rules flexibly and adaptably enough to apply to any content or ecommerce project.

Wednesday, July 2, 2014

How to SEO PDF files

If your task is to optimize a PDF file for search engines, you must ensure that the PDF is text-sourced, not image-sourced. To achieve this, create your PDF with a text editor such as LibreOffice / OpenOffice or the like, and never with an image editor like Photoshop.

The SEO-for-PDF procedure isn't tricky, but the quality of the optimization depends vitally on your general HTML and SEO knowledge:
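Whether a given PDF is text-sourced can be spot-checked without opening it: text-sourced PDFs embed font resources, while scans usually embed only images. A crude byte-level heuristic, not a real PDF parser:

```python
def pdf_has_text_layer(data: bytes) -> bool:
    # Text-sourced PDFs declare /Font resources; purely scanned PDFs
    # typically contain only image XObjects. A heuristic first sniff,
    # not a substitute for a full PDF parse.
    return data.startswith(b"%PDF") and b"/Font" in data
```

If the check fails, the file almost certainly needs to be regenerated from a text source before any further optimization makes sense.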

Friday, June 27, 2014

H for Header: part 1 of the HASCH OnPage SEO framework

Every website begins with the header. There are no minimal or required values inside it: the header's content depends entirely on our goals and our laziness:) Another cause of poorly designed headers is the opinion that Google doesn't understand meta tags or doesn't take them as a ranking factor. Well, Google says it understands just a small set of the available meta tags; here is Google's statement about its understanding of meta tags. But I say:
  • don't limit yourself to the meta tags Google definitely understands,
  • use meta tags extensively,
  • be redundant in your meta tag usage.
Why? Simply because the web contains much more than just Google and you. There are a great many bots and spiders on the internet whose goal is to parse web content and build various indexes. Their parsing criteria may include any of the existing meta tags, and their parsing results may in turn be parsed by Googlebot. This way, Google gets into its index content from the very meta tags it claims not to understand.

Good, now we agree about the benefits of using meta tags. The main SEO point is utilizing the header to give the search bot as much information about our site as possible. Now I will list the meta tags one by one and give the correct syntax, some possible values, and my view of the practical SEO effects.