Friday, 11 August 2017

Three perfect image optimizers for SEO

What makes a perfect image optimization tool for SEO?

For me, the perfect image optimization tool must have the following features:
  • It is not some hipster online tool, but good old installable software,
  • It is genuinely free - not shady shareware that only claims to be free,
  • The user can set the optimization level,
  • It must do batch image compression - real SEO ninjas don't have much time,
  • It must, no - it MUST understand nested folders and
  • it MUST be able to save the optimized images in the same nested folder/subfolder structure,
  • It MUST, again, be able to save optimized images without changing the file type,
  • It should be able to resize images while preserving the width/height ratio.
  • And, of course, it must do its compression job well ;)
Do I want too much? No, I don't believe that's too much to ask.
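If you prefer to script the job yourself, here is a minimal sketch of what such batch compression could look like, assuming Python with the Pillow library; the folder names, quality level and maximum width are illustrative placeholders, not a recommendation of any particular tool.

    # Minimal batch-compression sketch (assumes Pillow: pip install Pillow).
    # Paths, quality and max width are illustrative placeholders.
    from pathlib import Path
    from PIL import Image

    SRC = Path("images_original")    # source tree with nested subfolders
    DST = Path("images_optimized")   # optimized copies, same structure
    QUALITY = 80                     # adjustable optimization level
    MAX_WIDTH = 1600                 # resize limit, aspect ratio preserved

    for src_file in SRC.rglob("*"):
        if src_file.suffix.lower() not in {".jpg", ".jpeg", ".png"}:
            continue
        dst_file = DST / src_file.relative_to(SRC)   # keep the folder structure
        dst_file.parent.mkdir(parents=True, exist_ok=True)
        with Image.open(src_file) as img:
            if img.width > MAX_WIDTH:
                ratio = MAX_WIDTH / img.width
                img = img.resize((MAX_WIDTH, int(img.height * ratio)))
            # same file type as the original, just recompressed
            img.save(dst_file, quality=QUALITY, optimize=True)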

I will not explain why image compression is substantially important for your website's rankings. Just one sentence: images are the heaviest static asset type on ANY website. You still don't want to compress them? No problem, as you wish. But why are you still here?

For the non-ignoramuses among us: yes, such tools actually exist - they aren't just the stuff of SEO fairy tales. I even know three tools that can accomplish every single task I listed above. Want to know more? Let's go →

Friday, 4 August 2017

HowTo guide: filter Search Console for multiple values at once

Setting multiple filters in Google Search Console is possible!


Yes, despite contrary answers from some experts in the Google product forum, like here or there. The ability to set multiple filters is extremely useful and works in both directions - you can include or exclude multiple keywords, URLs, traffic sources - anything you want.

We will manage the filter settings by manipulating URLs. For this you'll need a good code editor - good meaning Notepad++ - one magic plugin for it and, of course, a bit of chutzpah - yeah, we're taking on the big G, nothing less!
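To give a flavor of the approach, here is a rough sketch of the kind of URL manipulation involved: several filter values are percent-encoded and joined into a single query string. The base URL and the parameter name below are purely hypothetical placeholders - the real parameter names and separators have to be copied from a filter URL that Search Console itself generates.

    # Hypothetical sketch of building a multi-value filter URL by hand.
    # The base URL and the parameter name are placeholders, NOT the real
    # Search Console parameters - copy those from a URL the tool generates.
    from urllib.parse import quote

    base_url = "https://example.com/search-analytics"   # placeholder
    keywords = ["seo tools", "image compression", "tf-idf"]

    # join the values and percent-encode the result, just as you would do
    # manually in Notepad++ before pasting the URL back into the browser
    query_filter = quote(",".join(keywords))
    filter_url = f"{base_url}?query_filter={query_filter}"
    print(filter_url)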

Saturday, 3 June 2017

Free online SEO tools for TF-IDF calculation

First of all: what is TF-IDF? In German this content metric is called WDF/IDF, but the subject is the same:

TF-IDF, term frequency–inverse document frequency, is, briefly explained from rookie to rookie, the ratio of a term's usage frequency in a given document to that term's usage frequency across all documents containing it.

This ratio reflects how relevant the given document is for the given term within the set of all documents containing that term.
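In formula form (a common textbook variant; TF-IDF tools differ slightly in the exact weighting they use):

    \mathrm{tfidf}(t, d) = \mathrm{tf}(t, d) \cdot \log\frac{N}{\mathrm{df}(t)}

where tf(t, d) is how often term t occurs in document d, N is the total number of documents and df(t) is the number of documents containing t.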

TF-IDF is the successor of keyword density. Some non-demented SEO old-timers may still remember what keyword density means: the number of times a term is used in the text, divided by the total number of words in the text and multiplied by 100. This formula is in reality both the most used and the wrong one.
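As a quick illustration of the difference, here is a minimal Python sketch that computes both numbers for a made-up three-document corpus; the texts and the term are invented for the example.

    # Toy comparison of keyword density vs. TF-IDF (invented example texts).
    import math

    docs = [
        "seo tools help with image compression and seo audits",
        "image compression reduces page weight",
        "tf-idf replaces keyword density in modern seo",
    ]
    term = "seo"
    words = docs[0].split()

    # keyword density: term count / total words * 100
    density = words.count(term) / len(words) * 100

    # TF-IDF: term frequency weighted by inverse document frequency
    tf = words.count(term) / len(words)
    df = sum(1 for d in docs if term in d.split())
    tfidf = tf * math.log(len(docs) / df)

    print(f"keyword density: {density:.1f}%  tf-idf: {tfidf:.4f}")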

Well, I'm not your Wiki, and if you are of legal age and not banned by Google, you will quickly find everything you need to know about keyword density and TF-IDF. I'll just say this: nowadays TF-IDF is pretty much the only numeric parameter of content quality. But this article is about something else - I promised to share some freebies, right?

Wednesday, 28 October 2015

How to prevent negative SEO impacts caused by poor HTML quality

The question arises over and over again: whether and how HTML markup can negatively impact SEO. Googlebot is indeed a smart HTML interpreter:
  • it has a high tolerance for HTML syntax errors,
  • it doesn't force websites to comply with W3C validation rules.
Nevertheless, there are some HTML misuses that can painfully hurt SEO. To open the topic, I refer to a pair of posts by two respectable Googlers, and by commenting on those posts I list the HTML issues that cause negative SEO effects:
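As a rough illustration of the kind of markup check this implies, here is a minimal Python sketch (assuming the requests and beautifulsoup4 packages, and a placeholder URL) that flags two well-known pitfalls: an accidental noindex robots meta tag and duplicate title elements.

    # Rough markup sanity check (assumes requests + beautifulsoup4).
    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://example.com/", timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # an accidental noindex keeps the page out of the index entirely
    robots = soup.find("meta", attrs={"name": "robots"})
    if robots and "noindex" in robots.get("content", "").lower():
        print("Warning: page carries a noindex robots meta tag")

    # more than one <title> element muddles snippets and relevance signals
    titles = soup.find_all("title")
    if len(titles) != 1:
        print(f"Warning: found {len(titles)} <title> elements, expected 1")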

Tuesday, 27 October 2015

Solution: how to keep Google from crawling and indexing non-existing pages

Many webmasters are affected by a weird issue: Google indexes (or at least crawls) non-existing URLs. The issue doesn't depend on whether one uses WordPress or another CMS. The question of why Google crawls and/or indexes non-existing URLs comes up in all webmaster forums, Google Groups and so on, but without a clear solution.

The fact that Googlebot invents and crawls a bunch of non-existing URLs raises a few questions:

  • Where do non-existing URLs come from?
  • Why is it not optimal if non-existing URLs are crawled or even indexed?
  • How can the risks related to non-existing URLs be minimized?
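As a starting point for the third question, here is a minimal sketch of how Googlebot hits on non-existing URLs could be pulled out of a standard access log; the log path and the combined log format are assumptions about the server setup.

    # Sketch: list URLs that Googlebot requested but the server answered with 404.
    # Assumes an Apache/Nginx "combined" log format and a placeholder log path.
    import re
    from collections import Counter

    LOG_FILE = "access.log"   # placeholder path
    pattern = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" (\d{3})')

    not_found = Counter()
    with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            match = pattern.search(line)
            if match and match.group(2) == "404":
                not_found[match.group(1)] += 1

    # the most frequently crawled phantom URLs hint at where they originate
    for url, hits in not_found.most_common(20):
        print(f"{hits:5d}  {url}")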