Wednesday, June 17, 2020

Merging and advanced filtering of GSC and GA data with Search Console Helper

<tl;dr>

Search Console Helper merges GSC and GA data and applies multiple condition filters

This is not an advertisement!
It is a friendly recommendation to all of my colleagues - the tool is outstanding, and I have been waiting for something like it for as long as I have been doing SEO!

Search Console Helper allows you, among other things:
  • to work with up to 5,000 daily data rows,
  • to work with 24 months of GSC data (not a maximum of 16, as GSC allows),
  • to filter GSC data with multiple conditions for keywords and URLs - there are include and exclude filters, with support for regular expressions,
  • one-click merging of GSC and Google Analytics data,
  • to export data any way you need,
  • and much more unique stuff you can't imagine ;)
</tl;dr>
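
For colleagues who like to script such things themselves: the same kind of multi-condition filtering is also available through the Search Console API. Here is a minimal sketch with google-api-python-client - the site URL, dates and filter expressions are placeholders, and I assume you have already obtained OAuth credentials:

    from googleapiclient.discovery import build

    # Minimal sketch: multi-condition GSC filtering via the Search Analytics API.
    # Placeholders: site URL, dates, filter expressions. Credentials must be
    # obtained beforehand (e.g. with google-auth-oauthlib).
    def query_gsc(credentials, site_url="https://www.example.com/"):
        service = build("webmasters", "v3", credentials=credentials)
        body = {
            "startDate": "2020-01-01",
            "endDate": "2020-06-17",
            "dimensions": ["query", "page"],
            # several conditions combined with AND - include and exclude at once
            "dimensionFilterGroups": [{
                "groupType": "and",
                "filters": [
                    {"dimension": "query", "operator": "contains",
                     "expression": "seo"},
                    {"dimension": "page", "operator": "notContains",
                     "expression": "/tags/"},
                ],
            }],
            "rowLimit": 5000,  # up to 5,000 rows per request
        }
        return service.searchanalytics().query(siteUrl=site_url,
                                               body=body).execute()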

Let us look at what is happening with Google Search Console, formerly Google Webmaster Tools, and the market around it.

Monday, June 1, 2020

Adsterra aka Adsetica Ad Network test: 100% fraud traffic, don't touch it!

I selected Adsterra, also known as Adsetica, as an ad network to test from the publisher role, because of their claim to run CPA campaigns for Chrome extensions. Their campaign turned out to be 100% fraud - read on to find out why.

Tuesday, October 16, 2018

Google Ads manager account (MCC) hacks your Chrome browser and shares your personal data

I suddenly noticed a curious bug/feature of the current stable Chrome 69.0.3497.100 (Official Build) (64-bit) and the Google Ads manager account:


  1. If several people log in to the MCC one after another, each from their own Chrome,
  2. If syncing of personal data is ON in their Chrome,
  3. If none of them has set a sync passphrase...


then:

the personal data (bookmarks, passwords - yes, passwords!) of the user who currently logs in to the MCC ends up in the Chrome of the user who was logged in to the MCC before.

Monday, August 27, 2018

Testing structured data of the current URL with one click

test structured data with one click
Any SEO knows about the benefits of structured data. Google offers a dedicated Structured Data Testing Tool to check the existence and quality of structured data on a given URL.

But what if you want to test many URLs? Copy a URL, switch tabs, go to the Google Structured Data Testing Tool, paste the URL, press Enter. And again, and again...? Annoying!

I've coded a solution for this kind of routine task.
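
The solution itself follows in the post; to illustrate the batch idea, here is a tiny sketch that opens the testing tool for a whole list of URLs in one go. It assumes the tool accepts the target address in its #url= fragment - verify that before relying on it:

    import time
    import webbrowser
    from urllib.parse import quote

    # Sketch: open the Structured Data Testing Tool for many URLs at once.
    # Assumption: the tool reads the target address from its "#url=" fragment.
    TESTING_TOOL = "https://search.google.com/structured-data/testing-tool#url="

    urls_to_test = [
        "https://www.example.com/",
        "https://www.example.com/products/",
    ]

    for url in urls_to_test:
        webbrowser.open_new_tab(TESTING_TOOL + quote(url, safe=""))
        time.sleep(1)  # don't spawn all tabs in the same instant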

Monday, May 28, 2018

What's wrong with image dimensions?

Yes, what's wrong? I tell you - quite a lot is wrong. I completed a little study for our firm on how well images are optimized in the era of mobile first. And you know what? I was disgusted twice:
  1. roughly 85% of images are oversized - this is no dark secret, we have gotten used to it by now,
  2. roughly 20% of images are oversized in terms of dimensions: that is, 20% of original images are larger than the dimensions at which they are displayed.
Read on for the study setup, detailed results, toolchain, and scripts.
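
The scripts are behind the link; the core dimension check, though, fits in a few lines. A sketch with Pillow - you supply the display dimensions yourself (e.g. from the img tag or measured in the browser), and the file name is illustrative:

    from PIL import Image  # pip install Pillow

    # Sketch: flag images whose intrinsic size exceeds their display slot.
    def is_oversized(path, display_w, display_h):
        with Image.open(path) as img:
            width, height = img.size
        return width > display_w or height > display_h

    # Example: a 1600x1200 original rendered in a 400x300 slot is oversized.
    print(is_oversized("hero.jpg", 400, 300))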

Friday, March 16, 2018

German Google News kicks asses

According to the German version of the Google News Help:

  • Google News doesn't exist for Germany (Germany isn't in the list of countries where Google News is available),
  • there is a country in Asia called Amsterdam, as on the screenshot:


Google, please, don't outsource to India!

Monday, March 5, 2018

Screaming Frog 9.0 exports screenshots, original and rendered HTML code

If the JavaScript rendering and HTML storing/rendering options are activated, you are able to export screenshots of rendered URLs, as well as their original and rendered HTML code:

Export screenshots and source code of crawled urls in Screaming Frog 9.0

Screaming Frog 9.0 renders websites with Chrome 60

The new but not obvious thing in Screaming Frog 9.0 is the rendering engine. You should know: it isn't Chrome 41 (41.0.2272.118), which is used by Google's Web Rendering Service (WRS). This means Screaming Frog can't render websites 100% like Googlebot - but approximately identically ;)

It is Chrome 60 (60.0.3112.113) instead. This Chrome version was chosen by the Screaming Frog developers as more stable AND with the fewest rendering differences from Chrome 41. You can compare both versions in detail at https://caniuse.com/#compare=chrome+41,chrome+60.

According to Screaming Frog support, Chrome 60 was selected for its stability with the spider engine at scale, rather than because of rendering differences.

You can see the current Chrome version in the debug window:

Screaming Frog 9.0 uses Chrome 60 as rendering engine. Information about Chrome version is placed in the debug window.

Wednesday, October 4, 2017

How to disable Excel preview in Windows Explorer on Windows 7 and Windows 10.

Previewing Excel tables, especially bigger ones, can be pretty hard on system performance. Some resources recommend using the Windows Explorer settings - but that way one can only disable ALL previews.

Every file extension has a Preview Handler associated with it. To disable only Excel previews, but keep all others that aren't so performance-hungry, like previews of images or PDFs, one should edit (rather than delete) one key in the Windows registry.


  • Start the registry editor: Start → Run → regedit
  • Under HKEY_CLASSES_ROOT\.xlsx\ShellEx\ find the key called {8895b1c6-b41f-4c1c-a562-0d564250836f}
  • Edit it (I don't recommend deleting it), as on the screenshot:

How to disable Excel preview in Windows Explorer under Windows 7 / Windows 10

  • Close regedit,
  • Restart Windows Explorer: Start → Run → type cmd → Enter → type taskkill /f /im explorer.exe → Enter → type explorer.exe → Enter,
  • Enjoy (tested on Windows 7 and Windows 10).
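
If you have to roll this out on several machines, the same edit can be scripted. A sketch with Python's stdlib winreg, mirroring the manual steps above - it blanks the handler's default value instead of deleting the key; run it with sufficient rights and note the old value for undo:

    import winreg

    # Sketch: blank the Excel preview handler's default value (don't delete
    # the key), mirroring the manual regedit steps above. Needs admin rights.
    KEY_PATH = r".xlsx\ShellEx\{8895b1c6-b41f-4c1c-a562-0d564250836f}"

    with winreg.OpenKey(winreg.HKEY_CLASSES_ROOT, KEY_PATH, 0,
                        winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
        old_value, _ = winreg.QueryValueEx(key, "")
        print("Previous handler CLSID:", old_value)  # keep this for undo
        winreg.SetValueEx(key, "", 0, winreg.REG_SZ, "")  # disable the preview

Restart Windows Explorer afterwards, as in the last step above.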

Friday, August 11, 2017

Three perfect image optimizers for SEO

What makes a perfect image optimization tool for SEO?

best image optimizing tools for seo
For me, the perfect image optimization tool must have the following features:
  • This image optimizer is not some hipster online tool, but good old installable software,
  • This image optimizer is free - not some shitty shareware that claims to be free but indeed isn't,
  • The user can set the optimization level,
  • The tool must do batch image compression - real SEO ninjas don't have much time,
  • This image optimizer must, no - it MUST understand nested folders, and
  • it MUST be able to save optimized images in the same nested folder/subfolder structure,
  • This tool, again, MUST be able to save optimized images without changing the file type,
  • This tool should be able to resize images while preserving the width/height ratio,
  • And, of course, it must do its compression job well ;)
Do I want too much? No, I believe it isn't too much.
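
By the way, every requirement on this list can also be scripted in a few lines. A minimal Pillow sketch for comparison - paths, size bound and quality level are illustrative choices:

    import os
    from PIL import Image  # pip install Pillow

    # Sketch: batch-compress a nested folder tree, preserving structure,
    # file type and aspect ratio. Paths and settings are illustrative.
    SRC, DST = "originals", "optimized"
    MAX_SIZE = (1920, 1920)   # resize bound; thumbnail() preserves the ratio
    QUALITY = 80              # adjustable optimization level

    for root, _dirs, files in os.walk(SRC):
        for name in files:
            if not name.lower().endswith((".jpg", ".jpeg", ".png")):
                continue
            src_path = os.path.join(root, name)
            dst_path = os.path.join(DST, os.path.relpath(src_path, SRC))
            os.makedirs(os.path.dirname(dst_path), exist_ok=True)
            with Image.open(src_path) as img:
                img.thumbnail(MAX_SIZE)              # keeps width/height ratio
                img.save(dst_path, quality=QUALITY)  # same format, same extension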

I will not explain why image compression is substantially important for your website's rankings. Just one sentence: images are the heaviest static asset type on ANY website. You still don't want to compress them? No problem, as you wish. But why are you still here then?

For the non-ignoramuses among us: yes, such tools actually exist - they aren't just the stuff of SEO fairy tales. I know three tools that accomplish every single task I listed above. Want to know more? Let's go →

Friday, August 4, 2017

HowTo guide: filter Search Console for multiple values at once

Setting multiple filters in Google Search Console is possible!

how to setup multiple filters in search console

Yes, contrary to various answers from some experts in the Google product forums, like here or there. The ability to set multiple filters is extremely useful and works in both directions - you can include or exclude multiple keywords, URLs, traffic sources - anything you want.

We will manipulate the filter settings through the URLs. For this you'll need a good code editor - good means Notepad++ - one magic plugin for it, and, of course, a bit of chutzpah - yeah, we'll take on the big G, nothing less!
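
The full recipe follows in the post; the mechanical part is easy to sketch. Suppose you have captured one filtered URL from your own GSC session and replaced the filter value with a {value} placeholder - then producing one URL per keyword is just encoding. The template below is a stand-in, not the real GSC URL format:

    from urllib.parse import quote

    # Stand-in template - capture the real URL from your own GSC session
    # and put {value} where the filter value sits.
    CAPTURED_TEMPLATE = "https://www.google.com/webmasters/...filter={value}..."

    keywords = ["seo tools", "tf-idf", "screaming frog"]

    for keyword in keywords:
        print(CAPTURED_TEMPLATE.format(value=quote(keyword, safe="")))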

Saturday, June 3, 2017

Free online SEO tools for TF-IDF calculation

Free online tools for TF-IDF calculation
First of all: what is TF-IDF? In German this content metric is called WDF/IDF, but the subject is the same:

TF-IDF, term frequency-inverse document frequency, is, explained in short, from rookie to rookie, the ratio of a term's usage frequency in a given document to the usage frequency of this term across all documents containing it.

This ratio reflects how relevant the given document is for the given term among all documents containing this term.

TF-IDF is the successor of keyword density. Some not-yet-demented SEO veterans may remember what keyword density means: the number of occurrences of a term in the text, divided by the total number of words in the text, multiplied by 100. This formula is in reality both the most used and wrong.
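
To make the difference concrete, here is a tiny pure-Python sketch of both metrics, using the classic logarithmic IDF variant (weighting schemes differ in detail):

    import math

    # Sketch: keyword density vs. TF-IDF (classic log-IDF variant).
    def keyword_density(term, words):
        # occurrences / all words * 100 - the old, flawed metric
        return words.count(term) / len(words) * 100

    def tf_idf(term, doc_words, corpus):
        tf = doc_words.count(term) / len(doc_words)   # term frequency
        df = sum(term in doc for doc in corpus)       # document frequency
        return tf * math.log(len(corpus) / df) if df else 0.0

    corpus = [["seo", "tools", "seo"], ["seo", "news"], ["cat", "pictures"]]
    doc = corpus[0]
    print(keyword_density("seo", doc))  # 66.67
    print(tf_idf("seo", doc, corpus))   # 2/3 * ln(3/2) ~ 0.27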

Well, I'm not your wiki, and if you're of legal age and not banned by Google, you'll quickly find everything you need to know about keyword density and TF-IDF. I'll just say: nowadays TF-IDF is nearly the only numeric parameter of content quality. But this article is about something else - I promised to share some freebies, right?

Wednesday, October 28, 2015

How to prevent negative SEO impacts caused by poor HTML quality

Poor HTML impacts SEO
The question arises over and over again: whether and how HTML markup could negatively impact SEO. Googlebot is indeed a smart HTML interpreter:
  • it has a high tolerance for HTML syntax errors,
  • it doesn't force websites to comply with W3C validation rules.
Nevertheless, there are some HTML misuses that can painfully hurt SEO. To open this topic, I refer to a pair of posts by two respectable Googlers, and by commenting on those posts I list the HTML issues that cause negative SEO effects:
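
The commented issue list follows in the full post. As a taste of how such checks can be automated, a small sketch with BeautifulSoup - the two checks are my own illustrative examples, not the Googlers' list:

    from bs4 import BeautifulSoup  # pip install beautifulsoup4

    # Sketch: flag a couple of commonly cited markup problems.
    def quick_html_checks(html):
        soup = BeautifulSoup(html, "html.parser")
        issues = []
        if len(soup.find_all("title")) != 1:
            issues.append("page should have exactly one <title>")
        if any(not img.get("alt") for img in soup.find_all("img")):
            issues.append("images without alt text found")
        return issues

    print(quick_html_checks("<html><head></head>"
                            "<body><img src='x.png'></body></html>"))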

Tuesday, October 27, 2015

Solution: how to keep Google from crawling and indexing non-existing pages

Many webmasters are affected by a weird issue: Google indexes (or at least crawls) non-existing URLs. The issue doesn't depend on whether one uses WordPress or another CMS. The question of why Google crawls and/or indexes non-existing URLs appears in all webmaster forums, Google Groups, and so on, but without a clear solution.

The fact that Googlebot invents and crawls a bunch of non-existing URLs raises some questions:

  • Where do non-existing URLs come from?
  • Why is it not optimal if non-existing URLs are crawled or even indexed?
  • How can the risks related to non-existing URLs be minimized?
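
A good starting point for the first question is your server access log. A sketch that extracts the 404 URLs Googlebot actually requested, assuming the common Apache combined log format - adjust the pattern to your own log layout:

    import re
    from collections import Counter

    # Sketch: list 404 URLs requested by Googlebot, assuming the Apache
    # combined log format; adjust the regex to your own log layout.
    LINE = re.compile(r'"(?:GET|HEAD) (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

    phantom_urls = Counter()
    with open("access.log", encoding="utf-8", errors="replace") as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            match = LINE.search(line)
            if match and match.group("status") == "404":
                phantom_urls[match.group("url")] += 1

    for url, hits in phantom_urls.most_common(20):
        print(hits, url)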

Monday, October 26, 2015

Making the decision: hosting JavaScript libraries on an external CDN or locally

whether to host javascript libraries locally or at an external javascript cdn

The benefits of hosting common JavaScript libraries externally on a JavaScript CDN are well known - I list them just briefly: external hosting of JavaScript libraries
  • reduces your own server's traffic,
  • makes parallel loading possible.
The first question is: where to host? Directly on the library vendor's site? Or somewhere else? There are mainly two factors that drive the decision of where to host our JavaScript libraries:
  • Speed
  • Popularity
The second question is more general: host externally at all? Or locally?
Let's look at some details that help to choose an optimal public host for common JavaScript libraries.

tl;dr:
  • If you definitely want to host JavaScript libraries externally, host them at Google Hosted Libraries - this gives you the highest chance that your visitors already have them in cache and don't need to download them again.
  • If in doubt, better host your JavaScript libraries locally - visitors who already have the JavaScript libraries in cache are very few, and this way the average visitor gets the data the fastest.
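
The "Speed" factor from the list above is easy to measure for your own setup. A rough sketch timing one download of the same library from Google Hosted Libraries versus your own server - the local URL is a placeholder, and single-shot timings ignore caching, so treat the numbers as an indication only:

    import time
    import urllib.request

    # Rough sketch: time one download of the same library from two hosts.
    def fetch_ms(url):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        return (time.perf_counter() - start) * 1000

    cdn = "https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js"
    local = "https://www.example.com/js/jquery.min.js"  # placeholder

    for url in (cdn, local):
        print(f"{fetch_ms(url):7.1f} ms  {url}")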

Monday, July 27, 2015

After-Panda SEO for intermediary businesses

After Panda SEO for intermediary business
What we definitely know about the Phantom and Panda updates:
  • Phantom and Panda updates are about on-page quality, whatever that might mean ;)
  • The fraction of duplicated content on a given page is one of the most important factors that can rank a page down
  • Duplicated content can be easily measured
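
On the third point: one common way to measure it (not necessarily Google's) is word n-gram shingles plus Jaccard overlap. A minimal sketch:

    # Sketch: duplicate-content share via word shingles and Jaccard overlap -
    # a common measuring approach, not necessarily the one Google uses.
    def shingles(text, n=3):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    def jaccard(a, b):
        sa, sb = shingles(a), shingles(b)
        return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

    page = "the quick brown fox jumps over the lazy dog"
    copy = "the quick brown fox jumps over a sleeping cat"
    print(f"{jaccard(page, copy):.0%} shingle overlap")  # 40%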

SERPs disintermediation - Google battles intermediaries

There is a Federal Trade Commission report about how Google misuses its dominance to kick intermediary players out of some search verticals, like comparison sites or aggregated shopping, where sites sell products from different manufacturers.

In Google's view, intermediary businesses are poachers stealing Google's money. In Google's view, the SERP is the place for users to see original manufacturers or direct service providers. The SERP should not be the place for intermediary services, because they are secondary. And the sign of being secondary is easy to measure: it is the presence and proportion of duplicated content.

The intermediary job of comparing and aggregating products and user opinions should belong only to Google, because only Google is good and honest, and, last but not least, it doesn't offer duplicated content - it's just a search engine, isn't it?
Google is a strong rival, playing by its own rules. But do you still want to survive this battle?

Wednesday, February 11, 2015

SEOTools for Excel: solutions for a lost installation folder and a disappeared ribbon

My installation of SEOTools for Excel on Windows 7 x64 / Excel x32 didn't want to cooperate with me from the first step. At first, Excel refused to open seotools.xll properly - it always took it for a text file. Then, after trying to install the SEOTools x64 version as an add-in, it wasn't visible in the ribbon at all, but didn't want to be uninstalled either. I was forced to delete it the hard way. Then, on trying to install the SEOTools x32 version, I came pretty close to success: I got the start splash screen from SEOTools, but then an error alert appeared: "The Ribbon/COM Add-in helper required by add-in SeoTools could not be registered. This is an unexpected error. Error message: Exception has been thrown by the target of an invocation." And nothing more.

After some investigation it became clear that the problem lay in the mismatched versions of the machine (x64), Windows 7 (x64), and Excel 15 (x32). BTW, if you need to find out what is installed on your machine - here is a list of all the places where you can get the necessary information about your hardware, OS, and Excel.
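
To save you clicking through dialogs, the relevant facts can also be read programmatically. A sketch: OS architecture from the stdlib, Office bitness from the registry - the Bitness value under the Outlook key is commonly documented to reflect the whole Office installation, and "15.0" (Office 2013) is an assumption to adjust to your version:

    import platform
    import winreg

    # Sketch: read OS architecture and Office bitness. The Outlook "Bitness"
    # value is commonly documented to reflect the whole Office installation;
    # adjust "15.0" (Office 2013) to your installed version.
    print("OS architecture:", platform.machine())  # e.g. AMD64

    key_path = r"SOFTWARE\Microsoft\Office\15.0\Outlook"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
        bitness, _ = winreg.QueryValueEx(key, "Bitness")
        print("Office bitness:", bitness)  # "x86" or "x64"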

Tuesday, January 13, 2015

SEO query string universal solution

universal solution for seo troubles caused by urls with query strings
URLs with query strings can be real poison for SEO. The main and most harmful damage untreated query-string URLs do is an incalculable rise in the number of URLs with the same content, HTTP response code 200, and no indexing management - also known as duplicated content. Another issue caused by query strings in URLs is overspending crawl budget on URLs that would better be excluded from crawling and indexing.

This way, a site with untreated query-string URLs gets, on the one hand, URLs into the index that don't belong there; on the other hand, the crawl budget for good URLs may run short because it is overspent.

There are some passive techniques to deal with query strings in URLs. I actually planned to publish the existing techniques and my solution for the SEO problems caused by query strings as part of my ultimate htaccess SEO tutorial, but the topic grew too detailed, so I decided to write a separate article about query strings in URLs and SEO.

Existing ways of dealing with query strings in URLs

  • While Google says it can deal with query strings in URLs, it recommends adjusting the bot's settings in Webmaster Tools for each existing query parameter.
  • URLs with query strings could be disallowed in the robots.txt with a rule like
    Disallow: /?*
    Disallow: /*?
    
  • If the head section of the HTML or PHP files served under query-string URLs can be edited, it is possible to add rules for indexing management and URL canonicalization, like
    <meta name="robots" content="noindex, nofollow">
    <link href="Current URL, but without query string" rel="canonical">
    
These methods are mainly manual, require an unpredictable workload, and solve the problems only partly. But the good news is: I have a universal solution that works for all URLs with query strings and gets rid of all the SEO troubles caused by query strings in URLs.
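
My universal solution itself is htaccess-based and follows behind the link. To illustrate just one conceivable server-side variant of the same goal - automatically excluding every query-string URL from indexing via an X-Robots-Tag header - here is a minimal sketch as Python WSGI middleware (an illustration of the principle, not the htaccess solution itself):

    # Sketch of the principle only (the actual solution in the article is
    # htaccess-based): send "X-Robots-Tag: noindex" for every query-string URL.
    def noindex_query_strings(app):
        def middleware(environ, start_response):
            def patched_start(status, headers, exc_info=None):
                if environ.get("QUERY_STRING"):
                    headers = list(headers)
                    headers.append(("X-Robots-Tag", "noindex, nofollow"))
                return start_response(status, headers, exc_info)
            return app(environ, patched_start)
        return middleware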

Friday, December 19, 2014

Freebase shuts down! Your free, Google-proof way into the Knowledge Graph closes on 31.03.2015

Freebase closes
Everybody who has tried it knows how possible and easy it is to publish an article about a business on Wikipedia, especially if the business is far from being in the Fortune 500, or Fortune 5k, or even Fortune 500k ;) But small, local businesses, and individual entrepreneurs too, have an absolutely legitimate wish and need to get their own websites into the Knowledge Graph...

The only free, public way to create a Google-proof web entity is (not for much longer) a Freebase entry. Well, smart people create at least two entries simultaneously: the first for the person, and the second for the business entity, with the person entity as the author of the business entity. I wrote an article about entity creation with a Freebase entry, which was widely shared and liked. But today is a day for bad news:

Freebase will close very soon!

From 31.03.2015 on, Freebase will be available in read-only mode: no more new entries, no more edits. The Freebase database will then be integrated into Wikidata. I am referring to yesterday's post from the Freebase Google Plus account, where the closing and integration roadmap is described in detail. In mid-2015, Freebase will be shut down as a standalone project. But what does putting Freebase out of service mean for all of us who want to appear in the Knowledge Graph but don't have enough mana to appear on Wikipedia?

Sunday, November 23, 2014

Facebook changed its ad targeting rules last Friday... and becomes the undisputed moron champion

Earlier it was possible to target an ad audience by interest alone, without being forced to select any other option. Since last Friday one must select... "at least one country"!!! And an ad can be targeted at a maximum of 25 countries.

So now, if one wants to target an ad at the whole world, independent of country, by interest only - one is forced to set up 8 ads instead of one (196 countries at 25 per ad) and select all 196 existing countries manually.

Now, is this not ultra-moronic?
