- The short answer is: no. There is no such metric or signal, nothing that Google would measure directly.
- The long answer is: ...yes! How? By optimizing the text-to-code ratio in favor of text, we reduce the amount of code. Less code generally means a shorter page loading time, and loading time is very much a measurable signal that Google evaluates and that influences rankings.
Disclaimer
Optimizing site loading time in general is not the topic of this article, not least because every site is unique. To get tailored advice on your site's loading issues, analyze it with a service like GTmetrix (http://gtmetrix.com/): it's free and combines both Google's Page Speed and Yahoo!'s YSlow analyses.
All of the following advice comes with an implicit condition: "if possible". Keep that condition in mind and don't adopt every recommendation blindly!
How to minimize your page's code size and let search bots reach your content immediately
General objectives
- Mantra number 1: the main content loads above the fold;
- Mantra number 2: the "ready to read" time should target 200-300 ms if the page's keywords are moderately to highly competitive.
To-do's
- Keep source-ordered content in mind: the main content comes first in the markup (see the first sketch after this list)!
- No inline CSS styles;
- No inline JavaScript;
- No inline images;
- Put most script tags just before the closing body tag, not into the head area;
- Just before deploying your project to production, minify, compress and gzip everything: JavaScript, CSS and HTML. There are plenty of free and commercial minifying, compressing and gzipping tools, but always test the output: minified/compressed code can break if a tool uses overly aggressive settings.
- Trim the protocol from URLs: they still work without "http://" (see the second sketch after this list);
- Don't close tags that don't need to be closed (depending on the HTML version you use);
- Use anchors and relative URLs instead of absolute URLs;
- Shorten class and id names;
- Make use of URL shortening services;
- Strip EXIF metadata from images that you didn't create yourself.
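
To illustrate the first group of points, here is a minimal HTML sketch, assuming placeholder file names such as /css/main.css and /js/app.js: the main content comes first in the source, styles and scripts live in external files instead of inline attributes or blocks, and the scripts are referenced just before the closing body tag.

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Example page</title>
  <!-- One external stylesheet instead of inline style attributes or <style> blocks -->
  <link rel="stylesheet" href="/css/main.css">
</head>
<body>
  <!-- Source-ordered content: the main content comes first in the markup -->
  <main class="c">
    <h1>Page headline with the main keywords</h1>
    <p>The actual content the search bot should reach immediately...</p>
  </main>
  <!-- Secondary blocks (navigation, footer) follow the content in the source -->
  <nav class="n">...</nav>
  <footer class="f">...</footer>
  <!-- Scripts go just before the closing body tag, not into the head -->
  <script src="/js/app.js"></script>
</body>
</html>
```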
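
A second sketch, with made-up domains and paths, covers the URL and markup shortening points: protocol-relative URLs, relative URLs instead of absolute ones, shortened class names, and optional closing tags omitted where the HTML version (here HTML5) allows it.

```html
<!-- Protocol-relative URL: the browser reuses the current page's scheme -->
<script src="//cdn.example.com/lib.js"></script>

<!-- Relative URL instead of the absolute http://www.example.com/about.html -->
<a href="/about.html">About</a>

<!-- Shortened class name: "p-t" instead of e.g. "product-teaser-container" -->
<div class="p-t">...</div>

<!-- In HTML5 the closing </li> and </p> tags may be omitted -->
<ul>
  <li>First item
  <li>Second item
</ul>
<p>First paragraph
<p>Second paragraph
```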