Search-Engine Optimization (SEO) is a field surrounded by many myths and misconceptions. There are four main reasons for this:
First, only the engineers working on the search engines know exactly how their systems operate at any given moment. As an outsider, you have to figure out how they work through your own research, experiments and even failures. Anyone who doesn’t do this – or who is simply new to this business – has to trust fundamentally unreliable information that may have been circulating around the web for years.
Second, SEO knowledge changes rapidly because search engines develop their technology very quickly. Moreover, what boosted your site’s ranking yesterday may have no effect today, and may well harm your search engine rankings tomorrow.
Third, the technologies used to search and index the vast amount of data found on the web are highly complex and heavily reliant on mathematics. Those unfamiliar with computer science, its key concepts and its specific terminology can easily misunderstand important aspects.
Fourth, search engine rankings are of such economic importance that many parties in this business deliberately spread misinformation. This includes the search engines themselves and some SEOs who try to protect their “secrets”, want to harm competitors or simply don’t know any better.
Having worked in the field of Search-Engine Optimization for 13 years now, I have grown tired of the steady stream of useless junk and misinformation coming from various forums, blogs and social networks. I therefore intend to provide some help to SEO newbies and webmasters trying to figure out what they should and shouldn’t do to rank their sites better on Google. In this series of articles, I will try to debunk some of the SEO myths most commonly encountered on the web.
The following short “episodes” are based on my experiences with many different websites, diverse clients from various industries and the people involved in breaking and fixing those sites:
“Google can figure it out on its own…”
This is a general statement you often hear in discussions with people who see themselves in opposition to SEO. Often these are in-house marketers who feel pressured to compete against external consultants, copywriters who see their writing skills disregarded, or the person responsible for the later technical implementation of SEO features. The topic can be anything from URL structure to keyword cannibalization between documents to redundant link texts. The mantra in response to suggested optimizations is always the same, and it implies that nothing is wrong as it is and that the consultant is exaggerating the issue.
The answer to this is simpler than the description of the problem: when Google can choose between comparable websites for a search term, which one will it choose? The one where it has to figure everything out (and sometimes gets it wrong), or the one where everything is clear and logical to a software system like the Googlebot? The very core of SEO is helping the software (the Google crawlers) understand the structure and content of a website, and little issues usually add up to bigger ones, even if Google can still index the website. This clarity, not simply outranking the competition, is the primary goal.
The perfect Meta-Description
There are a lot of misconceptions around the infamous meta-description HTML tag. Opinions about it range from “highly important” to “complete waste of time”. Both sides are wrong (as so often in life), and the answer lies in the middle. The first fact about this meta element: it will not boost your position in search engine result pages (SERPs) in any direct way. So why are some SEOs still so obsessed with it? That brings us to the second fact: it gives you some leverage to exploit the SERP rankings you already have.
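For reference, the tag in question is a plain meta element in the document head; the description text below is only a placeholder:

```html
<head>
  <!-- The description Google may show instead of an auto-generated snippet -->
  <meta name="description"
        content="Hand-crafted widgets from our Berlin workshop - free shipping and 30-day returns.">
</head>
```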
But how does this work?
Google (and other search engines) might choose the pre-formulated description over the auto-generated snippet to present a website to its users. If you know the rules, this gives you the opportunity to influence how your website is presented and to boost your SERP click-through rate (CTR). The nice thing about a higher CTR is that it may even earn you a better position over time, when the search engine sees that your result has a higher CTR than the surrounding results.
What are the known facts?
Google officially states that “there is no limit on the length of a description” – but with an important restriction: they will truncate it to save space in the SERPs. This is actually a severe restriction, because a long description also results in an ugly auto-snippet. To discover the real available character length, tech-savvy SEOs analyze the snippets in various SERPs from time to time; a recent example can be found at rankranger.com. But there is another misconception in these studies: to save time and effort, they only look at the length of the presented snippets, not at whether a snippet is the actual description of that website. Why does this matter? Google often merges things like dates into the auto-generated snippets, and when you look into the details, you will find that only these auto-snippets vary in length.
What is an optimal meta-description?
According to my research, Google chooses a description:
- ONLY if it is shorter than 160 characters (so the maximum length is 159 characters)
- ALSO, if Google decides to present a date in front of the snippet, it only chooses the full description if it is shorter than 146 characters
- AND if enough keywords or synonyms overlap between the search request and the description
- OR if the search request is generic or brand related
The perfect meta-description should be short and sweet, formulated like an advertisement and written with the target audience in mind. More official guidance on this can be found in the Google Search Console Help.
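The length rules above can be turned into a quick sanity check. A minimal sketch, assuming the 159/146 character limits from my own research (these are observed values, not official Google numbers):

```python
# Sanity-check a meta-description against the length limits observed above.
# MAX_LENGTH and MAX_LENGTH_WITH_DATE reflect the author's own SERP research,
# not any official Google specification.

MAX_LENGTH = 159            # longest description observed to be shown as-is
MAX_LENGTH_WITH_DATE = 146  # observed limit when a date precedes the snippet

def check_description(description: str, date_shown: bool = False) -> list:
    """Return a list of warnings for a candidate meta-description."""
    limit = MAX_LENGTH_WITH_DATE if date_shown else MAX_LENGTH
    warnings = []
    if not description.strip():
        warnings.append("Empty description: Google will auto-generate a snippet.")
    elif len(description) > limit:
        warnings.append(
            f"Too long: {len(description)} characters (limit {limit}); "
            "Google will likely fall back to an auto-generated snippet."
        )
    return warnings

# Hypothetical example description, 114 characters:
description = (
    "Hand-crafted widgets from our Berlin workshop - free shipping, "
    "30-day returns and personal support. Order today!"
)
print(check_description(description))      # within the limit -> no warnings
print(check_description(description * 3))  # far too long -> one warning
```

Running the check during content production catches overly long descriptions before they are published, rather than after Google has already replaced them with auto-snippets.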