Google Survival: how to survive the next Google Core update

A guest article by Jonas Malur / SEOLYMP

Google is constantly working on optimizing its algorithm, and that makes website operators nervous, especially those who have rankings to lose. What effort is needed to be prepared for the next core update?

As an overview, the major areas you definitely need to keep an eye on are briefly explained first.

Technical SEO
This is about checking whether Google can crawl (read) your page. Some technical parameters must be observed for this.

Structure and content
Since Google indexes URLs, the URL is the basic unit that covers a particular topic. However, since a website deals with a larger overall topic, the question arises as to which aspect is covered on which page and how the content is structurally related.

Usability
How the content is presented and whether it works well on all devices is a question of usability. The user experience is the result of the quality of the content consumed and the way it is consumed.

Topic competence
In theory, anyone can say something about any topic. But will visitors actually take your word for it? How do you show your visitors that you know your topic?

Google has been online as a search engine since 1997. A lot has happened to the algorithm since then. At the beginning, all websites were rated the same way, but this has changed massively over the course of two decades. Today the search engine recognizes site types and assigns them to specific patterns. Each pattern has its own evaluation spectrum. Therefore, it is not easy to optimize ranking factors individually.

What you should consider

Well, first of all, you should think about which rough pattern your website falls into.

  • Which industry?
  • Is it a shop? An affiliate site? A guide?
  • What is the main topic of your page?

Digression: Your Page’s Main Topic

The main topic is easy to determine in many cases. In some cases, however, it is very difficult. For example, there are brands that produce a range of products that cannot be captured by a single industry term. Ultimately, the brand itself is the topic. But the brand may be unknown. In that case, organic traffic will not be a relevant channel at first. A multi-domain strategy could help drive more organic traffic to your sites in the long run than bundling all products on just one domain.

Once you have identified a pattern for your website, you can easily compare yourself with other websites. Perhaps there is not that much direct competition after all. To survive the next core update, it is crucial to know which context you are actually operating in. Only then do you know which search terms you want to be found for and which keywords you want to protect from the next Google update.

How do I protect my keywords from the next Google update?

Basically, the searcher's intent must be optimally served on the SEO landing page. This includes not only the content itself (I think that's self-evident), but also:

  • The preparation of the content (Is the slider really the optimum?)
  • The order of the contents (Are the contents logically structured?)
  • The readability of the contents (font size 8 could be too small)
  • The style of the content (Does the preparation fit the target group?)
  • The fulfillment of the snippet promise (Do you keep what you promise in the snippet?)

Structurally, you should also think about your most important pages:

  • Is the (so important) page easy to reach via the main navigation?
  • Is the page sorted correctly in the site hierarchy?
  • Is the site hierarchy even logical and appropriate to the main topic?

It is central to success that the website reflects the full range of its main topic in the form appropriate to its website type: as a shop in the form of products, as a guide in the form of information, or as an affiliate site in the form of advice and recommendations.

  • Is there keyword competition on your website?

You should never have two different landing pages for the same keyword, because how is Google supposed to decide which page should rank at the top for the search query? Google struggles with this, and experience shows that it will not get the most out of it for you. In the best case, you will rank with both pages on page 1, but probably not at position 1.

If there is a lot of keyword competition on your site, you won't survive the next Google update. Under this heading we also include duplicate content, which you have probably heard of before (the same content on several URLs).
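A quick way to spot such internal competition is Google's site: search operator (more on it in a moment). The query below is only an illustration; your-domain and the keyword are placeholders for your own values:

site:your-domain "your keyword"

If several of your own URLs appear for the same keyword, those pages are competing with each other and should be merged or clearly differentiated.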

Also pay attention to the pages indexed on Google, which you can access via the site command:

site:your-domain

The more non-optimized pages of your site are in Google’s index, the more likely you are not to survive the next core update.
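To get a rough picture of which sections fill up the index, the site: command can be combined with other operators. These queries are only a sketch; the paths are placeholders that depend on your URL structure:

site:your-domain inurl:/tag/
site:your-domain -inurl:/magazine/

The first query shows, for example, how many tag pages are indexed; the second hides one section so you can see what else remains in the index.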

Now, there are page types, such as product pages, that experience shows have little impact on your ranking even if they are not optimized. That is because Google recognizes product pages as a pattern. Informational pages and category pages that carry some weight within the page hierarchy do have influence and should be optimized. Then things will work out with the next update.

When is a page (URL) not optimal?

Again, there is a fundamental side and a specific side of the coin to this question:

The fundamental side of the coin

If the topic is “production of sustainable socks”, how much expertise is needed to write a good article? If you are an expert on the topic, you can demonstrate it in the post, but you could also back up your expertise with facts. A sustainability brand that is at the same time a manufacturer of organic cotton socks has a very high level of expertise.

If someone from production writes the article themselves, the authenticity of the expertise is a given. However, if the article is written externally, that authenticity is not a given at first. As a publisher, it is up to you to show your readers where you stand on expertise and authenticity.

After all, the user wonders how trustworthy your site is. What is your reputation? To answer this, analyze the appearance of your website! Does it contain distracting advertising? Is the layout contemporary and appropriate to the topic in terms of seriousness? What does your link profile look like? Who links to you and do you link to trustworthy sites?

Expertise, authenticity and trustworthiness result in your reputation. The worse your reputation, the more vulnerable you are to the next core update, even if you’re actually an expert on your topic.

The fundamental side of the coin is also known under the keyword EAT and was on every SEO's lips in 2018. Google's Quality Guidelines contain a whole section on it (point 3.2), which you should definitely read in the original.

The specific side of the coin

You can’t influence ad hoc whether you’re an expert on a particular topic and whether your site is trustworthy. This is a long process also on a personal level. What you can directly influence is:

  • Create the main content of the page with some work and research
  • Make a connection between the topic of the page and the purpose of the website. So why are you writing about sustainable sock production?
  • Bring search intent and landing page content together
  • Check whether you are using advertising excessively
  • Make sure you provide the necessary information about yourself and your website (imprint, About Us page, author of the post, sources, etc.)

Keep an eye on the technical component of your website

Crawling and indexing

Google crawls your site. That means you should give Googlebot access to all of your page's resources. There are at least three parameters that can prevent your website from being crawled:

  • In the robots.txt
  • At the URL level via the robots meta tag
  • At the URL level via the HTTP header

The robots.txt is located in the root directory and you can call it directly in the browser:

your-domain/robots.txt

If directories are listed here, it could be that important resources are inaccessible for rendering. You can use Google Search Console to check if necessary resources are blocked.
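As an illustration, a robots.txt that accidentally locks away resources needed for rendering could look like this (the directory names are made up):

User-agent: *
# if CSS and JavaScript live under /assets/, blocking it keeps Google from rendering the page properly
Disallow: /assets/
Disallow: /images/

As a minimal sketch, you can also test individual URLs against your robots.txt with Python's standard-library module urllib.robotparser; the URLs below are placeholders:

# Would Googlebot be allowed to fetch this resource according to robots.txt?
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://your-domain/robots.txt")
parser.read()
print(parser.can_fetch("Googlebot", "https://your-domain/assets/main.css"))  # False if blocked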

At the URL level, you will find the robots meta tag in the source code of each URL; its default values are index and follow.

However, if it says “nofollow”, Googlebot will not follow the links on the page. If this applies to all URLs, Google simply won’t be able to crawl your page.
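For illustration, the tag sits in the <head> of the page's HTML. The two variants below show the usual defaults and the restrictive counterpart:

<meta name="robots" content="index, follow">
<meta name="robots" content="noindex, nofollow">

The first variant allows indexing and link following; the second keeps the page out of the index and tells Googlebot not to follow its links.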

The same applies to the HTTP header: if it says “nofollow” there, the links will not be followed either. To check, use an HTTP header tool such as the one on webconfs.com or your browser's inspector (I use Chrome, which works fine).
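If you prefer to check from a script, the following minimal Python sketch prints the relevant response header. It assumes the third-party requests library and that a robots directive sent via HTTP would appear in the X-Robots-Tag header, which is how such directives are usually transported; the URL is a placeholder:

import requests  # third-party: pip install requests

url = "https://your-domain/"  # replace with a URL you want to check
response = requests.get(url, timeout=10)

# Robots directives sent via HTTP appear in the X-Robots-Tag response header
print("X-Robots-Tag:", response.headers.get("X-Robots-Tag", "not set"))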

The index or noindex value tells Google whether the URL should be indexed or not, and it can be set at the URL level. Of course, Google can only index your URL if the crawler actually reaches it. So if your entire domain is set to nofollow, it is clear that your website cannot be indexed. What's worse, it cannot even be crawled.

Usability and accessibility

Your website should also be usable and accessible on every device. It should be optimized for mobile devices and have a good loading time. Nothing is more annoying than a page that takes far too long to load. You don't have to exploit every last bit of potential, but good caching, compression, and a lean website (in terms of the resources that have to load, such as images, CSS, and JavaScript) are essential so that visitors on their phones are not left waiting.
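As a rough spot check of compression and response time, a minimal Python sketch might look like this. It uses the third-party requests library, the URL is a placeholder, and it only measures the HTML response itself, not the full page load with images and scripts:

import time
import requests  # third-party: pip install requests

url = "https://your-domain/"  # replace with one of your important pages

start = time.time()
response = requests.get(url, headers={"Accept-Encoding": "gzip, br"}, timeout=10)
elapsed = time.time() - start

# Was the HTML delivered compressed, and how quickly did the server answer?
print("Status:", response.status_code)
print("Content-Encoding:", response.headers.get("Content-Encoding", "none"))
print("Response time: %.2f s" % elapsed)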

However, good usability also includes a few standards such as accessibility and usable navigation. The imprint and the terms and conditions must not be tucked away in a collapsed footer on mobile devices. A privacy policy should also be in place. A usually free SSL certificate has likewise become the web standard, even for pure information pages.

It is these standards that Google automatically checks.

The accessibility of a website and URL also includes the status code that your server issues when queried. This should usually be “200”, which means that the request was successfully processed and a positive response was sent from the server. An overview of all status codes can be found in Wikipedia.
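To spot-check the status codes of your most important URLs, a minimal Python sketch could look like this; it relies on the third-party requests library, and the URLs are placeholders:

import requests  # third-party: pip install requests

urls = [
    "https://your-domain/",
    "https://your-domain/important-category/",
]

for url in urls:
    # allow_redirects=False reveals redirects (301/302) instead of silently following them
    response = requests.get(url, allow_redirects=False, timeout=10)
    print(response.status_code, url)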

Basically, optimizing the technical condition of a website is not always worthwhile. If a page can be crawled well and the loading times are acceptable compared to the competition (in the upper third), then technical optimization is too expensive in view of other ranking criteria such as search intent, structure and reputation. The technical condition must not prevent Google from crawling your site effectively, and the loading times must not lead to abandonments. Beyond that, fine-tuning is not the top priority.

Conclusion: No checklist!

If you were expecting a checklist, you are probably disappointed now. Before a Google core update, you can only arm yourself as well as possible, not shield yourself. Even a well-optimized affiliate site can be re-evaluated in an update and lose many important rankings. However, if you take the hints in this article to heart, you will, across the sum of all future core updates, emerge as a winner more often than as a loser.

Focus on your topic and the context in which you position your page. When setting up and optimizing individual pages, always think about your target group and their expectations. If in doubt, research the search intent and build a page that really delivers! Work on your reputation and showcase your expertise. Google will love your site because your visitors love it too.