
Technical SEO: What It Is And How To Get To The Top Of Google [2023]

Technical SEO is a set of internal optimizations used to make websites stand out in search results. In a previous article we covered On-Page and Off-Page SEO, but equally important is the work performed on the internal structure of a website, known as Technical SEO.


The goal of technical SEO is to improve the ranking of a site's pages and to build digital authority and organic traffic, that is, to lead users to the pages of your site without having to place paid digital ads.

Nowadays, thousands of pages are published on the Internet every day, and this is precisely what makes the competition for the first positions in a search so fierce. This is where technical SEO work comes in.


• What is Technical SEO?

• XML Sitemap

• Robots Directives

• Canonicalization

• Schema

• Web Vitals

• HTTPS

• Hreflang

• Mobile

• Conclusion


What Is Technical SEO?

Technical SEO is a foundational part of SEO: the set of optimizations related to the internal structure of a site. The intention is for pages to become faster, more understandable, crawlable and indexable, so as to increase the likelihood of a better Google ranking. Technical SEO is the beginning of the entire search engine optimization strategy, so it should be one of the first steps in any SEO effort. While its focus is demonstrating to search engines how your site performs, it is also intended to deliver the best user experience.


The best way to understand technical SEO parameters is to understand how Googlebot crawling works:

  1. The crawler builds a list of all the URLs it finds, both in links within the pages of a given domain and in the pages listed in sitemaps;

  2. After this reconnaissance, Google prioritizes crawling new URLs that have not been crawled before, as well as URLs that need to be recrawled because something on them changed;

  3. A fetching system then captures the full content of those pages;

  4. Next, the processing systems deal with canonicalization (choosing one URL among duplicates);

  5. The renderer loads each page the way a browser would, executing JavaScript and applying CSS files, so that Google sees what most users will see;

  6. Finally comes indexing: the pages that Google wants to show users are stored in its index.
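The link-collection idea in step 1 can be sketched in a few lines of Python. This is an illustrative toy, not how Googlebot actually works: it only extracts the links a crawler would queue from a page's HTML (the page content and URLs are made up):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collects the URLs a crawler would add to its crawl list."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL
                    self.links.append(urljoin(self.base_url, value))

html = '<a href="/about">About</a> <a href="https://other.com/page">External</a>'
collector = LinkCollector("https://example.com/")
collector.feed(html)
print(collector.links)
# ['https://example.com/about', 'https://other.com/page']
```

A real crawler would then fetch each collected URL, repeat the extraction, and deduplicate against the URLs it has already seen.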


How To Do The Right Indexing

Put XML Sitemap On The Site


A sitemap is an XML file, usually served at the root of the site (e.g. /sitemap.xml), with this structure:


<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://oseuwebsite.com/</loc>
    <lastmod>2019-08-21T16:12:20+03:00</lastmod>
  </url>
  <url>
    <loc>https://oseuwebsite.com/exemplo/</loc> <!-- illustrative URL: every <url> entry requires a <loc> -->
    <lastmod>2019-07-31T07:56:12+03:00</lastmod>
  </url>
</urlset>


Sitemaps can also be generated automatically through plugins: in WordPress you can use "Yoast SEO" or "Rank Math" and simply follow the steps. Another way to create a sitemap without a CMS is "Screaming Frog" (free for up to 500 URLs), which is also very useful for checking other technical SEO parameters.


Once you have created the sitemap, it has to be submitted to Google. In Google Search Console there is a tab called "Sitemaps", where you add and submit the URL of your sitemap; Google will process it and report "Success". You can submit several sitemaps for the same website; in fact, if a site is very large (for example an e-commerce site), it is good practice to have separate sitemaps for categories, posts, pages, etc., rather than a single general sitemap index.
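If you would rather generate the sitemap yourself than rely on a plugin, the XML can be built with a short script. A minimal sketch using only the Python standard library (the URL list is illustrative):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """Build sitemap XML from (loc, lastmod) pairs."""
    ET.register_namespace("", SITEMAP_NS)  # serialize without a namespace prefix
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = loc
        ET.SubElement(url, f"{{{SITEMAP_NS}}}lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap([
    ("https://oseuwebsite.com/", "2019-08-21"),
    ("https://oseuwebsite.com/blog/", "2019-07-31"),
])
print(xml_out)  # save as sitemap.xml (with an XML declaration) at the site root
```

The output is the same `<urlset>`/`<url>`/`<loc>` structure shown above, ready to be submitted in Google Search Console.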


Robots Directives

Robots.txt is a text file stored at the root of the website that instructs search robots how to crawl it. In addition, in the <head> of each page you can place a robots meta tag, a piece of HTML that tells search engines how to crawl or index that particular page:


<meta name="robots" content="noindex" />


In robots.txt itself you can set crawl directives for the website in general. The "Allow" directive tells search robots what they may access. The lines below allow JavaScript and CSS files to be crawled and parsed:


Allow: /*.js

Allow: /*.css


You can also set the "User-agent" line, which determines which search robot the directives address, for example "Googlebot":


User-agent: Googlebot


Additionally, there is the "Disallow" directive, which is used to prevent search robots from crawling certain paths, in this case the page /beta.php and the /arquivos/ directory:


Disallow: /beta.php

Disallow: /arquivos/


Finally, there is the "Sitemap" directive, which points search robots to the website's sitemap and is very useful to help them discover all the pages in the domain. Submitting the sitemap directly in Google Search Console tends to be the more effective route nowadays, but the directive remains valid.
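Putting these directives together, a complete robots.txt might look like the following (paths and domain are illustrative, taken from the examples above):

```text
User-agent: Googlebot
Allow: /*.js
Allow: /*.css
Disallow: /beta.php
Disallow: /arquivos/

Sitemap: https://oseuwebsite.com/sitemap.xml
```

Directives under a `User-agent` line apply only to that robot; a `User-agent: *` block would apply to all crawlers.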



Canonization

Usually a domain can be accessed through multiple combinations of URLs, all leading to the same content, for example:

http://dominio.com
https://dominio.com
https://www.dominio.com
https://www.dominio.com/index.html



However, when there are multiple versions of the same page, Google will select only one to store in its index. This process is called canonicalization, and the URL selected as canonical is the one Google will show in search results. Many different signals are used to select the canonical URL, including:

– Canonical tags;

– Duplicate pages;

– Internal links;

– Redirects;

– URLs from Sitemaps.


The easiest way to see how Google has indexed a page is to use the URL inspection tool in “Google Search Console”. This will show the canonical URL selected by Google.
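The most direct of these signals, the canonical tag, is a single line placed in the <head> of each duplicate page, pointing at the preferred URL (the URL here is illustrative):

```html
<link rel="canonical" href="https://oseuwebsite.com/pagina-preferida/" />
```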



Schema

Schema markup is code placed on website pages that helps search engines better understand their content. Schema.org defines a vocabulary of structured data covering entities, actions, and relationships on the Internet. This vocabulary enables search engines to understand the meaning behind subjects on the web and, in turn, provide a better user experience.


Google maintains a search gallery showing the various rich result types and the Schema markup your website needs in order to qualify; Schema.org's getting-started guide is at https://schema.org/docs/gs.html.
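In practice, Schema markup is most often added as a JSON-LD block inside the page's <head>. A minimal Article example (all values are illustrative):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO: What It Is",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2023-01-15"
}
</script>
```

Google's Rich Results Test (listed in the tools section below) can verify that a block like this is read correctly.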



Web Vitals

Web Vitals (Core Web Vitals) are speed and user-experience metrics measured on a site's pages:

– Loading, with Largest Contentful Paint (LCP): this should occur within 2.5 seconds of the page starting to load;

– Visual stability, with Cumulative Layout Shift (CLS): pages should maintain a CLS of 0.1 or less;

– Interactivity, with First Input Delay (FID): pages should have a FID of 100 milliseconds or less.
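The "good" thresholds above can be checked programmatically. Google's published guidance also defines a "poor" threshold for each metric (LCP 4.0 s, FID 300 ms, CLS 0.25); values in between count as "needs improvement". A small sketch:

```python
# "Good" thresholds as quoted above; "poor" thresholds from Google's
# published Core Web Vitals guidance.
GOOD = {"LCP": 2.5, "FID": 100, "CLS": 0.1}   # LCP in seconds, FID in ms, CLS unitless
POOR = {"LCP": 4.0, "FID": 300, "CLS": 0.25}

def rate(metric, value):
    """Classify a measured value into Google's three CWV buckets."""
    if value <= GOOD[metric]:
        return "good"
    if value <= POOR[metric]:
        return "needs improvement"
    return "poor"

print(rate("LCP", 2.1))   # good
print(rate("FID", 180))   # needs improvement
print(rate("CLS", 0.30))  # poor
```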



Think With Google found that as page load time grows from 1 to 5 seconds, the likelihood of the user abandoning the visit increases by 90%. Core Web Vitals can be measured, monitored and diagnosed via PageSpeed Insights.


HTTPS

HTTPS stands for "hypertext transfer protocol secure" and is the encrypted version of HTTP. HTTPS protects the communication between your browser and the server from being intercepted or tampered with by attackers. The encryption itself is provided by TLS ("transport layer security", the successor of SSL), and the server proves its identity with a certificate issued by a third-party certificate authority, still commonly known as an SSL certificate. Any website that shows a padlock icon in the address bar uses HTTPS. With HTTPS, credit card data, passwords and other private user data travel with a strong layer of security.


When switching from HTTP to HTTPS you have to inform Google about the transition: set up Google Search Console and Google Analytics for the new protocol, update internal links, and update all absolute URLs. First, add a new HTTPS property in Google Search Console, but do not delete the old one until you are sure everything has migrated properly. For the redirects themselves you have three options: implement 301 redirects for the URLs manually, redirect at the server level, or use a plugin that updates the URLs instantly; in any case, verify afterwards that the transition went well. There are many ways to do this redirect, and each SEO has to decide which approach fits their situation. Finally, confirm in Google Analytics that the new domain is tracked and secure, so it receives the correct data from the new URLs. Choose a suitable SSL certificate, install it, and the process is done.
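As an example of the server-level option, a site served by nginx can 301-redirect all HTTP traffic to HTTPS with a small server block (the domain is illustrative; Apache achieves the same with mod_rewrite rules in .htaccess):

```nginx
server {
    listen 80;
    server_name oseuwebsite.com www.oseuwebsite.com;
    # Permanent redirect: preserves host and path, and signals the move to Google
    return 301 https://$host$request_uri;
}
```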


Hreflang

As globalization advances, websites increasingly need multiple languages. People from all over the world may be interested in your products or services, so each language must have its own content. The main care SEO technicians must take in a multilingual project is the "from – to" relationship: for example, if I have a page "Global SEO PT" and the user clicks the English flag, it should forward directly to the "Global SEO EN" page, without the user having to return to the homepage first. This is made possible by hreflang.


Hreflang is an HTML attribute used to specify the language and geographic targeting of a web page. The hreflang tag tells search engines, such as Google, about these variations. If you are on WordPress you can also use plugins that handle this; "WPML" is a good solution, as it lets you establish the "from – to" connections while still allowing manual editing of the content by the technician/administrator.


If you want to add the hreflang annotations manually, place them inside the page's <head> tags with this structure:

<link rel="alternate" href="http://dominio.com/" hreflang="pt" />

<link rel="alternate" href="http://dominio.com/fr/" hreflang="fr" />

<link rel="alternate" href="http://dominio.com/en/" hreflang="en" />


Browser Compatibility

When creating a website you need to consider the variety of browsers in use today (Chrome, Edge, Firefox, Safari, etc.). While most users are on modern browsers, some still rely on older ones. In addition, each browser renders websites somewhat differently, which can hinder the viewing of some pages. Therefore, SEO technicians should consider the limitations of each browser, know which browsers their target audience uses, and perform an SEO audit to verify the domain's compatibility in each one.


Optimization For Mobile

Nowadays users more commonly access websites via mobile phone than via desktop computer, so Google's ranking criteria have changed to prioritize sites with good mobile navigation. However, websites are often still developed first for desktop and only then for mobile. One way to speed up mobile development and help Google's algorithm automatically recognize the responsiveness of the site's pages is to put the "viewport" tag in the <head> of the site's HTML. This tag tells the browser how to adjust the dimensions and scale of the page to the width of the device.
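The viewport tag mentioned above is a single line in the page's <head>:

```html
<meta name="viewport" content="width=device-width, initial-scale=1" />
```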


WordPress and Wix sites already come prepared to deliver responsive pages automatically. In WordPress, "Elementor" (a page-builder framework) is well suited to creating user-friendly pages. Technical SEO work should adjust the HTML, CSS and JavaScript of your pages so that mobile layouts are the right size and understandable.


To find out if your site is mobile-friendly, go to the Google Search Console "Mobile usability" report and see whether any pages have problems. If they do, Google Search Console will explain how to resolve them.


6 Technical SEO Tools You Can Use

– Rich Results Test: https://search.google.com/test/rich-results;

– Google Search Console;

– PageSpeed Insights;

– Screaming Frog;

– Yoast SEO / Rank Math (WordPress sitemap plugins);

– WPML (hreflang management).
 

Conclusion

There is much more to explore in the world of SEO. All the elements described here combine to get Google to recognize your site, but there are many more technical SEO methods and strategies to learn about.


Today, many companies need immediate results, but the truth is that they cannot afford to implement SEO internally while keeping their focus on the priorities of their business. If you can't handle these steps yourself or don't have the time to put them in place, Bringlink SEO ensures you get the brand visibility and growth you deserve.


Talk to us: send an email to bringlinkseo@gmail.com.


 

