Most articles about technical SEO cover best practices: things you can do to make sure your technical SEO performs well. In practice, though, it is hard to actually work on technical SEO after reading those best-practice guides. It is therefore more useful to talk about different ways to troubleshoot technical SEO issues.
info: search operator
There are a number of issues you can look into with the help of [info:https://www.domain.com/page]. For instance, you can check whether a page is indexed and how it is indexed. Sometimes Google folds two duplicate pages into a single index entry; with this operator you can see the canonical version of the page, which is the version Google chooses to index.
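As a small sketch in Python, you can build the query URL for this operator yourself before opening it in a browser. The helper name below is made up; the endpoint is Google's standard search URL:

```python
from urllib.parse import quote_plus

def info_query_url(page_url):
    """Build a Google search URL that runs the info: operator
    against the given page (hypothetical helper)."""
    return "https://www.google.com/search?q=" + quote_plus("info:" + page_url)

print(info_query_url("https://www.domain.com/page"))
```

Opening the printed URL in a browser runs the same query as typing the operator into the search box.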
Adding ‘&filter=0’ at the end of Google search URL
While looking at a Google results page, adding ‘&filter=0’ to the end of the results page URL lifts any filters applied to the search results. With the filters removed, Google also shows pages that are currently in its consideration set, so you may see two versions of a page in the results. This helps you identify the issues that make one version disappear when the filters are applied.
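A minimal sketch of appending the parameter, assuming the results URL may or may not already have a query string (the helper name is made up):

```python
def unfiltered(results_url):
    """Append filter=0 to a Google results URL so filtered
    (e.g. near-duplicate) results are also shown."""
    sep = "&" if "?" in results_url else "?"
    return results_url + sep + "filter=0"

print(unfiltered("https://www.google.com/search?q=example"))
```

In practice you would simply edit the URL in the browser's address bar; the function just makes the rule explicit.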
site: search operator
This one is pretty popular. The site: operator is used to find keyword-relevant pages within a domain, so you can spot internal link opportunities by looking at the search results.
An interesting benefit of this operator is that it makes it easy to figure out whether your website is eligible for featured snippets. Using it, you can examine the top-ranking pages to see what they include that makes them eligible for featured snippets.
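A quick sketch of building such a site:-restricted keyword query in Python (the helper name is made up; the endpoint is Google's standard search URL):

```python
from urllib.parse import quote_plus

def site_query_url(domain, keyword):
    """Build a Google search URL restricted to one domain,
    e.g. to find keyword-relevant pages for internal linking."""
    return "https://www.google.com/search?q=" + quote_plus("site:%s %s" % (domain, keyword))

print(site_query_url("domain.com", "technical seo"))
```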
Static vs. dynamic
The better approach is to use the browser’s ‘Inspect’ tool, which shows the rendered DOM, instead of view-source, which shows only the raw static HTML. You can then use ‘Fetch and Render’ in Google Search Console to get a better idea of how Google sees your page.
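As a rough illustration of the difference, consider a page whose heading is injected by client-side JavaScript (both HTML strings below are made-up examples). A plain substring check against the static source misses content that exists only in the rendered DOM:

```python
def appears_in_static_html(html, text):
    """Check whether `text` is present in the raw HTML that
    view-source shows, i.e. before any JavaScript runs."""
    return text in html

# Made-up example: the raw HTML an HTTP fetch (or view-source) returns...
static_html = "<html><body><div id='app'></div></body></html>"
# ...and the DOM after client-side JavaScript has run (what Inspect shows).
rendered_dom = "<html><body><div id='app'><h1>Client-rendered title</h1></div></body></html>"

# The title is visible only in the rendered DOM, not in the static source.
print(appears_in_static_html(static_html, "Client-rendered title"))   # False
print(appears_in_static_html(rendered_dom, "Client-rendered title"))  # True
```

Producing the rendered DOM programmatically requires a JavaScript-capable renderer such as a headless browser, which is why Fetch and Render is useful: it shows the rendered view as Google produces it.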
robots.txt
One of the most important things is to look into the robots.txt file. There are certain areas of your website that you want unblocked and some that you want blocked. By checking the robots.txt file, you can tell whether a page you want to be public is blocked.
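To check programmatically whether a robots.txt rule blocks a given URL, Python's standard library ships a robots.txt parser. The rules and URLs below are a made-up example:

```python
from urllib.robotparser import RobotFileParser

# Made-up robots.txt: everything is allowed except /private/.
rules = """
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# can_fetch(user_agent, url) answers: may this crawler fetch this URL?
print(parser.can_fetch("*", "https://www.domain.com/private/page"))  # blocked
print(parser.can_fetch("*", "https://www.domain.com/public-page"))   # allowed
```

For a live site you would call `parser.set_url("https://www.domain.com/robots.txt")` and `parser.read()` instead of parsing an inline string.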