Technical SEO has clearly been one of the pillars that enabled many businesses to thrive in the information age.
With 3.5 billion searches made each day – a figure still trending toward consistent growth – search optimization will only become more significant, and more competitive, for many businesses in the years to come.
SEO, as a medium for traffic acquisition, hasn’t changed that much in over a decade of its existence. The primary objective of the practice is still the same – “to make it easier for people to get to the information/solution they need”.
When Google launched the Panda update back in 2011, they also published a list of questions to guide webmasters and SEOs on how to optimize their sites – guidance that still largely applies today:
Would you trust the information presented in this article?
Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
Would you be comfortable giving your credit card information to this site?
Does this article have spelling, stylistic, or factual errors?
Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
Does the article provide original content or information, original reporting, original research, or original analysis?
Does the page provide substantial value when compared to other pages in search results?
How much quality control is done on content?
Does the article describe both sides of a story?
Is the site a recognized authority on its topic?
Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
Was the article edited well, or does it appear sloppy or hastily produced?
For a health related query, would you trust information from this site?
Would you recognize this site as an authoritative source when mentioned by name?
Does this article provide a complete or comprehensive description of the topic?
Does this article contain insightful analysis or interesting information that is beyond obvious?
Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
Does this article have an excessive amount of ads that distract from or interfere with the main content?
Would you expect to see this article in a printed magazine, encyclopedia or book?
Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
Are the pages produced with great care and attention to detail vs. less attention to detail?
Would users complain when they see pages from this site?
Google has, of course, continuously refined how its search engine assesses which web pages deserve better visibility, keeping pace with how web consumption has evolved over the past few years (such as the steady growth of mobile search, which in turn gave rise to mobile-, site-speed- and web-security-related metrics).
But there are still several aspects of technical site optimization that many of us tend to forget when doing site audits. These areas will be the focal point of this blog post.
Maximize Crawl Budget
“Crawl budget is the time or number of pages Google allocates to crawl a site”. – AJ Kohn
The amount of crawl budget given to a website is often based on its PageRank. The higher a site's PageRank, the more important it appears as an entity to Google – and the more frequently the site will be crawled, indexed, and ranked.
Beyond the basic concepts of crawl optimization – structuring the site's architecture and internal linking so that search crawlers can reach any page of the website within a few clicks – it is also vital to conserve crawl budget by blocking crawl paths (links) that would otherwise waste some of the budget allocated to the site.
Normally, these crawl paths are links to pages that you're not really aiming to rank for competitive queries (ex: terms and conditions, contact us, shipping and delivery, etc…), which are more often than not accessible from most of a site's inner pages.
When these internal links are accessible to Googlebot on most of the site's deeper pages, crawlers end up fetching them over and over – wasting a lot of your overall crawl budget (and passing link equity/PageRank to them that would have been more valuable if passed to other pages instead).
To maximize your site's crawl budget:
- Add the nofollow attribute to sitewide internal links pointing to pages you don't really want to rank when they appear in the deeper levels of the site (you can still leave them followed from the site's homepage).
- Once these pages are already indexed, you can also disallow crawlers from accessing them via robots.txt.
- Improve deep crawling by creating more contextual internal links to hub pages that host links pointing to deeper key pages of the site (ex: important categories, a list of your popular blog posts, etc…).
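As a sanity check, robots.txt rules like the ones described above can be verified with Python's standard-library `robotparser`. The paths and domain here are hypothetical – a minimal sketch, not a recommendation of which pages to block:

```python
from urllib import robotparser

# Hypothetical robots.txt blocking low-priority pages from crawlers.
rules = """\
User-agent: *
Disallow: /terms-and-conditions/
Disallow: /shipping-and-delivery/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Verify which URLs Googlebot is allowed to crawl.
for path in ["/blog/seo-guide/", "/terms-and-conditions/"]:
    allowed = parser.can_fetch("Googlebot", "https://example.com" + path)
    print(path, "->", "crawlable" if allowed else "blocked")
```

Running a check like this after every robots.txt change helps ensure you haven't accidentally blocked pages you do want crawled.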
Blocking access to thin-content and duplicate pages:
This method of site optimization has been the norm since the Google Panda era (alongside pruning low-quality indexed pages from websites). But this process also helps very well in another area: crawl optimization.
Allowing search bots to crawl URL parameters that you don't even want indexed is a total waste – and blocking access to them can save a ton of your crawl budget (hence, more value to pass around to the pages you actually want to rank well in SERPs).
The best way to do this is to disallow crawlers from getting into these pages via the site's robots.txt.
There are also a number of free tools you can use to find your site's duplicate pages, as well as the URL parameters you shouldn't allow to be indexed, such as Google Webmaster Tools (check my guide), SEMrush's Site Audit feature, and Siteliner.
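To illustrate the idea, here is a minimal Python sketch (with made-up URLs) that groups a crawl export by path to surface parameterized duplicates worth disallowing or canonicalizing:

```python
from urllib.parse import urlparse

# Hypothetical crawl export: parameterized URLs often duplicate a clean page.
urls = [
    "https://example.com/shoes/",
    "https://example.com/shoes/?sort=price",
    "https://example.com/shoes/?sessionid=123",
    "https://example.com/about/",
]

# Group parameterized URLs by path; any path with such variants is a
# candidate for a robots.txt disallow or a canonical tag.
duplicates = {}
for url in urls:
    parsed = urlparse(url)
    if parsed.query:
        duplicates.setdefault(parsed.path, []).append(url)

print(duplicates)
```

In practice you'd feed this the URL list exported from one of the audit tools above rather than a hand-written sample.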
Optimize for Knowledge Graph and Google Quick Answers Box
As defined by Search Engine Land – “The Google Knowledge Graph is a system that understands facts about people, places and things and how these entities are connected”.
Knowledge Graph optimization for brands usually focuses on improving their sites' entity detection – making it easier for search engines to understand the brand through the connections it is commonly tied to (relationships that can be based on content, mentions, and links).
There are many ways to do this – implementing structured data is probably the most popular – as outlined by AJ Kohn in his post from last year:
Using more entities (aka Nouns) in your writing
Getting connected on social platforms and linking out to other relevant websites
Implementing Structured Data when appropriate to increase entity detection
Use of the sameAs property
Getting featured on Wikipedia
Creating an entry on Wikidata
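A minimal sketch of what such markup might look like – all names, URLs, and the Wikidata ID below are placeholders – generating an Organization entry with the sameAs property:

```python
import json

# Hypothetical schema.org Organization markup using sameAs to tie the
# brand entity to its profiles on other platforms (placeholder values).
markup = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Brand",
    "url": "https://example.com",
    "sameAs": [
        "https://twitter.com/examplebrand",
        "https://www.facebook.com/examplebrand",
        "https://www.wikidata.org/wiki/Q0000000",
    ],
}

# The output would be embedded in a <script type="application/ld+json"> tag.
print(json.dumps(markup, indent=2))
```

The sameAs links are what help search engines connect the dots between your site and your presence elsewhere on the web.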
Note: Getting more unlinked brand mentions (including mentions of other entities related to your brand – ex: authors, executives, branded products, etc…) and branded links to your site can also go a long way toward helping Google better understand what your brand is about (see co-occurrence and co-citation).
Another interesting angle on how improving entity detection might help brands further in getting more search visibility is from a recent discovery by Dan Shure on how Google is showing related searches on their autocomplete feature.
As for the Google Quick Answers Box, it is also part of the Knowledge Graph, designed mainly to answer “what is” and/or “how to” queries: snippets of content from the pages that best answer a given query are extracted and displayed as the first organic result in the SERP.
In a post he wrote about the Google Answers Box, BrightEdge’s Kirill Kronrod shed some light on the factors Google could be using to evaluate which content/pages to feature in the answers box.
Other notable resources:
- 101 Google Answer Boxes: A journey into the Knowledge Graph
- Optimizing for the Google Quick Answers Box
- Semantic Optimization with Structured Data
Optimize for Long-Click
User Satisfaction is the most important ranking factor in Google’s search algorithm.
Long-click, as a metric for determining how satisfied users are on the pages they’ve found on search results, is a concept that has been around for quite some time now (many experts have already discussed the topic thoroughly in the past – see AJ Kohn, Bill Slawski, and Cyrus Shepard’s takes on it).
As Steven Levy explained, Google utilizes its user behavior data to improve how it serves information in search results:
“Google could see how satisfied users were. The best sign of their happiness was the “long click” – this occurred when someone went to a search result, ideally the top one, and did not return. That meant Google has successfully fulfilled the query. But unhappy users were unhappy in their own ways, most telling were the “short clicks” where a user followed a link and immediately returned to try again.”
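The distinction Levy describes can be sketched as a simple classifier over hypothetical click-log data; the 60-second cutoff below is an illustrative assumption, not a figure Google has published:

```python
# Hypothetical click log: (query, clicked_url, seconds_before_returning).
# None means the user never returned to the results page.
clicks = [
    ("seo guide", "https://example.com/seo-guide/", None),
    ("seo guide", "https://competitor.com/guide/", 8),
    ("crawl budget", "https://example.com/crawl-budget/", 240),
]

LONG_CLICK_THRESHOLD = 60  # assumed cutoff, in seconds

def classify(seconds_before_returning):
    """Label a click long (satisfied) or short (unsatisfied)."""
    if seconds_before_returning is None:
        return "long"  # never returned: the query was fulfilled
    if seconds_before_returning >= LONG_CLICK_THRESHOLD:
        return "long"
    return "short"

for query, url, dwell in clicks:
    print(query, url, classify(dwell))
```

The goal of everything below is simply to push more of your clicks into the “long” bucket.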
Improving user satisfaction for search visitors:
- Ensure that the search queries that people use to find your site’s key page(s) accurately match the information provided by your page’s content. Study your most visited landing pages through search, then analyze those that have poor usage data and identify the areas where you can improve them.
- Optimize with the searchers’ intent in mind to provide better experience for search-driven visitors. Does your page load fast? Will it be easy for skim readers to find what they are looking for?
- Reduce disruptive elements on the page, such as pop-ups, opt-in forms, display ads, etc…
- Provide superior content with high quality images to reduce the chances of search visitors exiting back to the search results (provide better content than your competitors!).
- Link out to highly relevant sources (especially to content written by other topic experts) – and it’s also best to include internal links that point to your other inner pages that have high engagement rate.
- Have clear page-level CTAs to cultivate more actions from your visitors (ex: encouraging visitors to leave comments, share the content on social media, subscribe to your feed, or access any free assets your brand offers).
This process covers only half of “time to long-click” as a metric – you’ll also need to increase the number of visitors driven from search to improve the overall score.
- Use strong and straight-to-the-point title tags for your key pages.
- Include a sales proposition or a call-to-action on your pages’ meta descriptions to attract more clicks (when appropriate).
- Update and republish your site’s older content assets (ones that are still continuously generating traffic) to benefit from the freshness factor.
- Implement Structured Data to generate rich-snippets.
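To act on the title-tag and meta-description tips above, a quick length check helps; the character limits below are rough assumptions (Google actually truncates snippets by pixel width, not character count):

```python
# Rough character limits (assumptions; actual truncation is by pixel width).
TITLE_LIMIT = 60
DESCRIPTION_LIMIT = 155

def check_snippet(title, description):
    """Flag titles/descriptions likely to be truncated in SERPs."""
    issues = []
    if len(title) > TITLE_LIMIT:
        issues.append("title may be truncated")
    if len(description) > DESCRIPTION_LIMIT:
        issues.append("description may be truncated")
    return issues

print(check_snippet(
    "Technical SEO: 5 Overlooked Areas in Site Audits",
    "Learn how to maximize crawl budget, optimize for the Knowledge "
    "Graph, and improve long-click metrics on your key pages.",
))
```

Running your key pages' titles and descriptions through a check like this is an easy way to catch snippets that will get cut off before the call-to-action.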
Bulk up more signals on pages you want to rank
Many practitioners have grown accustomed to relying on only two major ranking factors – keyword usage and links – when it comes to improving certain pages’ search rankings.
But SEO is far more complex these days compared to how many of us approached it in the past. The good news is that there are many other ranking factors out there that we can now explore and tap into.
- Funnel more link equity to pages you want to rank better. You can easily do this by identifying which pages on your website are able to earn natural links on their own (you can use Ahrefs to see your site’s top pages/content in terms of links and social shares). Internally link these pages to key pages that need more link equity to boost their search visibility.
- Include other content formats to make your content more comprehensive such as videos and images/infographics. This can help your pages generate better usage data.
- Make your pages (that you want to rank better) load faster. More tips here.
- Link out to other credible and authority websites to increase the trust scores of your pages (also to improve page-level activity).
- Update, lengthen and/or use more synonyms (as well as highly related nouns) in the content of the pages that are having difficulties in ranking well (to optimize for freshness and latent semantic indexing). You can also merge redundant pages from your site (pages that provide similar information) to integrate both pages’ ranking value (they’ll have better chances of competing in SERPs this way).
- Restructure pages that are getting more mobile traffic – in a manner that’ll be easier for mobile users to consume (ex: including a summary of the content placed above the fold, and breaking the content into shorter paragraphs to make it scannable).
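The first tip – funneling link equity from naturally linked pages – can be sketched like this, using made-up linking-domain counts of the kind a tool like Ahrefs would report:

```python
# Hypothetical count of unique linking domains per page (e.g. from Ahrefs).
linking_domains = {
    "/blog/viral-infographic/": 140,
    "/blog/industry-survey/": 85,
    "/services/": 4,
    "/contact/": 1,
}

# Pages that earn links naturally are the best internal-link sources
# for the key pages that need a visibility boost.
sources = sorted(linking_domains, key=linking_domains.get, reverse=True)[:2]
print("Add internal links from:", sources)
```

You'd then add contextual internal links from those top pages to the target pages you want to rank better.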
Continuously work on and monitor your brand’s external linkable assets
All the other platforms brands normally use to promote themselves outside their websites can be considered linkable assets. Whether it’s a fan page on a social network or a single well-received guest blog/column, they all form channels through which the brand can connect and build relationships.
Amplifying your brand presence through these initiatives (social media marketing and content distribution) will not just help you get your name out there, but will also help produce a lot of links – which is still very important in SEO.
Consistent execution of these other marketing practices is not that difficult; what many of us tend to forget is that these initiatives can be integrated with SEO.
It’s important to monitor how well these brand assets are doing in terms of natural link acquisition. Find out who’s linking to the social accounts that are getting the most traction – because if people are linking to these external brand assets (Twitter, Facebook, Pinterest, YouTube, etc…), they’d probably be interested in linking to your website as well.
My Twitter handle has over a hundred linking domains – you should start checking yours too.
And it’s the same with your guest posts. Regularly check who’s linking to them, sharing them on social media, and the people commenting on them. These are potential relationships. And relationships fuel successful link building campaigns.
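One way to mine those relationships, sketched with placeholder domains: take the referring domains of an external asset, subtract those already linking to your site, and treat the remainder as outreach prospects:

```python
# Hypothetical referring domains pulled from a backlink tool.
linking_to_twitter = {"blogA.com", "newsB.com", "forumC.com"}
linking_to_site = {"newsB.com", "directoryD.com"}

# Domains already linking to your external assets but not to your
# website are warm outreach prospects.
prospects = sorted(linking_to_twitter - linking_to_site)
print(prospects)
```

These sites have already shown interest in your brand, so outreach to them starts from a much warmer footing than cold link building.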