A Definitive Guide from Basic to Advanced


There was a time when on-page optimization meant keyword stuffing. With Google's Panda update, websites that relied on this tactic were thrown off the first page!

In SEO, strategies that work today may not work after a few years. This is why I always recommend that budding SEO marketers keep tabs on what's happening in the industry.

In the post-Panda era, Search Engine Results Page (SERP) listings are dominated by pages that provide value to users. If you're looking to rank a page on Google, your priority must be to add value beyond the content already ranking in the #1 position.

As you already know, the two critical factors that Google considers while ranking websites are on-page and off-page factors. In this blog, you’ll get a glimpse into the basic to advanced on-page optimization strategies that you can use for your page. 

What you're about to read is not another guide that boasts about never-tried-before techniques. I've used the same on-page optimization strategies that you're about to learn to rank my blog in the #0 position (the featured snippet) for the most competitive SEO keyword, "Google algorithm update."

Google Algorithm Update SERP page

So to begin with, let's delve into the basics of on-page SEO. If you're looking for advanced strategies, use the table of contents to jump ahead and start reading.

What is On-page Optimization in SEO?

On-page SEO is a website optimization technique used to rank individual pages within a site on search engines. Proper on-page optimization of a page ensures that it ranks higher for relevant keywords and drives targeted traffic.

On-page SEO techniques include optimization of web content, URL, metadata, images, and page speed. As the name itself suggests, on-page SEO is about optimizing the internal elements within your website to achieve search engine friendliness. 

Many times, websites that fail to do on-page optimization don’t show up on the first page of search results. The reason for this is that the Google Algorithm considers a handful of on-page factors while ranking pages on the first page. 

One of the most critical factors that determine the position of a page on SERP is relevance. If the content on your page is not relevant to the query entered, you won’t rank on Google 99% of the time. 

During each stage of on-page optimization, you have to ensure that the page stays relevant to the target audience. If your page is not relevant to the users, it will end up with a high bounce rate and eventually, your page will lose its position on the Google results.

Why is On-page SEO Optimization Important?

The success of search engines relies on the relevance of the results provided to users. Since Google provides the best results, thanks to its advanced algorithms, it currently dominates the search landscape.

On-page optimization becomes all the more important for this reason. The whole purpose of doing on-page SEO is to ensure that both users and search engines understand what is offered on a page.

This also creates an opportunity: proper on-page optimization paves the way for you to reach your target audience and get your message across effectively.

All the factors that you're about to learn in this post are critical, as they send important signals to the search engine about your website. Skipping any on-page SEO element may end up with a competitor taking your position on the SERP, which is exactly what you don't want.

Basic On-Page SEO Checklist for Beginners

On-page SEO Checklist image representation

On-page SEO, as you may already be aware, is one of the biggest ranking factors, and it has the ability to make or break your website. Here are a few basic on-page SEO factors that can impact a page's search engine rankings:

1. On-page Content

Image of Content on a web page

You may have come across the SEO proverb, Content is King. As I told you in the beginning, the SEO industry is so volatile that things change over time. Content is still a deciding factor, but defining the quality of the content depends on varying factors. 

You may have come across scenarios wherein your highest-quality content fails to make it to the first page of Google's search results. The major reason for this is that the content was written without much audience research.

If you create long-form content on a topic that is not of interest to the target audience, chances are you won't get traffic despite your best optimization efforts. What you as an SEO must do is identify your target audience's most searched queries and try to answer them through your content.

Proper keyword research will help you find the most searched query strings, and you can use them within the content to rank high on search engines. Check out our in-depth article on how to do keyword research to learn the various ways of finding keywords that get you to the No.1 position on Google Search.

In addition to this, the best strategy would be to identify the kind of content that is working for your competitors and emulate it on your website. However, emulating doesn't mean copying or scraping the content.

Google hates websites that do this, and things can go awry if you indulge in such practices. The best way is to identify what's working for your competitors and present it from your own perspective, adding more input, data, and research.

2. URL Optimization

Image of a browser tab with URL to represent URL Optimization

URLs are something SEOs tend to give the least importance to. However, considering that the URL is a basic building block of a website, you cannot ignore it. In addition, all the important discussions about internal links, site architecture, and link juice, which we'll cover later in this blog, have URLs at their core.

You may have come across URL structures like this:
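www.example.com/index.php?id=283&cat=7&sessionid=8f2a9c41d6 (a hypothetical example for illustration)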

By the sheer look of it, you will understand that it’s a real mess. The problem with these kinds of URLs is that neither the users nor the search engines can understand what is within the link.

A URL is supposed to give users and search engines a gist of what the page is about. This is one reason why I have always recommended making URLs as short as possible while including the target keyword.

An ideal search engine and user-friendly URL would be: 

www.example.com/blog/how-to-do-keyword-research

www.example.com/store/iphone-11-pro-grey

Do a simple Google search for “iPhone 11 Pro.”

You can see that the top 10 results showcase clean URLs. If you're using WordPress, Joomla, or any other CMS for that matter, it will automatically create SEO-friendly URLs from the page title.

However, if the title is lengthy, the URL tends to be longer too. The best practice would be to use the target keyword at the beginning of the URL and remove the rest. If the target keyword is already used for another page, use a secondary or LSI keyword to generate the URL.

3. Optimize Meta Title and Description

Optimize Meta Title and Description

Optimizing the meta title and description of a page is critical for improving the search engine rankings and the click-through rate of the website.

Meta Description

Google has confirmed that the meta description is not a ranking factor, but ignoring it can cost you valuable click-throughs. The best way to optimize the meta description of a page is to give users reasons why they should visit it.

The text that goes into the meta description is most likely what Google will display on SERP (unless it decides not to and picks up some random description from the content). Since you’re competing with at least 10 other competitors, it’s important to make the description as click-worthy as possible. 

Making the description click-worthy doesn't imply that you have to stuff it with keywords. Use the keywords and LSI terms as naturally as possible if you think they add value. However, this alone may not improve the SERP position on Google.

If you want to learn further on how to optimize the meta description so that it doesn’t truncate, read our blog on Meta Description best practices. 

Meta Title

Even after all these years, the meta title still remains an important ranking factor. Optimizing the meta tags both for the users and the search engine is important. The best meta title optimization trick is to ensure that your target keyword is placed at the beginning of the title. 

Example: Here is how I optimized the meta title of my top-ranking post, Google Algorithm Update.

Google Algorithm Update SERP Page

My target keyword is "Google Algorithm Update," and I have strategically placed it at the very beginning of the title, followed by a semicolon.

As and when a new update happens, I update the title, description, and the content. However, the first part of the meta title remains the same. This strategy really works for long-form content and also for content that you plan to keep evergreen. 

One of the most common mistakes that SEOs make while optimizing the meta title is to place the target keyword towards the end of the title. This strategy can backfire as Google might truncate the keyword when displaying it on SERP. It reduces the possibility of the page appearing and being clicked. 

If you want to learn more about optimizing the meta title without getting it truncated, read our in-depth blog on meta title optimization. 
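As a minimal sketch, here is how the title and meta description sit in a page's HTML head (the wording below is a hypothetical example, not the exact tags of the post mentioned above):

<head>
  <!-- Target keyword placed at the beginning of the title -->
  <title>Google Algorithm Update: A Hypothetical Example Title</title>
  <!-- A click-worthy summary for users, kept short enough to avoid truncation -->
  <meta name="description" content="A hypothetical, click-worthy summary of what the page offers, written for users rather than stuffed with keywords.">
</head>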

4. Heading Tags

The heading tags within a page give both search engines and users a fair idea of the topic they are reading about. When it comes to crawlers, especially Google's, the H1 tag is an important ranking element.

Placing the target keyword within the H1, which usually is the page title, carries the same ranking weight as optimizing the Meta Title. In the majority of cases, Google will consider the H1 tag of a page if there is no predefined Meta title.  

There is a lot of misconception about using multiple H1 tags. However, Google's John Mueller has categorically stated that multiple H1s won't affect the search engine rankings of a page. He also added that Google's algorithms are fine with multiple H1s if users are happy with the way the content has been structured.

That said, using the different variations of the heading tag will definitely provide Google with ample information about the main topics and the following subtopics. This can add more value since featured snippets are picked up based on the sub-topics listed under each article. 
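For illustration, a hypothetical heading structure for an article like this one might look as follows, with the main topic in the H1 and subtopics in the H2s and H3s:

<h1>On-Page SEO: A Definitive Guide</h1>
<h2>What is On-Page Optimization?</h2>
<h3>Why Relevance Matters</h3>
<h2>Basic On-Page SEO Checklist</h2>
<h3>On-Page Content</h3>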

5. Alt Text for Images

The image alt text is an important on-page SEO factor that website owners often leave unattended. Alt text is a description of an image added to a website. It comes in handy when the image fails to load on a page: the text in the alt attribute fills the image space and provides users with context.

Apart from this, alt text plays an important role in on-page optimization, as search engines value it for helping visually impaired users understand the content. However, SEOs nowadays try to sprinkle keywords into the alt text, which defeats the purpose.

The best practice here would be to provide a descriptive alt text that features the LSI keywords so that the user gets an idea of what the image conveys. 

Example of Alt Text: 

Good Practice: alt="An SEO professional doing on-page optimization for a website on her laptop."

Bad Practice: alt="on-page seo expert"
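In HTML, the alt text goes inside the image tag itself; the file name below is a hypothetical example:

<img src="on-page-optimization.jpg" alt="An SEO professional doing on-page optimization for a website on her laptop">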

Advanced Techniques

1. Internal Links

Image representing Internal Link optimization

As a website owner, you should assign a hierarchy for each section of your website. This will provide users and search engines with the option to navigate and fetch relevant information easily. 

Internal links are hyperlinks from one page to another page within the website. The link can be placed using resources such as text, images, videos, or documents. A proper internal linking structure will determine the importance of pages within a website. 

Understanding the importance of each page is vital because Google passes link juice through links. The concept applies to internal links too: only a properly interlinked website can pass the link juice earned by one page on to others.

Here are some of the advanced internal linking factors to consider: 

Crawl Depth

Crawl depth is an important internal linking factor to consider when setting up a website. It refers to how many clicks a page sits away from the homepage within the site's internal linking architecture, which determines how easily a search engine can find and index it (the homepage is depth 0, pages linked directly from it are depth 1, and so on). Generally, a crawl depth of three is the recommended maximum, as deeper pages may fail to get initial crawler attention.

Important money pages (service and product pages) must be strategically placed within the crawl depth range of 0-2 for better crawlability.

Page Hierarchy

Image of Page Hierarchy

Internal linking is one way to establish the hierarchy of pages within a website. The more internal link value you give to a page, the more important Google considers it. 

Link Relevancy

Just because the links are within your website doesn't mean any two pages can be linked to each other. Ensure that only relevant pages are interlinked, because Google dislikes websites that try to trick its algorithm. Try to provide internal links to contextually relevant pages using highly relevant anchor text.

Contextual Links

Adding too many links within a page is considered a bad SEO practice. Providing 100 internal links in a 1,000-word piece of content will make the page look spammy, and Google may never show it on the first page. Even though there is no definite limit for internal links, ensuring that they remain natural is important.

Anchor Text

Anchor text is important when hyperlinking from one page to another. It's the anchor text that gives Google's crawlers contextual signals about the relationship between the pages. It's recommended to use long-tail anchors for internal linking, as they provide more context to users and crawlers.
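For instance, a hypothetical internal link with a descriptive, long-tail anchor could look like this:

<a href="https://www.example.com/blog/how-to-do-keyword-research">how to do keyword research for long-tail queries</a>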

If you want to learn more about internal links and how to use them effectively, read our in-depth article on everything you need to know about internal linking.

2. Avoid Intrusive Interstitial Properties

Image representing Intrusive Interstitial Properties

Think of a website that opens up to a full-screen video ad, and when closed, redirects you to another page with multiple pop-ups. I would close the whole tab instead of closing the pop-up and go to some other less complicated page to fulfil my search intent. 

This is a common mistake that websites make, which is pulling down their organic reach. Giving users a seamless web experience is critical to ensure you maintain top positions on the Google search. 

Google has been cracking down on websites that indulge in placing too many interstitial properties within the page. In 2016, Google announced that any website that tries to force intrusive interstitial ads will be penalized. 

3. Keyword Density

Creative image of Keyword Density

Keyword density is considered one of the most basic on-page SEO factors. However, stuffing keywords will no longer bring you to the first position. Google's algorithms are now trained to find websites that stuff keywords and penalize them.

In these changed circumstances, keyword density has evolved, and it now has more to do with advanced techniques such as LSI and TF-IDF.

Repeating the same keywords multiple times will only hurt your SEO strategy. The future lies in making Google understand that the words used within your content are contextual and relevant to the overall topic. 

In 2020, even if you use the target keyword only three to four times in the content, your content can still rank, provided you have used the LSI and TF-IDF techniques.

LSI Technique

LSI, short for Latent Semantic Indexing, is a technique Google uses to understand how the words used within a page relate to the topic being discussed.

LSI Keywords are contextually relevant words that appear within a topic. Google uses its algorithm to find and understand the common terms that appear on different websites for the same topic to analyze the quality of the content.

With LSI keywords in place, Google is able to determine the quality of the content even when the target keyword appears only a few times. The best way to find LSI keywords is by checking the "Related searches" section on Google Search and by using certain free LSI tools.

TF-IDF Technique

TF-IDF is short for Term Frequency–Inverse Document Frequency. Google's John Mueller was the first to confirm that the search engine giant uses the TF-IDF technique to retrieve information from the web.

TF-IDF is an information retrieval method that tries to understand the relevance of the words that appear on a page in relation to the overall index of all the content on the web. Google has many other techniques for information retrieval, and TF-IDF is just one of the metrics it uses.

In addition to this, it's difficult to optimize a web page based on the TF-IDF metric alone, because it's calculated against an aggregate of all the content currently indexed by Google. However, you can use certain tools, such as SEMRush Writing Assistant or TFIDF Tool, to check whether your content qualifies under the basic TF-IDF metric.
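As a rough sketch of the idea behind the metric: for a term t on a page d, drawn from an index of N documents, the standard formula is tf-idf(t, d) = tf(t, d) × log(N / df(t)), where tf(t, d) is how often t appears on the page and df(t) is the number of documents that contain t. A term scores high when it appears frequently on your page but rarely across the rest of the index, which is why generic filler words contribute very little.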

4. Structured Data

Image representing Structured Data formats

Google SERP Features are now turning out to be important Click Through Rate drivers. Most of these SERP features are the direct result of websites implementing Structured Data or Schema Markup. 

Structured Data is additional information that websites provide to the search engine to better understand the content. Structured Data ensures that search engines provide valuable information/clues even before a user enters a web page. 

The best example of Structured Data helping users is the reviews that you see on the search results for Movies, Events, and Products. 

Google has categorically stated that Structured Data is not a ranking factor. However, missing out on Structured Data may cause you to lose out on the click-through rates. Since the additional information is missing, your target audience may choose your competitors instead.

Going back to the history of structured data, it was an initiative started in 2011 by the giants of the search engine market – Google, Bing, and Yahoo – to make the process of understanding the intent of each page easier.

The World Wide Web is loaded with information that has not been categorized or organized. The search giants wanted to streamline web content, so they introduced a common markup vocabulary to help their algorithms pick up information easily and in an organized manner.

Enabling structured data on a website or on individual pages helps search engines crawl the pages and display them with rich information, or rich snippets.

Structured Data Formats

JSON-LD

This is Google's recommended structured data format. It uses JSON (JavaScript Object Notation) embedded in a script tag within the page to help search engines understand the page type.

Example: Local Address JSON-LD

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Midtown Manhattan",
    "addressRegion": "NY",
    "streetAddress": "721 Fifth Avenue"
  },
  "description": "Trump Tower is a 58-floor, 664-foot-tall (202 m) mixed-use skyscraper at 721–725 Fifth Avenue, between 56th and 57th Streets.",
  "name": "Trump Tower",
  "telephone": "000-000-0000"
}
</script>

Microdata

Microdata is another format used to specify structured data. Even though this is a Google-approved structured data format, it's prone to cluttering the code, as it's added directly into the HTML. It's also a more time-consuming process, since it annotates the actual elements within the page using inline attributes, which can add weight to the markup.

Example: Local Address Microdata

<div itemscope itemtype="http://schema.org/PostalAddress">
  <span itemprop="name">Trump Tower</span><br>
  <span itemprop="streetAddress">721 Fifth Avenue</span><br>
  <span itemprop="addressLocality">Midtown Manhattan</span>,
  <span itemprop="addressRegion">NY</span>
  <span itemprop="postalCode">20500</span><br>
  <span itemprop="addressCountry">United States</span>
</div>

RDFa

RDFa is another structured data format used by websites. Even though it's Google-approved, fewer websites use this format compared to the other two. RDFa (Resource Description Framework in Attributes) adds a set of attribute-level extensions to HTML for embedding structured data.

Example: Local Address RDFa

<div vocab="http://schema.org/" typeof="LocalBusiness">
  <h1><span property="name">Trump Tower</span></h1>
  <span property="description">Trump Tower is a 58-floor, 664-foot-tall (202 m) mixed-use skyscraper at 721–725 Fifth Avenue, between 56th and 57th Streets.</span>
  <div property="address" typeof="PostalAddress">
    <span property="streetAddress">721 Fifth Avenue</span>
    <span property="addressLocality">Midtown Manhattan</span>,
    <span property="addressRegion">NY</span>
  </div>
  Phone: <span property="telephone">000-000-0000</span>
</div>

5. Sitemap

Image of a Site Map structure

Search engine crawlers are busy indexing millions of pages out there on the web. The time they spend on each website depends on many factors, such as the number of pages, site load speed, and HTTP status codes. That said, you can help them crawl the pages within your site faster by providing a sitemap. A sitemap is an XML file placed within a site to help search engines navigate through its different pages.

Sitemaps are easy to generate, and there are a handful of tools, like the free Sitemap Generator Tool, that can help you with the process. However, if you're managing a CMS-based website, the process becomes much easier, as sitemaps usually come as a built-in feature.

To ensure Google indexes the pages within the sitemap, you have to submit it in Google Search Console.

One of the key advantages of a sitemap is that it provides Google crawlers with a clue about the importance of each page. Since a sitemap is generated based on page hierarchy, the crawler gets to know which page is more important. A sitemap also provides information regarding the freshness of the content, which also helps the crawler to reindex pages.
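For reference, a minimal sitemap entry looks something like this (the URL and date below are hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/how-to-do-keyword-research</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
</urlset>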

6. Robots.txt

image of a robot

Robots.txt is as important as a sitemap. Usually, websites have a predefined robots.txt file to ensure that certain pages are not crawled by search engines. Not having a robots.txt file will not directly hurt your SEO efforts. However, it may eat into the crawl budget assigned to your site, as search engine bots may spend time crawling and indexing pages that are not relevant to your users.

To check whether you have a robots.txt file on your website, visit http://www.yoursite.com/robots.txt in your browser.
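As a minimal sketch (the disallowed path and sitemap URL are hypothetical), a robots.txt file can look as simple as this:

User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml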

7. PageSpeed Optimization

Graphic image representing importance of Page Speed

Ever since Google announced that the load speed of a website is one of the determinants of organic ranking, there has been a lot of discussion about page speed. There are many factors that determine site speed, starting with the web hosting provider you have chosen.

Modern-day users are less patient, and with the plethora of options available to them, they prefer to browse sites that open in a blink. So, what is the most preferred page load time? We have covered this in a comprehensive blog post titled Google Recommended Page Load Time.

The best way to reduce the page load time is by reducing the size of a few on-page elements, such as JavaScript, CSS, and images. A chunk of the page load time is eaten up by these elements, and proper optimization can bring down the load time of your website.
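As one hypothetical illustration of this kind of optimization, images can be lazy-loaded and non-critical scripts deferred so they don't block rendering (the file names below are assumptions):

<img src="hero-image.jpg" alt="Product hero image" width="800" height="450" loading="lazy">
<script src="analytics.js" defer></script>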

In addition to this, adopting modern frameworks such as AngularJS and React can boost the speed of your website. Google favors sites that provide users with a seamless experience. Ensuring that your website is free of technical SEO issues will help improve its organic rankings.

8. E-A-T

An image denoting Expertise Authoritativeness and Trustworthiness

Google has been quite vocal about pulling down the rankings of websites that lack Expertise, Authoritativeness, and Trustworthiness (E-A-T). The term E-A-T has its origin in the Google Quality Rater Guidelines.

Even though the initial versions of the quality rater guidelines talked very little about E-A-T, the newer ones have a whole section dedicated to the concept. There are a lot of factors that Google collates before deciding the fate of a page on its results page, and E-A-T is now one of the critical factors, especially if you’re dealing with a YMYL Site. 

A representative image of YMYL

YMYL (Your Money or Your Life) sites are the ones that can impact the life, finances, and security of end users. Google has been careful lately about how website content can affect the livelihood of its users. Thus, it introduced E-A-T, which determines whether the information present on a site is authentic and valid.

This is especially true when it comes to Health, Banking, Finance and Wellness websites, which all come under the ambit of YMYL. 

There are a lot of factors that determine the E-A-T score of a website, including the niche (alternative medicine sites have a hard time ranking on Google), author bio, About Us page, security features, policies, etc.

I've written an in-depth guide explaining how to optimize each of the above-mentioned E-A-T factors so your website can rank higher than your competitors.

Final Thoughts 

I hope I have given you enough ammunition to take on those who propagate the idea that on-page SEO is just about adding a few more keywords to pages. This post will be updated as and when I figure out new on-page SEO practices to rank websites better on search engines. If you think I missed a critical on-page SEO factor in this guide, please feel free to let me know in the comments section.


