On-Page SEO Checklist for New & Old Websites

A website’s performance in SERPs (Search Engine Result Pages) depends on many factors. An SEO checklist is the best way to identify errors on a website and fix them to improve the website’s ranking in SERPs.
Our SEO checklist covers high-value factors that should be implemented on every website. You can use various free tools to check each of them.

Here is an SEO checklist that will help you improve your website’s ranking in search engine result pages and, in turn, your brand awareness.

On-Page SEO Checklist

1. Website Page Load Time

A website’s homepage and other pages should load as quickly as possible, ideally within 2-3 seconds. A high load time increases the bounce rate: around 50% of visitors leave a website if it takes more than 5 seconds to load. You can use GTmetrix or Pingdom to check your website’s load time.

2. Website Title Tag

The title tag carries some of the most weight among Google’s many ranking factors. Every page of the website should have a unique title of 50-60 characters. Title tags are important because they appear as the clickable headline in search results and tell readers what information the page contains. The title tag should include the most important keywords for that particular page. You can use the Screaming Frog tool to check the title tags of all the website’s pages.
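As an illustration, a unique, keyword-focused title tag (the page name here is hypothetical) might look like this:

```html
<head>
  <!-- Unique title of roughly 50-60 characters, leading with the page's main keyword -->
  <title>On-Page SEO Checklist: 21 Factors to Improve Rankings</title>
</head>
```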

3. Meta Description Tag

The meta description provides a summary of a web page. Each page should have a unique meta description that includes the page’s most important keyword and stays within 155-160 characters. Screaming Frog is a good free tool to check whether this tag is implemented.
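A minimal sketch of the tag (the description text is only an example):

```html
<head>
  <!-- Unique summary of the page, kept within 155-160 characters -->
  <meta name="description" content="Use this on-page SEO checklist to find and fix common issues such as titles, meta tags, and page speed, and improve your rankings.">
</head>
```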

4. Heading Tags

Heading tags (H1, H2, ..., H6) are important for page ranking in SERPs. There should be only one H1 tag on a page. If a page has subcategories, it is a good idea to use multiple H2 or H3 tags on that page.
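A typical heading hierarchy for a page like this one (headings shown are illustrative) would be:

```html
<h1>On-Page SEO Checklist</h1>          <!-- only one H1 per page -->
<h2>Website Page Load Time</h2>         <!-- multiple H2s for subtopics are fine -->
<h2>Website Title Tag</h2>
  <h3>Title Length</h3>                 <!-- H3s for sub-subtopics under an H2 -->
```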

5. Keyword Density

Unique, fresh content on a page carries little weight in search engines if it does not have proper keyword density. The keyword density on a page should be around 2-3%. For example, a keyword that appears 25 times in a 1,000-word article has a density of 25 / 1,000 = 2.5%.
You can use www.wordcounter.net or www.charactercountonline.com to check the content length and keyword density of your website’s pages.


6. Broken Links

Broken links affect a website negatively. A large number of broken links hurts not only rankings but also the user experience. You can find broken links with www.brokenlinkcheck.com and fix them manually.

7. 301 Redirection

A 301 redirect is implemented when two versions of a website are available. For example, if both the www and non-www versions of a website are accessible, they cause duplication. To remove this duplication, implement a 301 redirect: choose one version (with or without www) as the preferred one and permanently redirect the other to it.
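On an Apache server, for example, such a redirect can be set up in the .htaccess file. A minimal sketch, assuming the www version is preferred (replace example.com with your own domain):

```apache
RewriteEngine On
# Permanently (301) redirect non-www requests to the www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```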

8. Copied Content

Search engines show only original content in search results. If your website has duplicate content, it is recommended to remove it and replace it with unique, keyword-rich content to improve your website’s performance in search engine result pages. Websites with copied content can be penalized by Google’s Panda update. The Copyscape tool can be used to find content copied from other websites.

9. Iframes and Flash

Iframes allow web developers to embed content from other websites or external sources. Even so, iframes and Flash should not be used on the website: Google does not reliably crawl content placed inside iframes, and Flash increases the page load time.

10. Internal Links

Internal links play an important role in connecting your website’s pages to each other. Internal links built on the wrong anchor text can harm your website, so it is recommended not to use bare keywords as anchor text and to always link to the correct page.
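For instance, a descriptive anchor reads naturally and tells the user what the linked page is about (the URL below is hypothetical):

```html
<!-- Descriptive anchor text that explains the destination -->
<a href="/blog/page-speed-tips/">our guide to reducing page load time</a>

<!-- Avoid bare exact-match keyword anchors or vague text like this -->
<a href="/blog/page-speed-tips/">page speed</a>
```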

11. Navigation Structure

The website should have an easy navigation structure. It is recommended to add breadcrumbs so that users can see the path through which they reached a particular page. Google shows breadcrumbs in search results, which indicates that Google values them as well.
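To help Google display breadcrumbs in search results, the trail can be marked up with schema.org BreadcrumbList structured data. A sketch, assuming a hypothetical site and page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog", "item": "https://www.example.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "On-Page SEO Checklist" }
  ]
}
</script>
```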

12. URL Structure

A simple URL structure that is easy for users and search engines to read and understand is an important ranking factor. URLs should not contain special characters such as ?, &, @, or $. Instead, it is recommended to use static URLs, as they are easier to understand. Google does not crawl anything after a “#” in a URL, so URLs that are identical before the “#” are treated as the same page, regardless of what follows the “#”.
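The difference is easy to see side by side (example.com and the paths are, of course, placeholders):

```text
Hard to read (dynamic):  https://www.example.com/index.php?id=123&cat=7
Easy to read (static):   https://www.example.com/blog/on-page-seo-checklist/
```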

13. Alt & Image Title Tags

Alt text is the descriptive text of an image that Google and other search engines can understand; search engines read text, not the pixels of an image. Implementing alt tags and well-written title attributes helps improve the website’s position in SERPs. You can check them manually. The alt text should not contain only the keyword; it should describe the image itself.
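An example image tag (the file name and text are illustrative):

```html
<!-- Alt text describes the image itself, not just the target keyword -->
<img src="/images/seo-checklist-infographic.png"
     alt="Infographic showing a 21-point on-page SEO checklist"
     title="On-Page SEO Checklist Infographic">
```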

14. Sitemaps

Sitemaps help search engines crawl your website easily. In a sitemap, you can suggest the change frequency and priority of your website’s URLs. Check that your website’s sitemap has been submitted to the 3 major search engines: Google, Bing, and Yahoo.
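A minimal XML sitemap entry showing the frequency and priority fields from the sitemap protocol (URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <changefreq>weekly</changefreq>  <!-- hint for how often the page changes -->
    <priority>1.0</priority>         <!-- relative importance within this site -->
  </url>
</urlset>
```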

15. Social Media Profiles Integration

Integrate all your social media profiles on the website to interact with your followers and visitors.

16. Mobile-Friendliness

Mobile-friendliness is a ranking factor these days. Make your website mobile-friendly to benefit from search engines: mobile-friendly websites get a boost for search queries made on mobile devices. You can press Ctrl+Shift+M in your browser’s developer tools to preview the mobile view, but the best way is to use Google’s Mobile-Friendly Test.

17. Robots File

Robots.txt is a text file that instructs search engine robots on how to crawl the pages of a website. It is the first file bots request when they visit your website. Make sure your robots.txt file is implemented correctly so that all the URLs that matter to users can be indexed by search engines.
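A simple robots.txt, assuming a hypothetical site that wants everything crawled except an admin area:

```text
# Allow all bots, but keep them out of the admin area
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```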

18. 404 Error Page

A 404 error page does not directly affect search engine rankings, but it is important for the user experience. A custom 404 error page can reduce the bounce rate and improve the average visit duration and other key performance indicators of the website.
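On an Apache server, pointing 404 responses at a custom page takes one directive in the .htaccess file (the file name is an assumption):

```apache
# Serve a custom, user-friendly page for requests that return 404
ErrorDocument 404 /404.html
```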

19. Duplicate Content

Duplicate content can harm your website’s performance in search engines; your website can even be penalized by Google’s Panda update. Use only high-quality, fresh content on every page of the website.

20. Footer Links

Footer links to other websites can be harmful to your website. It is recommended to remove all footer links pointing to other websites or to mark them “nofollow”.

21. Canonical Tag

The canonical tag is implemented when a website has many pages with the same content. To remove this duplication, a canonical tag pointing to the preferred URL is placed on all the pages that have duplicate content.
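The tag goes in the head of each duplicate page and points to the preferred URL (the URL shown is a placeholder):

```html
<head>
  <!-- Tells search engines which URL is the preferred version of this content -->
  <link rel="canonical" href="https://www.example.com/blog/on-page-seo-checklist/">
</head>
```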
All these factors are important for the website and should be implemented to get the desired results.

