How to Fix Critical On-Page SEO Factors on Your Website

We should concentrate on easy-to-recognize, straightforward-to-fix issues. Most of these problems can be uncovered in an afternoon, and fixing them can resolve months of traffic issues. These may not be the deep, complex issues that settle SEO once and for all, but they are simple things to check right now. If your site already passes these checks, you can go home today and start decoding RankBrain tomorrow.

I’ll define technical SEO here as the more technical aspects of a site that the average marketer wouldn’t identify and that take some expertise to uncover. Technical SEO issues are also generally, though not always, site-wide problems rather than page-level problems. Fixing them can improve your site as a whole, rather than just isolated pages.


  1. Check indexing of site pages

Type site:yourdomain.com into Google search and you’ll immediately see how many pages on your site are indexed.

What to do next:

* Go further and check different buckets of pages on your site, such as product pages and blog posts

* Check subdomains to make sure they’re being indexed (or not)

* Check old versions of your site to see whether they’re mistakenly being indexed instead of redirected

* Look out for spam; if your site was hacked, dig deep into the results to look for anything unusual (like pharmaceutical or gambling SEO site-hacking spam)

* Figure out precisely what’s causing indexing issues.
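The old-versions check above can be automated: each legacy URL should answer with a permanent redirect, not a live page. Below is a minimal sketch of the decision logic; the function name and messages are illustrative, not part of any specific tool, and a real run would fetch each URL to get its status code.

```python
def classify_old_url(status, location=None):
    """Classify how a legacy URL responds. It should permanently
    redirect (301/308) to the current site; a 200 means the old
    page is still live and risks being indexed as a duplicate."""
    if status in (301, 308):
        return "redirected to %s" % location
    if status == 200:
        return "still live - risk of duplicate indexing"
    if status == 404:
        return "gone - consider redirecting instead"
    return "unexpected status %d" % status


# Hypothetical example responses:
print(classify_old_url(301, "https://example.com/new-page"))
print(classify_old_url(200))
```

In practice you would loop this over a list of known old URLs, fetching each one and passing its status code and `Location` header to the classifier.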

  2. Robots.txt

Go to your site’s robots.txt file and make sure it doesn’t show “User-agent: * Disallow: /”.

What to do next:

If you see “Disallow: /”, talk to your developer immediately. There could be a good reason it’s set up that way, or it might be an oversight.

If you have a complex robots.txt file, as many e-commerce sites do, you should review it line by line with your developer to make sure it’s correct.
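A line-by-line review can be double-checked programmatically. Python’s standard-library `urllib.robotparser` will tell you whether a given URL is blocked under a set of rules; the rules and URL below are hypothetical examples showing the dangerous configuration discussed above.

```python
from urllib.robotparser import RobotFileParser

# The dangerous configuration: blocks every crawler from the entire site.
rules = """User-agent: *
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Under these rules, no agent may fetch anything on the site.
print(parser.can_fetch("Googlebot", "https://example.com/products/widget"))  # False
```

Running the same check with your real robots.txt against a handful of important URLs (home page, a product page, a blog post) quickly confirms nothing critical is blocked.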

  3. Meta robots NOINDEX

NOINDEX can be even more damaging than a misconfigured robots.txt in some cases. A misconfigured robots.txt won’t pull your pages out of Google’s index if they’re already there, but a NOINDEX directive will remove every page that carries it.

Most commonly, NOINDEX is set up while a site is in its development stage. Since so many web development projects run behind schedule and are pushed live at the last hour, this is where the mistake can happen.
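A leftover NOINDEX tag can be caught before (or just after) launch by scanning each page’s HTML for a meta robots directive. Here is a minimal sketch using Python’s standard `html.parser`; the sample HTML is a hypothetical page, and in practice you would feed the parser each fetched page’s source.

```python
from html.parser import HTMLParser


class NoindexFinder(HTMLParser):
    """Flags a page whose <meta name="robots"> content includes noindex."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots" and \
                    "noindex" in (d.get("content") or "").lower():
                self.noindex = True


# Hypothetical page left over from development:
page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
finder = NoindexFinder()
finder.feed(page)
print(finder.noindex)  # True
```

Run this across your key templates (home, category, product, blog) on launch day and any page that reports True needs the tag removed before Google recrawls it.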

Learn more about critical on-page SEO factors to make your website SEO-friendly; enrol in the free course at

