How to Find My Blog on Google Search
How to check whether your blog is present in Google Search Result
Hello guys, if you are reading this article, surely by now you must have created a blog with some posts and tried to index it through Google Search Console. If not yet, check out
Now, if even after manually submitting your site for indexing you are unable to find your article in Google Search results, or if you have any of the queries below:
- Find Blogger Blogs
- How to Find My Blog on Google Search
- Why can't I find my blog on Google
- When will my blog appear in Google search
- Can't find my blog on Blogger
- How to check whether your blog is present in Google Search Result
Google Search Result for checking blog visibility
Now, if your blog is still not visible in Google Search results, one of the reasons below may apply:
1. Your Website is very new
If you launched the site just this morning and are now wondering why it hasn't appeared in the search results, that won't work. It simply means Google has not found it yet.
Still, to confirm your doubt, you can use the 'site:your_url' trick above.
If it shows at least a single result, your site has been discovered; only that page (or those pages) has been indexed so far, and the remaining pages still need to be indexed.
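For example, if your blog were hosted at example.blogspot.com (a placeholder address, not a real blog), the search query would look like:

```
site:example.blogspot.com
```

Typing this into the Google search box lists every page of that site that Google has indexed; zero results means the site has not been discovered yet.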
If you have many pages, you need to create a sitemap.xml and upload it to GSC (Google Search Console) to get all the pages crawled.
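As a rough sketch, a minimal sitemap.xml for a blog with two posts could look like this (the URLs and dates are placeholders for illustration, not real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want crawled -->
  <url>
    <loc>https://example.blogspot.com/2023/01/first-post.html</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.blogspot.com/2023/02/second-post.html</loc>
  </url>
</urlset>
```

Blogger generates this file automatically at /sitemap.xml; for other platforms you may need a plugin or generator.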
2. Blocking Search Engines from Indexing Your Pages
There is an option through which you can manually prevent a page from being indexed.
This is done with the "noindex" meta tag, an HTML snippet like the one below:
<meta name="robots" content="noindex"/>
Adding this tag to a page's HTML will keep it out of the index even if the page is listed in sitemap.xml and submitted to Google Search Console.
If the tag was added for a development reason and you forget to revert it, Google Search Console reports it as the error "Submitted URL marked 'noindex'".
There are also many site-audit websites where you can check your site for potential SEO issues, including stray "noindex" tags.
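If you'd rather check a page yourself, a small script can scan its HTML for a robots meta tag. This is a minimal sketch using only Python's standard library; the sample HTML string is made up for illustration:

```python
from html.parser import HTMLParser


class NoindexChecker(HTMLParser):
    """Flags whether a robots meta tag with 'noindex' appears in the HTML."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True


def has_noindex(html: str) -> bool:
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex


# Example: a page that blocks indexing
page = '<html><head><meta name="robots" content="noindex"/></head></html>'
print(has_noindex(page))  # True
```

For a live page you would first download its HTML (e.g. with urllib) and pass it to has_noindex.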
3. Blocking Search Engines from Crawling Your Pages
Most websites use a robots.txt file, which instructs robots (crawlers) about which pages to crawl and which not to.
If a URL is disallowed by robots.txt, Google will not crawl it, even if it is listed in your sitemap.
Google Search Console also surfaces this under the "Coverage" report, alerting the user with "Submitted URL blocked by robots.txt".
You can also check the robots.txt file manually by visiting the URL 'Your_website_URL.com/robots.txt'.
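To test robots.txt rules without guessing, Python's standard library ships a parser for exactly this format. The rules and URLs below are hypothetical, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: everything is allowed except /drafts/
rules = [
    "User-agent: *",
    "Disallow: /drafts/",
]

rp = RobotFileParser()
rp.parse(rules)

# A crawler such as Googlebot may fetch a published post but not a draft
print(rp.can_fetch("Googlebot", "https://example.com/2023/my-post.html"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/drafts/wip.html"))    # False
```

Replacing the rules list with the lines of your own robots.txt lets you verify whether a given post URL is blocked before filing it as a Search Console mystery.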