Custom Robots.txt Generator for Blogger & WordPress


The tool generates the file from the following templates, filling in your sitemap URL on the Sitemap: line.

For Blogger:

User-agent: *
Disallow: /search/
Disallow: /blog/page/
Allow: /
Sitemap:

For WordPress:

User-agent: *
Allow: /wp-content/uploads/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /blog/page/
Disallow: /search/
Sitemap:

Generate or Add a Robots.txt File on Blogger

  1. Choose the Blogger platform at the top of the Robots.txt generator tool.
  2. Then enter your website URL, including 'https://', in the input box.

Make sure you enter the URL exactly as it appears on your website. For example, if your site uses 'www', enter it as "https://www.yourwebsite.com"; in my case the site doesn't use 'www', so I leave it out.

  3. Click the Generate button and wait while the tool generates the file for your website.
  4. Once you have the file, copy it and go back to your Blogger dashboard.

Method 1

  • Navigate to Settings >> Search Preferences >> Crawlers and Indexing >> Custom Robots.txt >> Edit, paste the generated robots.txt into the box, select 'Yes', and save the settings.
  • Your file is now live on your website; you can view it by adding '/robots.txt' to the end of your website URL.
  • e.g. https://pawanblogs.com/robots.txt
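For reference, a generated Blogger robots.txt usually looks like this (using the hypothetical domain https://www.yourwebsite.com; the tool fills in your own sitemap URL):

User-agent: *
Disallow: /search/
Disallow: /blog/page/
Allow: /
Sitemap: https://www.yourwebsite.com/sitemap.xml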

Method 2

  • Go to Pages, create a new page, set its URL to "/robots.txt", and paste the copied robots.txt content in HTML view.
  • Once you are done, click Publish; you can then view your website's robots.txt at "https://yourwebsite.com/robots.txt".

Generate or Add a Robots.txt File on WordPress

  1. Choose the WordPress platform at the top of the Robots.txt generator tool.
  2. Then enter your website URL, including 'https://', in the input box.

Make sure you enter the URL exactly as it appears on your website. For example, if your site uses 'www', enter it as "https://www.yourwebsite.com"; in my case the site doesn't use 'www', so I leave it out.

  3. Click the Generate button and wait while the tool generates the file for your website.
  4. Once you have the file, copy it and go back to your WordPress dashboard.

Method 1

  • On WordPress, you need the 'Yoast SEO' plugin to add a robots.txt file. If you don't have it yet, install it now.
  • Go to 'Yoast SEO' >> 'Tools' and open the 'File editor'.
  • Edit your robots.txt, paste the generated code there, and click 'Save changes to robots.txt'.

Your file is now live on your website; you can view it by adding '/robots.txt' to the end of your website URL.
e.g. https://pawanblogs.com/robots.txt
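For reference, a generated WordPress robots.txt usually looks like this (again using the hypothetical domain https://www.yourwebsite.com; your sitemap URL may differ, for example Yoast SEO typically serves it at /sitemap_index.xml):

User-agent: *
Allow: /wp-content/uploads/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /blog/page/
Disallow: /search/
Sitemap: https://www.yourwebsite.com/sitemap_index.xml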

Method 2

If you don't have access to Yoast or another SEO plugin, you can use an alternative method and place the robots.txt file directly in public_html via cPanel.

  • First, log in to cPanel and open the File Manager.
  • Open the public_html folder and create a new file named robots.txt.
  • Paste the copied code into the newly created robots.txt file and save it.

The robots.txt file has now been added to your domain.

Robots.txt File

Put simply, the robots.txt file tells search engine bots or crawlers which parts of your site you want indexed or shown on search engines such as Google, Bing, and Yahoo.

The robots.txt file sits in the root of a web server, where you can define rules for web crawlers, such as allowing or disallowing certain assets from being crawled. Web crawlers follow those rules and crawl what you want.


Search Engine Bots

Search engines, for the most part, rely on automated software agents called spiders, crawlers, robots, and bots. These bots seek out content across the Internet and within individual web pages, and they are a key part of how search engines operate.

Every search engine has its own bot for crawling data: Google has Googlebot, Bing has Bingbot, Yahoo has 'Slurp', DuckDuckGo has 'DuckDuckBot', Baidu has 'Baiduspider', Yandex has 'YandexBot', Sogou has 'Sogou Spider', Exalead has 'Exabot', Facebook has 'Facebook External Hit', and Alexa has the 'Alexa crawler'.

These are some of the popular search engines and their bots, crawlers, and spiders.

User-agent:

User-agent: specifies the search engine bots, crawlers, spiders, or other automated clients that crawl your pages, posts, images, videos, and other website content.

If you use User-agent: *, the rules that follow are generic and apply to all search engine bots; you can also add rules targeted at a specific bot.

Such as:

User-agent: Googlebot
Disallow: /no-index/your-page.html

Here we set the rule for Google's crawler, 'Googlebot'; you can do the same for other bots.
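You can also combine a generic group with bot-specific groups in the same file (the paths below are only illustrative):

User-agent: *
Disallow: /search/

User-agent: Bingbot
Disallow: /private-page.html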

Disallow:

Disallow: is used to block the things you don't want indexed; you can set it for a specific page or for an entire path or root.

If you use only /, it refers to the entire root of the domain.

For example: Disallow: /
This disallows crawlers from crawling or indexing the entire domain.

This is only for your understanding of robots.txt rules; don't use these directives without knowing exactly what they do.

Disallow: /search/
Here we disallow bots from crawling the search result pages that your visitors generate.


In this way, you can add a Disallow rule to hide anything you don't want to show on Google or other search engines.
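For example, you can disallow an entire directory or just a single page (the paths are illustrative; lines starting with # are comments):

User-agent: *
# Block everything under a directory
Disallow: /wp-admin/
# Block one specific page
Disallow: /private-page.html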

Allow:

Allow: is used to permit the things you want indexed; you can set it for a specific page or for an entire path or root.

Allow: /
This allows the crawler to crawl or index the entire domain.

Allow: /search/
Here we allow bots to crawl the search result pages that your visitors generate.


In this way, you can add an Allow rule for anything you want to show on Google or other search engines.
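Allow is especially useful for making an exception inside a disallowed path; that is why the WordPress template above allows admin-ajax.php even though /wp-admin/ is disallowed:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php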

Sitemap:

The sitemap is a file that provides information about your posts, images, videos, and other site content. It helps crawlers know which things you want indexed; without it, the crawler has to crawl the whole domain first and then index things according to your Allow and Disallow rules.

That takes a lot of time and can also lead to errors if you skip it.

Here are some example sitemap entries:

Sitemap: https://pawanblogs.com/sitemap.xml
Sitemap: https://pawanblogs.com/post-sitemap.xml
Sitemap: https://pawanblogs.com/sitemap_index.xml
Sitemap: https://pawanblogs.com/page-sitemap.xml

The directive is written as Sitemap:, followed by your website's sitemap URL, in the robots.txt file.

About the Robots.txt Generator Tool

This tool is built with JavaScript: it takes the value from the input box (where you enter your website URL), inserts it into the basic robots.txt structure, and generates a robots.txt file for your Blogger or WordPress website.
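A minimal sketch of how a generator like this could work is shown below; the templates are the ones listed at the top of this page, while the function and variable names are only illustrative and not the tool's actual code:

// Minimal sketch: build a robots.txt string from a site URL (illustrative only).
const TEMPLATES = {
  blogger: [
    "User-agent: *",
    "Disallow: /search/",
    "Disallow: /blog/page/",
    "Allow: /",
  ],
  wordpress: [
    "User-agent: *",
    "Allow: /wp-content/uploads/",
    "Allow: /wp-admin/admin-ajax.php",
    "Disallow: /wp-content/plugins/",
    "Disallow: /wp-admin/",
    "Disallow: /blog/page/",
    "Disallow: /search/",
  ],
};

function generateRobotsTxt(platform, siteUrl) {
  // Remove any trailing slash so the sitemap URL is well-formed.
  const base = siteUrl.replace(/\/+$/, "");
  const lines = TEMPLATES[platform].slice();
  lines.push("Sitemap: " + base + "/sitemap.xml");
  return lines.join("\n");
}

// Example usage:
// generateRobotsTxt("blogger", "https://www.yourwebsite.com");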

You can modify the file as needed before you add it to your website. The robots.txt can take some time to appear or take effect, so avoid changing it over and over; doing so may cause errors.

The source code is not public right now; if you want it, contact me on Instagram.
