Optimize your Robots.txt for better ranking & indexing

Better Robots.txt – WordPress Robots.txt Plugin by PagUp

  • Easily modify robots.txt to meet the needs of your website
  • Easily allow/disallow indexing for multiple search engines
  • Conveniently add a crawl-delay directive to conserve resources
  • Protect your backlink profile from spam backlinks
  • Ensure none of your pages gets overlooked for indexing
  • Safeguard your content against “bad bots”
  • Boost rankings with better loading performance on important pages

Boost your search engine ranking and improve site indexing with the SEO plugin Better Robots.txt for WordPress.

What is special about this product

  • Optimizes your robots.txt for SEO (crawling, visibility on SERPs, ...)
  • Protects data and original content from being stolen and republished
  • Compatible with Yoast SEO, WooCommerce, or any other plugin

Description

Better Robots.txt is a WordPress plugin by PagUp that helps you create an optimized robots.txt file for your website to improve your site's ranking. This WordPress robots.txt plugin lets you easily add instructions for the 16 most popular search engine bots to your robots.txt, protect your data from scraping by bad bots, and add custom instructions of your own.

 

Better Robots.txt – All Free Features 

1. Create the right robots.txt file

A robots.txt file contains instructions on which search engine bots are allowed or not allowed to crawl your website. Getting it right can help you rank better on search engines and improve your website's indexing.

A robots.txt file defines how search engines see your webpages. It is a sensitive file, and one wrong modification can block large parts of your site from search engines. The plugin helps you add specific instructions to your robots.txt file with a few simple clicks.
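
To give a concrete picture of what such instructions look like (the paths and rules below are a generic sketch, not the plugin's exact output), a robots.txt file is simply a set of user-agent groups with Allow and Disallow rules:

    # Rules for all bots
    User-agent: *
    # Keep crawlers out of the WordPress admin area...
    Disallow: /wp-admin/
    # ...but keep the AJAX endpoint that many plugins rely on reachable
    Allow: /wp-admin/admin-ajax.php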

2. Boost your website indexing

Use simple toggle buttons to add allow directives to your robots.txt. Select which of the 16 popular search engines you want to index your website; the corresponding directives ask each search engine's bot to read and index your pages.
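
For illustration, allowing two well-known bots to crawl everything could be expressed with the directives below; the bot names are common user agents and not necessarily the exact list the plugin covers:

    # Invite Google's and Bing's crawlers to the whole site
    User-agent: Googlebot
    Allow: /

    User-agent: Bingbot
    Allow: /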

3. Set Crawl-delay

Better Robots.txt makes it easy to add a crawl-delay directive to your robots.txt. This prevents aggressive bots from leeching all your resources with hundreds of requests; a crawl-delay directive instructs these bots to crawl your website at a more moderate pace.
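
The directive itself is a single line inside a user-agent group, with the value giving the number of seconds a bot should wait between requests (the bot name and value below are an example; some bots, such as Bingbot, honour Crawl-delay, while Googlebot ignores it):

    # Ask this bot to wait 10 seconds between requests
    User-agent: Bingbot
    Crawl-delay: 10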

4. Avoid Spam Backlinks

Spam backlinks from “spambots” can lower your search engine rankings. The plugin helps you prevent these backlinks from being picked up and indexed by search engine bots.

Better Robots.txt – All Premium Features

1. Add your Sitemap to robots.txt

A sitemap is an XML file that lists all the pages of your website and how they are linked to each other. Adding a sitemap to your robots.txt ensures that every page on your website is crawled by search engines.

You can generate a sitemap using the Yoast SEO plugin or any other sitemap generator. Better Robots.txt automatically detects it and adds it to your robots.txt.
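
The resulting entry is a single Sitemap line pointing at your sitemap's URL; with Yoast SEO the sitemap index typically lives at /sitemap_index.xml, though your domain and path may differ:

    Sitemap: https://www.example.com/sitemap_index.xml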

2. Protect Your Data

“Bad bots” can steal your original content, republish it, and hurt your search engine rankings. Prevent malicious, unregulated bots from scraping your data with Better Robots.txt. This WordPress robots.txt optimization plugin provides a list of the best-known malicious bots that you can easily block from your website, keeping your data secure.
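
Under the hood, blocking a scraper means denying its user agent across the whole site; the bot names below are placeholders for illustration rather than the plugin's actual blocklist:

    # Deny a (hypothetical) scraper bot everywhere on the site
    User-agent: ExampleScraperBot
    Disallow: /

    User-agent: ExampleCopyBot
    Disallow: /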

3. Optimize WooCommerce store crawlability

Your WooCommerce store has many pages, such as “filter”, “account”, and “add to cart” pages, that don’t necessarily need to be crawled. With Better Robots.txt you can optimize your store's crawlability with just a click, so that crawl resources are allocated to your more important pages.
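
A typical set of rules for keeping crawlers away from low-value WooCommerce URLs might look like the sketch below; the paths are WooCommerce defaults and the wildcards follow the pattern syntax most major bots support, so treat it as an example rather than the plugin's exact ruleset:

    User-agent: *
    # Skip cart, checkout and account pages
    Disallow: /cart/
    Disallow: /checkout/
    Disallow: /my-account/
    # Skip add-to-cart actions and filter query URLs
    Disallow: /*add-to-cart=*
    Disallow: /*?filter_*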

Why get Better Robots.txt?

The robots.txt file is a goldmine when it comes to SEO, as long as it is optimized and used in the right way. However, robots.txt is also a sensitive file, and wrong modifications can be detrimental to your website. Better Robots.txt is an easy way to safely optimize your robots.txt file, improve your website's SEO, and rank higher on search engine results pages.

Better Robots.txt FAQs

  • What is a robots.txt file?

Robots.txt is a text file that tells search engine bots how to crawl pages on a website. It indicates whether certain user agents (web-crawling software) can or cannot crawl parts of the site. These crawl instructions are specified by “allowing” or “disallowing” the behavior of particular user agents.

  • What is a sitemap?

Sitemaps are an easy way to inform search engines about the pages on a site that are available for crawling. In its simplest form, a sitemap is an XML file that lists the URLs of a site along with additional metadata about each URL.

  • Is the plugin translatable?

The plugin is translated and supported in seven languages – Chinese, English, French, German, Portuguese, Russian, and Spanish.

Tech Details

  • Version: 1.2.9.2
  • Last Updated: Aug 2, 2019
  • Downloads: 0
  • Required WP Version: 5.3
  • Compatibility: WooCommerce
  • Compatible Browsers: Chrome, Firefox, Safari, IE
  • Files Included: HTML, JavaScript, PHP


Pricing

 
Free (Basic features) – includes lifetime updates and limited support.

Premium (Production + Staging) – $139/year with updates and support for one year, or $639 for a lifetime license with lifetime updates and support.

Features

  • Create the right robots.txt file – easily modify robots.txt to meet the needs of your website (Free and Premium)
  • Boost your website indexing – easily allow/disallow indexing for multiple search engines (Free and Premium)
  • Set Crawl-delay – conveniently add a crawl-delay directive to conserve resources (Free and Premium)
  • Avoid spam backlinks – protect your backlink profile from spam backlinks (Free and Premium)
  • Add your Sitemap to robots.txt – ensure none of your pages gets overlooked for indexing (Premium only)
  • Protect Your Data – safeguard your content against “bad bots” (Premium only)
  • Optimize WooCommerce store crawlability – boost rankings with better loading performance on important pages (Premium only)
100% Risk Free - no hidden charges


Support

For support queries, contact us at [email protected]. We will put you in touch with the Episeller, who will resolve your queries as quickly as possible.