How to Create the Best robots.txt and Upload It to Your Site


SEO is a powerful tool to increase your website’s visibility, which ultimately brings more traffic to your website in both quality and quantity. The traffic it brings in is called organic traffic. There are many ways to improve your website’s ranking on SERPs (Search Engine Result Pages) using SEO, but today we will talk about a small but essential technique that many people overlook.

It will make your website’s reach better on the web. 

We are talking about the robots.txt file (also known as robots exclusion protocol or standard).

This small text file can be part of any website on the internet and can improve your SEO. It’s easy to implement and not time-consuming, and you will be thanking us by the end of this blog.

So what is a robots.txt file?

The robots.txt file is a text file that tells web robots (most commonly, search engine crawlers) which pages on your site to crawl and which not to crawl.

Why would you NOT want a search engine to crawl some of your pages? Isn’t it the point of SEO to increase your visibility and not decrease it?

Well, yes, you want search engines to read your website, but search engine crawlers such as Googlebot have a crawl budget, and you have to make the best use of it.

If a bot spends its budget crawling pages with duplicate content, low-quality content, error messages, or thank-you notes, it wastes time and resources and can hurt your SEO ranking. You want search engines to crawl and read your best and most relevant pages as quickly and effortlessly as possible.

This is where the robots.txt file comes in handy.

Why do we need a robots.txt file?

If you have pages that you do not want appearing in search results, such as login pages, you can use the robots.txt file to block those pages and steer Googlebot toward the remaining pages on your site. (Keep in mind that robots.txt only discourages crawling; it is not a security mechanism, since the file itself is publicly readable.)

It works better than meta directives (robots meta tags) here, because meta directives cannot be applied to multimedia content such as PDFs and images.

Sometimes a website has a crawl budget problem, and to minimize it, you can instruct Googlebot to visit only the pages that matter.

How do you create a robots.txt file?

It’s a straightforward process. Create a new robots.txt file using a plain text editor like Notepad. If you use Microsoft Word, it may insert extra formatting of its own, so stick to a plain text editor. You may also use a free online plain text editor like Editpad.org.

If you already have a robots.txt file, then delete the text, but not the file.

The basic skeleton of a robots.txt file looks like this:

User-agent: *
Disallow: /

User-agent names the robot you are addressing, that is, the search engine crawler. You can specify a particular crawler or use an asterisk, which means the rule applies to all of them.

The next line, “Disallow,” does what it says: it lists the path on your website you wish to exclude from crawling, starting with a forward slash. A single “/” on its own, as above, disallows the entire site.

For example, if you wish to disallow the directory named xyz, you write the rule as follows:

User-agent: *
Disallow: /xyz/

Add the part of the URL that comes after the domain name; this path identifies the directory. If you write nothing after Disallow, bots are still allowed to crawl every page.
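If you prefer to generate the file with a script, here is a minimal sketch in Python. It writes the file to the current working directory and reuses the hypothetical /xyz/ path from the example above; both are assumptions you should adapt.

```python
# Write a minimal robots.txt as plain ASCII text, the format crawlers expect.
# The path /xyz/ is the hypothetical example directory from this article.
rules = "User-agent: *\nDisallow: /xyz/\n"

with open("robots.txt", "w", encoding="ascii", newline="\n") as f:
    f.write(rules)

# Read it back to confirm no editor-style formatting crept in.
with open("robots.txt", "r", encoding="ascii") as f:
    print(f.read())
```

Writing with an explicit ASCII encoding guards against the hidden formatting a word processor would add.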

If there is a page or subdirectory named abc inside your disallowed parent directory that you do want crawled, you can add an Allow line below the Disallow:

Allow: /xyz/abc
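You can sanity-check rules like these before publishing them, using Python’s standard-library robots.txt parser. One caveat: urllib.robotparser applies rules in the order they appear (first match wins), so in this sketch the narrower Allow line is listed before the broader Disallow:

```python
from urllib.robotparser import RobotFileParser

# Parse the rules directly, without fetching them from a server.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Allow: /xyz/abc",   # listed first so it takes precedence here
    "Disallow: /xyz/",
])

# Anything under /xyz/ is blocked, except the allowed /xyz/abc.
print(rp.can_fetch("AnyBot", "https://example.com/xyz/secret"))  # False
print(rp.can_fetch("AnyBot", "https://example.com/xyz/abc"))     # True
print(rp.can_fetch("AnyBot", "https://example.com/"))            # True
```

Google’s own parser uses longest-match precedence instead of first-match, so ordering Allow before Disallow keeps the file unambiguous for both styles.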

So, this is how you write a robots.txt file. It looks simple, but it does a lot of work!

How to upload a robots.txt file?

Upload the finished robots.txt file to your site’s root directory, or save it there if one already exists. To upload the file to your server, use your favorite FTP tool to log into your web server, then open the public_html folder and find your site’s root directory.

Depending on how your web host is configured, your site’s root directory may be directly inside the public_html folder or in a folder within it. Once you have the root directory open, drag and drop the robots.txt file into it.
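The same upload can be scripted with Python’s standard ftplib. This is only a sketch: the host, credentials, and the /public_html root path are hypothetical placeholders for your own server’s details.

```python
import ftplib
import io

# The rules to publish; /login/ is a hypothetical example path.
ROBOTS_TXT = "User-agent: *\nDisallow: /login/\n"

def upload_robots(host, user, password, root="/public_html"):
    """Upload robots.txt into the site's root directory over FTP."""
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(root)  # adjust if your site root sits deeper than public_html
        # STOR creates the file, or overwrites an existing one.
        ftp.storbinary("STOR robots.txt", io.BytesIO(ROBOTS_TXT.encode("ascii")))

# upload_robots("ftp.example.com", "user", "password")  # fill in real details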

You can also create the robots.txt file directly from your FTP client.

To do this, open your site’s root directory, right-click, and choose to create a new file. In the dialog box, type “robots.txt” (without quotes) and click OK.

You will see a new robots.txt file inside. Make sure you, as the owner, can read and write the file, while everyone else can only read it; search engines must be able to fetch it, but no one else should be able to change it. The file should show “0644” as the permission code, and if it doesn’t, right-click the file, select “file permission,” and change it to “0644”.
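If you manage the server directly rather than through an FTP client, the same 0644 permission (owner read/write, everyone else read-only) can be set and verified with Python’s standard library. A sketch, assuming robots.txt sits in the current directory on a Unix-like server:

```python
import os
import stat

# Create a placeholder robots.txt if one does not exist yet.
if not os.path.exists("robots.txt"):
    with open("robots.txt", "w") as f:
        f.write("User-agent: *\nDisallow:\n")

# 0644 = owner read/write; group and others read-only.
os.chmod("robots.txt", 0o644)

# Confirm the permission bits actually applied.
mode = stat.S_IMODE(os.stat("robots.txt").st_mode)
print(oct(mode))  # 0o644
```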

So, there you go. You now have a fully functional robots.txt file that can sharpen your SEO and make a real difference.

Go ahead, give it a go, and see the difference for yourself!

Bottom line

There are billions of websites on the web. To help your website get crawled efficiently and rank well in Google searches, you need a quality robots.txt file.

It should be specific and give the bots clear instructions to follow. Do that, and you give the pages you care about the best chance of appearing high in Google search.

 2020 CuriousCheck, LLC ©. All rights reserved – Atlanta SEO Company
