How to Exclude WordPress Content from Google Search


As a website owner, your main goal is to rank high in Google search and attract visitors. But while indexing is good for your website, there may be pages you want to keep out of search results for various reasons. Some pages may not be relevant for all users, such as versions translated into another language, while others may contain private information. Sometimes website owners are working on new projects that need to be up on the website for testing but are not yet ready for the public eye.

No matter your reason, hiding WordPress content from Google search is definitely not complicated, as long as you know where to look. 

Use WordPress’ built-in feature

The simplest method to prevent search engines from indexing your website is to use the feature that WordPress already has built in. To enable it, follow these simple steps:

  1. From the admin panel, go to Settings and then to Reading.
  2. Find the Search Engine Visibility option and check the box next to “Discourage search engines from indexing this site”.
  3. Save changes to apply.

After you save the changes, WordPress edits the robots.txt file it serves so that search engines are discouraged from indexing your website.
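As an illustration, the robots.txt that WordPress serves with this option enabled looks roughly like this (the exact output can vary between WordPress versions):

```
User-agent: *
Disallow: /
```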

However, be aware that this applies to the entire website, not just particular content. So if, for example, you are building a website but are not entirely pleased with how it looks yet, you can keep it hidden from search engines until you decide it is ready to be out in the open.

Keep in mind that, in order to get visitors, you need to make your website visible to search engines as soon as possible. If you have doubts about the content, you can always use services such as Best Essay Education to proofread and edit it. This will not only ensure the content is high-quality but also help reduce the time spent publishing it.

Edit the robots.txt file

If you prefer, you can achieve the same result manually by editing the robots.txt file. Be careful when doing this, as adding incorrect directives can interfere with the entire site’s SEO, which is why this method is only recommended for advanced WordPress users.

To put it simply, the robots.txt file is a configuration file that lets a website give direct instructions to search engine bots, as the name suggests. The file sits in your website’s root directory, and you can edit it for SEO purposes.

To access and edit the file, use the File Manager in your hosting account’s cPanel (or connect with an FTP client). The robots.txt file should be located in the public_html folder. After opening the file, add the following lines to it:

User-agent: *

Disallow: /your-page/

The user-agent line targets the specific search engine bots you want your website excluded from; using the asterisk applies the rule to all of them. The next line should contain the path of the URL you want excluded from search results. For example, if the page you want to exclude is “http://yourdomain.com/2018/10/thank-you-note/”, the disallow line should look like this: Disallow: /2018/10/thank-you-note/.
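If you want to double-check that a rule behaves the way you expect before deploying it, you can parse it locally with Python’s standard-library robots.txt parser. This is only a sketch; the domain and path are the example values from above:

```python
from urllib.robotparser import RobotFileParser

# The rules we intend to put in robots.txt (example path from above).
rules = """User-agent: *
Disallow: /2018/10/thank-you-note/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The excluded page is blocked for every crawler...
print(parser.can_fetch("*", "http://yourdomain.com/2018/10/thank-you-note/"))  # False
# ...while the rest of the site remains crawlable.
print(parser.can_fetch("*", "http://yourdomain.com/about/"))  # True
```

Running this before uploading the file is a cheap way to catch a typo in a Disallow path.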

Additionally, there are a few other things you can achieve by editing the robots.txt file:

  • To block search engines from indexing the entire website, change the disallow line to: Disallow: /
  • To block search engines from crawling the files in a specific directory, for example a /pdfs/ folder, change the line to: Disallow: /pdfs/ 
  • For a specific image format, for example .GIF files, the line should look like this: Disallow: /*.gif$ 
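Putting these together, a robots.txt that applies several of the rules above at once might look like this (the paths are illustrative placeholders):

```
User-agent: *
Disallow: /your-page/
Disallow: /pdfs/
Disallow: /*.gif$
```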

Use the “noindex” meta tag 

More effective than editing the robots.txt file, the noindex meta tag blocks search engines from indexing specific pages of your website. The noindex meta tag goes in the <head> section of your webpage and looks like this:

<html>
<head>
<title>…</title>
<meta name="robots" content="noindex">
</head>

The above code lines will tell bots from almost all search engines not to index the page. If you want to add more specific commands, here are some other examples and what their results are:

  • <meta name="googlebot" content="noindex"> – prevents only Googlebot from indexing the page
  • <meta name="googlebot-news" content="noindex"> – prevents the page from showing up in Google News
  • <meta name="robots" content="nofollow"> – prevents bots from following the links on the page
  • <meta name="robots" content="notranslate"> – prevents search engines from offering a translated version of the page in search results
  • <meta name="robots" content="noimageindex"> – prevents images on the page from being indexed by search engines 

If you want to add multiple commands, you can do so by adding multiple meta lines one after another. 
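Alternatively, search engines also accept several directives combined in a single robots meta tag, separated by commas:

```html
<meta name="robots" content="noindex, nofollow, noimageindex">
```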

To add these lines of code to your website, you need to create a WordPress child theme and then edit its functions.php file. There, hook a small function into the ‘wp_head’ action so that it prints the meta tags inside the page’s <head> section.
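A minimal sketch of what that functions.php addition could look like, assuming you want to noindex a single page; the function name and the page ID 42 are illustrative placeholders, not values from this article:

```php
<?php
// In your child theme's functions.php.
// my_noindex_meta and the page ID 42 are placeholders for your own values.
function my_noindex_meta() {
    if ( is_page( 42 ) ) { // only on the page you want hidden
        echo '<meta name="robots" content="noindex">' . "\n";
    }
}
add_action( 'wp_head', 'my_noindex_meta' );
```

Because the function runs on the ‘wp_head’ hook, WordPress prints the tag inside <head> on every page load, and the is_page() check limits it to the one page you chose.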

It is important to keep in mind that, for these commands to work, bots need to be able to crawl your page; that is how they find the instructions and follow them. So, if you blocked bots from crawling your pages through the robots.txt file, make sure to remove that block. If bots can’t crawl your page and read your indexing instructions, and the page or documents are linked to by other websites, they may still end up indexed by search engines.
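To confirm that a rendered page actually carries the tag, you can scan its HTML with Python’s standard-library parser. A small sketch; the sample HTML string is made up for illustration, and in practice you would feed in the real page source:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", ""))

# Illustrative page source; fetch the real HTML of your page instead.
sample = ('<html><head><title>Hidden page</title>'
          '<meta name="robots" content="noindex"></head>'
          '<body></body></html>')

finder = RobotsMetaFinder()
finder.feed(sample)
print(finder.directives)  # ['noindex']
```

An empty list would tell you the tag never made it into the page, which usually means the theme change did not take effect.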

Password-protect a page

This method won’t necessarily keep search engines from indexing your pages, but the protected pages can only be accessed and viewed by someone who has the password. This is a simple feature provided by WordPress, in case you only want to give access to certain users.

To do so, simply go to the page or post you want to restrict and, in the Publish box, edit the visibility of the page. It is usually set to Public, but after clicking the Edit link you will see two more options: Private and Password protected. Private pages are only available to users who are logged in to the website and have at least an editor role. Password protected is the option you are looking for: click it and type in the desired password.

“This method is extremely useful if you want to hide pages that include, for instance, staff profiles or other sensitive information. You can give the password to those who need access to those pages and keep away unwanted visitors. Even if they somehow reach the page, without the password they can’t access the content,” says Linda Glover, a content writing and SEO expert at Studicus writing company.

Use a plugin specially designed for this

If you are a WordPress beginner, this is the easiest method on the list. You can find various tools and plugins that help you keep pages from being indexed by adding a “Noindex” command to a certain page. How does it work? Well, if a bot reaches a page with a noindex command, it will receive the message that you don’t want that page to be indexed and skip the entire content of the page. 

For example, the Yoast SEO plugin has, besides other useful SEO features, an option that allows you to turn the noindex and nofollow commands on or off. It is quite simple to use, as you only need to install the plugin and follow some instructions: 

  1. To install it, go to the Plugins section in WordPress and search for the Yoast SEO plugin, then install and activate it. 
  2. Then, open the WordPress editor for new or already existing posts and scroll down until you see the Yoast SEO window. 
  3. Along the left side of the window, you will see some icons. You need to click on the one that looks like a gear. 
  4. Click the dropdown under “Meta robots index” and change the setting to “noindex”. 
  5. Then, next to “Meta robots follow”, click the “nofollow” option. 
  6. Save changes and that’s it. 

Preventing search engines from accessing certain pages on your website may sound counterproductive, but it can turn out to be extremely effective in some cases. In the race to become one of Google’s top-ranked websites, it is easy to forget that you are allowing search engines to see and access all sorts of sensitive information that should not reach every user. Whether you use these exclusion methods to keep personal information hidden or simply to work on a project you want to reveal in the future, there are plenty of ways, for experts and beginners alike, to do so.

Angela Baker is a self-driven specialist who is currently working as a freelance writer at GrabMyEssay writing services and is trying to improve herself and her blogging career. She is always seeking to discover new ways for personal and professional growth and is convinced that it’s always important to broaden horizons. That's why Angela develops and improves her skills throughout the writing process to help to inspire people.
