A Visual Guide to Google Search Console Removals Tool

Google Search Console is informative and gives you a convenient way to control a few aspects of how your web presence appears in Google’s SERPs. One of those is the Google Search Console Removals Tool, which is mostly informative but also lets you take some direct actions.

What Can You Do With the Google Search Console Removals Tool?

According to Google, with this functionality, you can:

  1. Temporarily block URLs from sites that you own from appearing in Google’s SERPs (more details to follow).
  2. See all “outdated content” removal requests from the last six months that were submitted via the Remove Outdated Content tool. These requests are specifically made by users who do not own your website property. For instance, say you notice that a page on your favorite blog no longer exists, or that its information is outdated or has changed significantly; you can submit this kind of request.
  3. See a history of requests that other Google users made in the past six months to have your URL(s) filtered out of SafeSearch because they felt the pages showed adult content. Google reviews the requests before taking any action. Like the outdated content request, this is Google’s way of letting users report inappropriate content.

Understanding Temporary Removals

As the name suggests, this feature is best reserved for situations where you want to quickly and temporarily remove URLs from the SERPs, or clear their snippets and cached versions until the next crawl.

To get started, click the “New Request” button and select the appropriate option based on your requirement. View the video below to see what the module looks like.

A 28-second clip that demonstrates what the Google Search Console Temporary Removals feature looks like.

As you can see from the video, you actually get a couple of useful options.

Option 1: Temporarily Remove URL

A snapshot of the “Temporarily Remove URL” pop-up window.

Here, you can submit a request on an individual URL basis or a prefix/path basis. Note that you’d need to be extra careful with the second choice because a wrong decision can be quite detrimental to your organic performance.
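
To see why the prefix option deserves that extra caution, here’s a minimal sketch of plain string-prefix matching, which is roughly how the option behaves (the URLs are hypothetical, and this is my own illustration, not Google’s implementation):

```python
# A prefix-based removal request matches every URL that begins with that
# string -- potentially sweeping up more pages than you intended.
prefix = "https://example.com/blog"

urls = [
    "https://example.com/blog",              # matches
    "https://example.com/blog/post-1",       # matches
    "https://example.com/blog-archive/old",  # also matches the raw prefix!
    "https://example.com/about",             # unaffected
]

matched = [url for url in urls if url.startswith(prefix)]
print(matched)  # everything except /about would be blocked
```

One overly broad prefix can pull an entire section of your site out of the SERPs, which is why an individual URL request is the safer default.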

In any event, once you submit a request, it achieves the following:

  1. Blocks URLs for about six months.
  2. Clears the cached content and the current snippet until the next crawl.

For permanent removals, Google recommends one of the following:

  1. Mark them as noindex via the meta robots tag (an ability that robots.txt does not offer).
  2. Do the same by leveraging the X-Robots-Tag HTTP header.
  3. Serve a 404 or 410: if you need to know the differences, you can check out my post here, where I discuss what separates these two from each other, as well as soft 404s.

Do not use robots.txt to block these URLs; robots.txt configures crawling behavior only, not indexing. In fact, if robots.txt blocks Googlebot from crawling a page, Googlebot never gets to see the noindex directive on that page. It’s a subtle distinction that many tend to forget.
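
To make those options concrete, here’s a minimal sketch of all three signals, assuming a Flask app (the routes and page contents are hypothetical):

```python
from flask import Flask, Response, abort

app = Flask(__name__)

# Option 1: noindex via a meta robots tag in the page's <head>.
@app.route("/retired-page")
def retired_page():
    return (
        "<html><head>"
        '<meta name="robots" content="noindex">'
        "</head><body>This page will drop out of the index.</body></html>"
    )

# Option 2: the same signal via the X-Robots-Tag HTTP header,
# which also works for non-HTML resources such as PDFs.
@app.route("/old-report.pdf")
def old_report():
    resp = Response(b"%PDF-1.4 ...", mimetype="application/pdf")
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp

# Option 3: a 410 Gone response, telling crawlers the page has been
# permanently removed (a 404 is also an option; see the post linked above).
@app.route("/deleted-page")
def deleted_page():
    abort(410)
```

Again, the first two signals only work if Googlebot is allowed to crawl the URLs in the first place.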

Option 2: Clear Cached URL

A snapshot of Google Search Console “Clear cached URL.”

Like option 1, you can request removals at the URL level or the prefix level. The difference from “Temporarily Remove URL” is that the page stays indexed; only the cached version and the snippet are cleared until Googlebot can crawl it again.

All in all, Google offers this functionality because it understands there might be situations where you’d want to quickly remove or clear URLs from its SERPs. It can come in especially handy after you’ve removed sensitive information from a page.

A Brief Overview of “Outdated Content”

Since the focus here is mainly on actual removals, and because most website owners will never have to deal with this (so far, I haven’t), I will keep this section brief.

Essentially, outdated content requests can be made by non-site owners when they feel that what Google is displaying in the SERPs is no longer valid, or that the pages no longer exist. I’ve rarely heard of anyone dealing with this, but Google has full details on this aspect here.

A Quick Rundown of the “SafeSearch Filtering”

Conceptually similar to outdated content requests, “SafeSearch filtering” requests can again be made by non-site owners who deem your pages to be displaying adult content. Naturally, the submissions are reviewed by Google; if Google agrees that your URL(s) indeed qualify as adult content, they’ll be tagged as such and filtered out of SafeSearch results.

If you disagree, you can challenge the decision by reporting your grievance in the Webmaster Forum.

Conclusion

Google Search is the quintessence of what a search engine should be, and Google Search Console is a direct line for webmasters, giving them a degree of control over how their website shows up in the SERPs.

Among the many things you can do with GSC, one is asking Google to remove URLs from the SERPs or clear their cache; another is reviewing the external requests, in the form of outdated content and SafeSearch filtering reports, that other Google users have made about your website.