Issues

Response Codes: External Blocked Resource


External Blocked Resource

External resources (such as images, JavaScript and CSS) that are blocked from being used for rendering by robots.txt or an error.

This can be an issue, as search engines might not be able to access critical resources needed to render pages accurately.

How to Analyse in the SEO Spider

Use the ‘Response Codes’ tab with the ‘External’ and ‘Blocked Resource’ filters to view these URLs.

This filter will only populate when JavaScript rendering is enabled; in the default ‘text only’ crawl mode, blocked resources appear under ‘Blocked by Robots.txt’ instead.

Blocked resources can be viewed by URL in the ‘Rendered Page’ tab, and pages containing blocked resources are listed under ‘JavaScript > Pages with Blocked Resources’.

What Triggers This Issue

This issue is triggered when external resources match disallow directives within the external site’s robots.txt. For example, a directive such as:

Disallow: /wp-includes/

in the robots.txt file would block search engines from crawling any resources with URLs that begin with

https://www.screamingfrog.co.uk/wp-includes/

Such as:

https://www.screamingfrog.co.uk/wp-includes/js/jquery/jquery.min.js
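
If you want to check a specific resource against a robots.txt file outside of the SEO Spider, Python’s standard library robots.txt parser can reproduce the basic matching. The following is a minimal sketch reusing the example URLs above; note that Python’s parser does not implement every nuance of Google’s matching (wildcards, for instance), so treat the result as indicative rather than definitive.

from urllib import robotparser

# Fetch and parse the external site's robots.txt (example URL from above)
parser = robotparser.RobotFileParser()
parser.set_url("https://www.screamingfrog.co.uk/robots.txt")
parser.read()

# Check whether Googlebot may fetch the example resource.
# Returns False if it matches a rule such as 'Disallow: /wp-includes/'.
resource = "https://www.screamingfrog.co.uk/wp-includes/js/jquery/jquery.min.js"
print(parser.can_fetch("Googlebot", resource))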

How To Fix

If the resources are critical to your content, update the external subdomain’s robots.txt and resolve any errors to allow them to be crawled and used for rendering of the website’s content.
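
As an illustrative example only, if just part of a blocked directory is needed for rendering, the external site’s robots.txt could keep the existing disallow but add a more specific allow rule, since for Googlebot the most specific (longest) matching rule takes precedence. The paths below simply reuse the example above:

User-agent: *
Disallow: /wp-includes/
Allow: /wp-includes/js/

Once the robots.txt has been updated, the fix can be verified by re-crawling with JavaScript rendering enabled and confirming the resources no longer appear under this filter.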
