“Googlebot cannot access CSS and JavaScript files” in Google Search Console – A New Google Warning!!!

If you received a warning from Google in your email today, don’t be alarmed: you are not the only one. Many webmasters received the same alert stating that “Googlebot cannot access your JavaScript and/or CSS files.” Google sent the warning through Search Console, reminding site owners that Googlebot’s inability to access those files may result in “suboptimal rankings.”

Google wrote late last year that blocking JavaScript and CSS through your site’s robots.txt file could result in reduced rankings and indexation. Google did this specifically so that all of your website’s resources, such as JavaScript and CSS files, can be crawled, which helps Google fully understand your site’s content. The Google indexing system renders web pages using a page’s HTML code as well as its resources, such as JavaScript, CSS, and image files. You can use the Fetch as Google and robots.txt tester tools in Search Console to debug the directives in your robots.txt file and to see which page resources Googlebot is not able to crawl.
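For a concrete illustration (the paths here are made up, not taken from Google’s notice), a robots.txt like the one below would trigger this warning if your stylesheets and scripts live in the blocked directories. Browsers would still load those files, but Googlebot’s renderer would be locked out:

User-agent: *
Disallow: /includes/
Disallow: /assets/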

Here you can see a copy of the notice in Google Search Console.

[Screenshot: the notice as it appears in Google Search Console]

If your site has been blocking Googlebot from accessing those files, it’s good to know about it so you can deal with the problem. You can easily fix the issue by editing your site’s robots.txt file. Go ahead with this fix if you’re comfortable editing that file.

Glance through the robots.txt file for any of the following lines:
Disallow: /*.php$
Disallow: /*.css$
Disallow: /*.inc$
Disallow: /*.js$

Remove any of those lines you find; they are what’s blocking Googlebot from crawling the files it needs to render your site the way other users see it.
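After removing them, a stripped-down robots.txt might look something like the sketch below. The /private/ path is only a placeholder for anything you genuinely want to keep out of search; it is not part of Google’s notice:

User-agent: *
Disallow: /private/

An empty Disallow: line (nothing after the colon) also works if you simply want to allow crawling of everything.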

The next step is to run your site through Google’s Fetch and Render tool, which will verify whether or not you have solved the issue. If Googlebot is still being blocked, the tool will provide further instructions on changes to make to the robots.txt file. In addition, you can identify any other crawling issues by using the robots.txt testing tool in Search Console.
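If you would rather sanity-check a specific file outside of Search Console, here is a rough Python sketch that fetches your robots.txt and applies a simplified version of Google’s longest-match rule, including the * and $ wildcard extensions. The site URL and resource path are placeholders, and this is an approximation of the tester, not a full robots.txt parser:

import re
import urllib.request

# Placeholders – point these at your own site and one of your CSS/JS files.
SITE = "https://www.example.com"
RESOURCE_PATH = "/assets/main.css"

def pattern_to_regex(pattern):
    """Turn a robots.txt path pattern (with Google's * and $ extensions)
    into an anchored regular expression."""
    ends_anchored = pattern.endswith("$")
    if ends_anchored:
        pattern = pattern[:-1]
    body = re.escape(pattern).replace(r"\*", ".*")
    return re.compile("^" + body + ("$" if ends_anchored else ""))

def blocked_for_googlebot(robots_txt, path):
    """Very simplified check: gather Allow/Disallow rules from groups that
    name Googlebot or *, then let the longest matching rule win
    (ties go to Allow, as Google's documentation describes)."""
    allows, disallows = [], []
    agents, applies = [], False
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()
        if ":" not in line:
            continue
        field, value = (part.strip() for part in line.split(":", 1))
        field = field.lower()
        if field == "user-agent":
            agents.append(value.lower())
            applies = "googlebot" in agents or "*" in agents
        elif field in ("allow", "disallow"):
            if applies and value:
                (allows if field == "allow" else disallows).append(value)
            agents = []  # rule lines end the current run of user-agent lines
    def longest(rules):
        matches = [len(r) for r in rules if pattern_to_regex(r).match(path)]
        return max(matches, default=-1)
    return longest(disallows) > longest(allows)

robots = urllib.request.urlopen(SITE + "/robots.txt").read().decode("utf-8", "replace")
print("Blocked for Googlebot:", blocked_for_googlebot(robots, RESOURCE_PATH))

Treat the official robots.txt tester as the source of truth; a script like this is only a quick way to spot obviously blocked stylesheets and scripts.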

Some webmasters are also getting alerts for blocked third-party resources; however, Google has previously said that third-party resources are not an issue, since they are generally outside of the webmaster’s control.
