Although Google has publicly recommended since 2014, through a video by Matt Cutts, that JavaScript and CSS files be made accessible to Googlebot, it has only now started sending warning messages to webmasters about it.
Simply put, Googlebot now understands CSS and JavaScript well enough that Google is confident using this information when ranking websites. From your CSS and JavaScript files it can tell whether your site is responsive, whether further content is loaded on scroll, and so on.
In order to allow access to CSS and JavaScript files, edit your robots.txt file to include:
User-agent: *
Allow: /*.js
Allow: /*.css
These rules will work as long as no more specific rule overrides them: when several patterns match a URL, Google applies the longest (most specific) one, and the least restrictive rule wins ties. For example, on all of our e-commerce sites we have a rule that prevents Google from crawling cart pages (we don't need those pages indexed):
Disallow: /*cart*
This rule will also block Google's access to JavaScript files like these:
http://example.com/js/ajaxaddtocart.js
http://example.com/js/cart/main.js
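To see why the Disallow rule wins here, below is a minimal Python sketch (an approximation, not Google's actual parser) of the documented matching behaviour: every rule whose pattern matches the start of the URL path is considered, the longest pattern wins, and Allow beats Disallow when lengths are equal. The rules and URLs mirror the examples above; the is_allowed() helper is hypothetical.

import re
from urllib.parse import urlparse

# Rules from the robots.txt above: allow JS/CSS, but disallow anything
# whose path contains "cart".
RULES = [
    ("allow", "/*.js"),
    ("allow", "/*.css"),
    ("disallow", "/*cart*"),
]

def _pattern_to_regex(pattern):
    # Translate a robots.txt pattern: '*' matches any sequence, '$' anchors
    # the end, everything else is literal. Matching starts at the beginning
    # of the URL path.
    parts = []
    for ch in pattern:
        if ch == "*":
            parts.append(".*")
        elif ch == "$":
            parts.append("$")
        else:
            parts.append(re.escape(ch))
    return re.compile("".join(parts))

def is_allowed(url, rules=RULES):
    # The most specific (longest) matching pattern decides; Allow wins ties.
    path = urlparse(url).path or "/"
    best = None
    for verdict, pattern in rules:
        if _pattern_to_regex(pattern).match(path):
            candidate = (len(pattern), verdict == "allow")
            if best is None or candidate > best:
                best = candidate
    return True if best is None else best[1]

print(is_allowed("http://example.com/js/ajaxaddtocart.js"))  # False: /*cart* (7 chars) beats /*.js (5)
print(is_allowed("http://example.com/js/cart/main.js"))      # False: same reason
print(is_allowed("http://example.com/js/slider.js"))         # True: only /*.js matches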
To fix this, you must be more specific when allowing JavaScript and CSS files, with rules like the following:
User-agent: *
Allow: /*.js
Allow: /*.css
Allow: /cart/*.js
Allow: /cart/*.css
Allow: /checkout/*.js
Allow: /checkout/*.css
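Re-running the hypothetical is_allowed() helper from the sketch above, with these longer Allow patterns added next to the existing Disallow, shows why the fix works:

# Extended rule list: the same Disallow plus the more specific Allow patterns.
RULES_FIXED = [
    ("allow", "/*.js"),
    ("allow", "/*.css"),
    ("allow", "/cart/*.js"),
    ("allow", "/cart/*.css"),
    ("allow", "/checkout/*.js"),
    ("allow", "/checkout/*.css"),
    ("disallow", "/*cart*"),
]

print(is_allowed("http://example.com/cart/main.js", RULES_FIXED))  # True: /cart/*.js (10 chars) beats /*cart* (7)
print(is_allowed("http://example.com/cart/", RULES_FIXED))         # False: the cart page itself stays blocked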
You can check which files are blocked in the Google Search Console (Webmaster Tools) dashboard, under Google Index -> Blocked Resources. No CSS or JavaScript files should be listed there.
After you make your edits, be sure to go into Crawl -> robots.txt Tester and test your URLs.
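If you want a quick look at what your live site is actually serving before opening the Tester, a minimal sketch like the one below simply fetches robots.txt and prints its rule lines (example.com is a placeholder for your own domain; the request will fail if the file does not exist):

from urllib.request import urlopen

# Fetch the live robots.txt and print only the directive lines, so you can
# confirm the new Allow rules are actually deployed.
with urlopen("http://example.com/robots.txt") as response:
    robots_txt = response.read().decode("utf-8", errors="replace")

for line in robots_txt.splitlines():
    if line.strip().lower().startswith(("user-agent", "allow", "disallow")):
        print(line)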