Allowing External Javascript File To Be Crawled

I am facing an issue with my site in Google Search Console.

I am getting the error below for my site:

Resource : 

Type : Script   

Status : Googlebot blocked by robots.txt

My site runs on X-Cart, and my robots.txt contains:

User-agent: Googlebot
Disallow: /*printable=Y*
Disallow: /*js=*
Disallow: /*print_cat=*
Disallow: /*mode=add_vote*

User-agent: *
Allow: *.js
Allow: *.css
Allow: *.jpg
Allow: *.gif
Allow: *.png
Disallow: /admin/
Disallow: /catalog/
Disallow: /customer/
Disallow: /files/
Disallow: /include/

I tried changing it to:

User-Agent: Googlebot
Disallow: /*printable=Y*
Disallow: /*print_cat=*
Disallow: /*mode=add_vote*
Allow: .js

But no luck with the above code. Does anyone have a solution? How can I allow a third-party JS file to be crawled by Googlebot using my robots.txt?



If the .js file is on a third-party site that you have no control over, then no, there is no way to unblock it from your site. A given robots.txt file only controls crawling of files on the domain/subdomain that the robots.txt file was loaded from. To unblock this file, you would need to be able to change the robots.txt file on the third-party domain. (I'm assuming you can't do that here.)

The simplest way to get around this is to copy the JS file to your own server and link to it there.
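
For scripts that live on your own domain, also keep in mind that Googlebot obeys only the single most specific matching group in robots.txt. Because your file has a User-agent: Googlebot group, Googlebot follows that group alone and ignores the Allow: *.js line under User-agent: *. A sketch of a fix for that case (reusing the rules you already have) would be to add the Allow rule to the Googlebot group itself:

User-agent: Googlebot
Disallow: /*printable=Y*
Disallow: /*print_cat=*
Disallow: /*mode=add_vote*
Allow: /*.js$

Note the /*.js$ pattern: a value like Allow: .js, with no leading slash or wildcard, may not match any URL path at all, which could be why your earlier attempt had no effect.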