Should the /wp-includes/js/ folder be blocked in the robots.txt file for a WordPress website?
On my WordPress website, I am disallowing the /wp-includes/js/ folder in the robots.txt file. Is that bad?
I use SEMRush to audit my site, and it reports an error saying that a resource at the location above is being blocked.
This is my robots.txt:
User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /linkout/
Disallow: /recommended/
Disallow: /comments/feed/
Disallow: /trackback/
Disallow: /index.php
Disallow: /xmlrpc.php
Disallow: /readme.html
Disallow: /refer/
Disallow: /search/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/post-sitemap.xml
Sitemap: https://www.example.com/page-sitemap.xml
Sitemap: https://www.example.com/video_portal-sitemap.xml
Sitemap: https://www.example.com/press_release-sitemap.xml
Sitemap: https://www.example.com/careers-sitemap.xml
Sitemap: https://www.example.com/emplois-sitemap.xml
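From what I understand, a more specific Allow rule should take precedence over the broader Disallow: /wp-includes/ for crawlers that support Allow, so one option I was considering is something like the sketch below (the /wp-includes/js/ and /wp-includes/css/ paths are just the standard WordPress core asset folders; I have not tested this yet):

User-agent: *
Disallow: /wp-includes/
Allow: /wp-includes/js/
Allow: /wp-includes/css/

Would something like that satisfy the SEMRush audit, or is it better to stop blocking /wp-includes/ altogether?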