WordPress site using Yoast SEO being blocked from indexing by robots.txt

Morning all! I've just built a website for a client, using WordPress and the Yoast SEO plugin (free version).

https://www.fly2help.org/

Whilst they're really happy with the website, I've noticed an issue when trying to submit it to search engines, such as Google, for indexing. It is continually being blocked, with an error like this:

Crawl allowed?
error
No: blocked by robots.txt

I'm not 100% sure why this is the case. I've checked the robots.txt file and there's nothing in it that suggests it would block bots. I've also been through the WordPress installation to make sure no settings are actively blocking search engine indexing; in particular, the "Discourage search engines from indexing this site" option under Settings → Reading is unticked.
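
For reference, here is roughly how I've been double-checking what crawlers are actually told (a minimal sketch using Python's standard-library robotparser; the site URL is real, everything else is just illustrative):

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the live robots.txt, then ask whether Googlebot
    # is allowed to crawl the homepage.
    parser = RobotFileParser("https://www.fly2help.org/robots.txt")
    parser.read()

    print(parser.can_fetch("Googlebot", "https://www.fly2help.org/"))

If the robots.txt really is as permissive as it looks, this should print True.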

Is there anything you can suggest as to why this might be happening, and what I can do to resolve it? I'd of course like to be able to submit the website (sitemap and URLs) for indexing, and I have never come across this issue before. I have also followed the advice in the following StackExchange questions, but to no avail (one extra diagnostic they prompted me to try is sketched after the links):

Why does Google Webmaster Tools say my whole site is blocked by robots.txt when I have no robots.txt?

Google saying is blocked by robots.txt without robots.txt on my website
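
Since those answers suggest the robots.txt Google sees may differ from the one I see, I've also been comparing what the server returns under different user agents, in case a firewall or CDN rule is serving Googlebot something else. A rough sketch, assuming the third-party requests library is installed; the user-agent strings are only examples:

    import requests

    ROBOTS_URL = "https://www.fly2help.org/robots.txt"

    # A CDN or firewall rule can serve a different body or status code
    # depending on the user agent, so compare a browser with Googlebot.
    user_agents = {
        "browser": "Mozilla/5.0",
        "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                     "+http://www.google.com/bot.html)",
    }

    for name, ua in user_agents.items():
        resp = requests.get(ROBOTS_URL, headers={"User-Agent": ua}, timeout=10)
        print(f"--- {name}: HTTP {resp.status_code}")
        print(resp.text)

If the two responses differ, I assume that would point at the server or CDN rather than at WordPress or Yoast.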

I am happy to provide additional information if that will help.

Thanks in advance!

Tags: plugin-wp-seo-yoast, google-xml-sitemaps, search-engines, robots.txt, wordpress
