How to fix: ‘noindex’ detected in ‘X-Robots-Tag’ HTTP header

Recently, I added my blog to Google Search Console, and one of the pages received the error “Submitted URL marked ‘noindex’”. Since all the pages were created and submitted in the same way, it was odd that only this one page carried a “noindex” tag.

So I looked for a way to fix the noindex problem. As far as I know, the default is always “index” and “follow”, so nothing should need to be set explicitly. Yet almost all of the search results explain how to set up “noindex” and “nofollow”, not how to remove them.
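Before changing anything in WordPress, it is worth confirming what the server actually sends. This is a minimal Python sketch (the URL below is a placeholder, not my blog) that looks for “noindex” in a response's X-Robots-Tag header:

```python
from urllib.request import Request, urlopen


def noindex_in_headers(headers):
    """Return True if the X-Robots-Tag header value contains 'noindex'."""
    value = headers.get("X-Robots-Tag", "") or ""
    return "noindex" in value.lower()


# To check a live page, uncomment and replace the placeholder URL:
# resp = urlopen(Request("https://example.com/my-post/", method="HEAD"))
# print(noindex_in_headers(dict(resp.headers)))
```

If the function returns True for your page, the noindex really is coming from the HTTP header, not from a meta tag in the page body.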

Some suggest installing an SEO WordPress plugin; such a plugin can toggle “index” and “follow” on each post. But only one page has this noindex problem, and I didn’t want to install a whole plugin just to fix one page.

Others suggest toggling the “Search engine visibility” setting in the WordPress Settings → Reading panel off and on. But wouldn’t doing that risk giving more pages an indexing problem?

I tried the “Inspect URL” tool in Google Search Console a few more times, and the result was always the same: ‘noindex’ detected in ‘X-Robots-Tag’. What if I duplicated the content and submitted it as another page? Or what if I changed the page status from “published” to “draft”, and then back to “published”?

The second idea works! The Inspect URL tool can now fetch the page in real time, and Search Console is validating the coverage fix; the system message says this validation can take a few days. The live result shows “URL is on Google”, “Coverage: Submitted and indexed”, and “Mobile Usability: Page is mobile friendly”.
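For what it’s worth, the draft-and-republish toggle that fixed the page can also be scripted through the WordPress REST API. This is only a sketch: the site URL, post ID, and application password below are placeholders, and the live calls are left commented out.

```python
import base64
import json
from urllib.request import Request, urlopen

SITE = "https://example.com"  # placeholder: your WordPress site URL
POST_ID = 123                 # placeholder: ID of the affected post
# Placeholder credentials; WordPress "application passwords" allow Basic auth
AUTH = base64.b64encode(b"user:app-password").decode()


def set_status(status):
    """Build the REST API request that changes a post's status."""
    body = json.dumps({"status": status}).encode()
    return Request(
        f"{SITE}/wp-json/wp/v2/posts/{POST_ID}",
        data=body,
        headers={
            "Authorization": f"Basic {AUTH}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Unpublish, then publish again (uncomment to run against a live site):
# for status in ("draft", "publish"):
#     urlopen(set_status(status))
```

After republishing, re-run Inspect URL in Search Console to confirm the header is gone.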