Google’s John Mueller Q&A: 4 SEO Questions Answered
Google’s John Mueller answers four quick questions about common technical SEO issues that almost everyone encounters at one point or another.
Mueller responds to reader-submitted questions about:
- Blocking CSS files
- Updating sitemaps
- Re-uploading a site to the web
- Googlebot crawl budget
These questions are answered in the latest installment of the Ask Googlebot video series on YouTube.
Traditionally, these videos focus on answering a specific question in as much detail as Google is able to provide.
However, not all SEO questions require an entire video to answer. Some can be answered in one or two sentences.
Here are some quick answers to questions that are often asked by people new to SEO.
Can blocking CSS files in robots.txt affect rankings?
Yes, blocking CSS can cause problems, and Mueller says you should avoid doing that.
When CSS is blocked in the robots.txt file, Googlebot is unable to render a page the way visitors see it.
Being able to see a page completely helps Google better understand it and confirm that it is mobile-friendly.
All of this contributes to the ranking of a web page in the search results.
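To illustrate, the kind of robots.txt rule that causes this problem, and a safer alternative, might look like the sketch below. The paths are hypothetical; adjust them to your own site.

```
# robots.txt - a minimal sketch, not a drop-in file
User-agent: *

# Rules like these prevent Googlebot from rendering pages:
# Disallow: /assets/css/
# Disallow: /*.css$

# Instead, keep rendering resources crawlable:
Allow: /*.css$
Allow: /*.js$
```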
How do I update the sitemap for my website?
There isn’t a single solution for updating sitemaps that works on every website, Mueller says.
However, most website setups have their own built-in solutions.
Consult your site’s help guides for a sitemap setting or for a compatible plug-in that creates sitemap files.
Usually all you need to do is activate a setting and you are good to go.
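For reference, the sitemap file such a setting generates is just an XML list of your URLs. A minimal sketch, with hypothetical example.com URLs:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-11-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
    <lastmod>2021-10-15</lastmod>
  </url>
</urlset>
```

Most plugins regenerate this file automatically as content changes, and many also reference it from robots.txt with a `Sitemap:` line.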
What’s the right way to get a site back on Google?
It is not possible to reset a website’s indexing by deleting its files and uploading them again.
Google will automatically focus on the latest version of a site and remove the old version over time.
You can speed up this process by using redirects from all old URLs to the new ones.
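How you set up those redirects depends on your server. As one hedged example, on an Apache server the .htaccess rules below issue 301 (permanent) redirects; the paths and domain are hypothetical.

```
# Redirect a single old URL to its new location
Redirect 301 /old-page.html https://www.example.com/new-page/

# Redirect an entire old directory to a new one
RedirectMatch 301 ^/old-blog/(.*)$ https://www.example.com/blog/$1
```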
Would removing RSS feeds improve Googlebot’s crawling?
One person writes to Mueller saying that 25% of Googlebot’s crawl budget goes to RSS feed URLs linked at the top of every page.
They ask if removing RSS feeds would improve crawling.
Mueller says RSS feeds aren’t a problem, and Google’s systems automatically balance a website’s crawl.
Sometimes Google crawls certain pages more often, but those pages will not be crawled again until Googlebot has seen all of the important pages at least once.
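For context, the feed URLs in question are typically the auto-discovery links sites place in the head of every page, like this hypothetical example:

```
<!-- RSS auto-discovery link in a page's <head> -->
<link rel="alternate" type="application/rss+xml"
      title="Example Site Feed"
      href="https://www.example.com/feed/">
```

Per Mueller, links like these do not need to be removed for crawl-budget reasons.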
Featured image: screenshot from YouTube.com/GoogleSearchCentral