Joe Meloni

In a recent Google Webmaster Help video, Matt Cutts urged marketers to ensure that all user-facing web pages can be crawled and indexed by Googlebot. Cutts’ statement came in response to a question regarding whether slower loading pages have a negative SEO impact on a website.

As he has said in the past, Cutts affirmed that site speed affects very few queries, influencing results for as little as one in every 100 searches. Some pages within a website may contain larger elements that take longer to load, which can negatively impact search standing. However, this happens so infrequently that Cutts advises marketers to let all pages be indexed rather than block slower ones.

“As long as your browser isn’t timing out or being flaky, you should be in good shape,” Cutts said in the video. “Typically, as long as just a few pages are slow or the site overall is fast, you should be OK.”

Page speed issues aside, Cutts suggested that allowing Googlebot to crawl every page on a website will help a company improve its search standing. In some cases, a slower page may offer the most relevant information for a key search term, and a search marketing campaign could suffer if a business blocks that page from being crawled.
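For context on the kind of blocking Cutts cautions against: the usual mechanism for keeping Googlebot away from a page is a Disallow rule in the site's robots.txt file. The sketch below uses a hypothetical path purely for illustration; the article's point is that rules like this are rarely worth adding just because a page loads slowly.

```
# Hypothetical robots.txt entry blocking one slow-loading page from Googlebot.
# Removing the Disallow line (or omitting the rule entirely) lets the page
# be crawled and indexed normally, which is what Cutts recommends.
User-agent: Googlebot
Disallow: /slow-report-page.html
```

Leaving robots.txt permissive keeps even slower pages eligible to rank when they hold the most relevant content for a query.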

The main reason site speed is a ranking signal for Google is the company's overall focus on user experience. Brafton recently reported that, according to Cutts, Google is likely to adjust its search algorithm to focus even more on quality content, another key element of user experience.