Rather than using a dedicated crawler to index news pages, Google announced that Googlebot will now crawl both web content and news content. However, this does not change the company’s News inclusion policies, so content marketers hoping to gain visibility in Google’s news aggregator will still need to maintain especially high content quality standards.
As the search giant explains, the integration of the two crawlers should have little impact on most publishers. Googlebot will perform the same job as the old News crawler, and websites can still block the spider from indexing news content. Sitemaps will still be crawled, and Analytics data should remain the same, still distinguishing traffic that comes from Google web searches from traffic that comes from Google News searches.
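For publishers who want to keep pages in web search but out of News, the blocking mentioned above is typically handled in robots.txt. A minimal sketch, assuming Google's documented "Googlebot-News" user-agent token (the exact directives a given site needs will vary):

```text
# Example robots.txt: opt out of Google News indexing
# while remaining available to regular web search.
User-agent: Googlebot-News
Disallow: /

# Regular web crawling is still permitted.
User-agent: Googlebot
Disallow:
```

Because the News user-agent token is still honored even though a single crawler now does the fetching, existing robots.txt rules of this kind should continue to work after the change.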
The only major difference is that site owners will see only Googlebot when they examine their server logs, which could make it difficult to determine whether the search giant is indexing news content or whether a page has been filtered. Additionally, if a business has sections that only members or paid subscribers can read, Google will not index the full content, crawling only what is available to all searchers.
“As with any website, from time to time we need to make updates to our infrastructure. At the same time, we want to continue to provide as much control as possible to news web sites,” David Smydra, product specialist at Google News, wrote in the company blog.
News content marketing is an increasingly popular online activity, and Google’s update to its crawling infrastructure should remind marketers that the company values news publishers, as do consumers. As Brafton reported, more than three-quarters of Americans (76 percent) look for news online, so publishing this kind of content on a business website may help generate new leads.