
Bing Updating Bingbot To Make A Turnaround In Seo Community

Bing is working hard on its crawler, making sure that it does not miss any new content while also not overloading web servers.

Fabrice Canel, principal program manager for Bing Webmaster Tools, provided an upbeat update on Bing's web crawler, BingBot, noting that the team is focused on improving its efficiency.

RESPONSE TO USER FEEDBACK

This update was a follow-up to his SMX Advanced talk in June this year, where he announced an 18-month effort to improve BingBot and asked the audience to submit suggestions and feedback.

Canel thanked the SMX audience for its contributions and mentioned that the team has made numerous improvements based on that feedback. On the Bing Webmaster blog, Canel said the team will continue to improve the crawler and share its progress in a new "BingBot" series.

GOAL OF BINGBOT

In the first post, Canel outlined the goal of BingBot: an algorithm determines which sites to crawl, how often to crawl them, and how many pages to fetch from each site. The aim is to keep the content in Bing's index as fresh as possible while limiting the crawler's footprint on each site, so that the site's servers are not overloaded.

This balance is what Bing calls "crawl efficiency," and striking it at scale is, Canel says, a work in progress. The team has heard both kinds of complaints: that BingBot does not crawl frequently enough, leaving content stale in the index, and that it crawls too often, straining a website's resources.
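Site owners who feel BingBot crawls too aggressively do not have to wait for these improvements: Bing's webmaster guidelines document support for the Crawl-delay directive in robots.txt. A minimal sketch (the delay value here is purely illustrative):

```
# robots.txt — illustrative example
User-agent: bingbot
Crawl-delay: 5
```

This asks Bing's crawler to pause roughly five seconds between requests. Bing Webmaster Tools also offers a Crawl Control feature for setting hourly crawl-rate patterns, which gives finer control than a single delay value.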

WHY SHOULD ANYONE CARE?

Bing is seriously trying to listen to the SEO community and to webmasters. The Webmaster Tools team is making changes to improve crawl efficiency: ensuring the crawler does not overload a site's servers while also making it faster and more effective at finding new content. Canel added that the team is actively working on these issues and will continue until the improvements land.

HOW DOES ANY OF THIS IMPACT ANYONE?

Simply put, if your content can't be seen, it can't be ranked. If you add new content to your website and Bing cannot see it, Bing will not rank it, which means searchers using Bing won't be able to find that new content either.

Google's search engine commands enormous mindshare with the public at large. Microsoft, on the other hand, continues to toil away at its own search engine, which not only prevents a monopoly in the search market but also supports the company's other products that rely on Bing.

Microsoft is trying hard to improve its web crawler's efficiency, balancing a fresh index of new content against the load placed on site servers. As far as consumer interest goes, Bing has long played second fiddle to Google, but Microsoft hopes the updated BingBot will change that and bolster its side of the relevancy argument between the two search engines.
