Yesterday at SMX West, Bing announced that it has raised the limit on URLs you can submit to its search engine by 1,000X, from 10 URLs per day to 10,000 URLs per day. Bing also said this is a fundamental shift in how search engines discover content, one that should reduce the crawling of web sites.
Major announcement today for the SEO world! Get your content indexed fast by now submitting up to 10,000 URLs per day to Bing https://t.co/XbwxaGfgVr Let’s move away from crawling to discover content change to tell us. Please adopt and save the world from global crawling warming!
— Fabrice Canel (@facan) January 31, 2019
Bing wrote “We believe that enabling this change will trigger a fundamental shift in the way that search engines, such as Bing, retrieve and are notified of new and updated content across the web. Instead of Bing monitoring often RSS and similar feeds or frequently crawling websites to check for new pages, discover content changes and/or new outbound links, websites will notify the Bing directly about relevant URLs changing on their website. This means that eventually search engines can reduce crawling frequency of sites to detect changes and refresh the indexed content.”
What? Did you read that?
Bing removed the public URL submission tool last year and had issues with the private one, so it had to add rate limits there. Now, Bing has expanded that limit from 10 URLs per day to 10,000 URLs per day. This is a big, big change.
But not everyone gets 10,000 URLs per day; the quota is based on the age of the verified site, among other factors. Bing wrote “The daily quota per site will be determined based on the site verified age in Bing Webmaster tool, site impressions and other signals that are available to Bing. Today the logic is as follows, and we will tweak this logic as needed based on usage and behavior we observe.”
To submit URLs, go to Bing Webmaster Tools and submit them manually:
Or you can use the API:
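If you want to script submissions instead of pasting URLs into the dashboard, here is a minimal sketch against Bing's batch submission endpoint (`SubmitUrlBatch`, authenticated with the API key from your Bing Webmaster Tools account). The site URL, pages, and key below are placeholders you would replace with your own; treat the exact endpoint shape as something to verify against Bing's current documentation before relying on it.

```python
import json
import urllib.request

# Placeholders -- swap in your own verified site and the API key
# generated in Bing Webmaster Tools.
API_KEY = "YOUR_API_KEY"
SITE_URL = "https://www.example.com"

ENDPOINT = "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrlBatch"


def build_batch_payload(site_url, urls):
    """Build the JSON body the SubmitUrlBatch endpoint expects:
    the verified site plus the list of URLs to submit."""
    return json.dumps({"siteUrl": site_url, "urlList": list(urls)})


def submit_urls(api_key, site_url, urls):
    """POST a batch of URLs (up to your site's daily quota) to Bing.
    Returns the HTTP status code; 200 means the batch was accepted."""
    req = urllib.request.Request(
        f"{ENDPOINT}?apikey={api_key}",
        data=build_batch_payload(site_url, urls).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    pages = [f"{SITE_URL}/page-{i}" for i in range(1, 4)]
    print(build_batch_payload(SITE_URL, pages))
    # submit_urls(API_KEY, SITE_URL, pages)  # uncomment with a real key
```

Note that nothing in the sketch enforces the quota; if you push more URLs than your site's daily allowance, the API side is what rejects the overflow.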
I still find it weird that Bing thinks it can slowly do away with crawling. Maybe I am reading too much into this?
Forum discussion at Twitter.