It is common for search marketers to become so consumed with Google that they forget about another reliable traffic source: Bing. With Bing and Yahoo combining for roughly 25 percent of search traffic, that index warrants closer attention. Let's look inside the Bing Webmaster Tools area and see if we can't improve our chances of appearing in some competitive searches.
Claiming a site in Bing Webmaster Tools is a pretty straightforward process. Bing provides two options to do so - uploading an XML file to your server or copying and pasting a meta tag into your site's default page. Once you have chosen an option and completed that process, simply click Verify.
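As a sketch of the meta-tag route, the tag goes in the head of the site's default page. Bing assigns the actual verification code, so the content value below is only a placeholder:

```html
<head>
  <!-- Bing site verification; the content value is a placeholder -
       Bing Webmaster Tools supplies the real code when you claim the site -->
  <meta name="msvalidate.01" content="0123456789ABCDEF0123456789ABCDEF" />
</head>
```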
Once a site has been verified, data starts to appear within a few days; in my experience, that takes between 12 and 72 hours for newly added sites.
Once Bing is able to show site status, the dashboard displays recent trends in the site's activity within Bing: information on pages crawled, pages indexed, and data on impressions and clicks.
Bing's Crawl Summary provides up to six months of crawl data, enabling webmasters to identify any problems Bing encountered while crawling the submitted and verified website.
This section of Bing Webmaster Tools also allows SEOs and webmasters to indicate to Bing which query parameters the crawler can ignore. Preventing duplicate content from appearing in the index, and ensuring that a page's index value is not split across several URL variations, is certainly an SEO best practice and one that should definitely be used if your site uses query parameters.
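To illustrate, a session or tracking parameter can create several URLs for what is really one page; these are exactly the parameters worth flagging. The URLs below are hypothetical:

```
http://www.example.com/widgets?sessionid=123
http://www.example.com/widgets?sessionid=456
http://www.example.com/widgets
```

With sessionid flagged as ignorable, all three variations consolidate to the last URL rather than competing with one another.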
Information provided by Bing also includes details about errors, redirects, malware and exclusions detected during the most recent crawl of the site. Much like Google Webmaster Central, SEOs and webmasters are able to add a sitemap to indicate to Bing how the submitted and verified site is structured. Should you have a large website (lots of pages), then submitting one or more sitemaps will help Bing find pages it was otherwise unable to locate.
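If you have never built one, a minimal sitemap follows the sitemaps.org protocol; the URL and date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <!-- lastmod and changefreq are optional hints -->
    <lastmod>2011-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```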
Once everything is successfully crawled and indexed, Bing gives users of the system the ability to see how the site is performing. Using the Index Explorer, users can see how many pages are within the index, filtering by HTTP codes, crawl date ranges and discovery ranges, and can see what, if anything, was designated as malware and what, if anything, was excluded by the robots.txt file.
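The exclusions reported here trace back to your robots.txt file, so it is worth knowing what that file looks like; a minimal sketch, with placeholder paths:

```
# Rules for all crawlers
User-agent: *
Disallow: /private/

# Rules can also target Bing's crawler specifically
User-agent: bingbot
Disallow: /search-results/
```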
Bing also provides the ability within Webmaster Tools to submit URLs that are not in the index (limits of 10 per day and 50 per month are currently in place). Users of the system can also block (and of course unblock) indexed URLs from appearing in the search results; the option to block both the URL and the cache, or just the cache, is also worth noting. Finally, Bing offers a way to track inbound link counts to a site and to individual pages within the site.
The Traffic summary option will be the most interesting to many. Once all the right pages are indexed and we have successfully removed all the barriers to a successful crawl, we can start reviewing traffic data and analyzing search query performance. Two reports are available - the impressions and clicks report and the queries report, which shows not just the impressions and clicks per query, but also the click-through rate, the average impression position and the average click position.
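The click-through rate in these reports is simply clicks divided by impressions, expressed as a percentage. A quick sketch of the arithmetic (the numbers are made up, not from a real report):

```python
def click_through_rate(clicks, impressions):
    """Percentage of impressions that resulted in a click."""
    if impressions == 0:
        return 0.0  # no impressions means no measurable CTR
    return 100.0 * clicks / impressions

# Hypothetical query: 1,000 impressions and 40 clicks
print(click_through_rate(40, 1000))  # 4.0 percent
```

A query with a high impression count but a low click-through rate is often a sign that the title or snippet needs work, even though the page is ranking.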
How does Bing Webmaster Tools compare to Google Webmaster Central? All in all, pretty well. While there is some room for improvement, for the amount of traffic that an average site can expect from Bing, setting up a site properly and correcting any problems Bing finds will increase the likelihood of receiving more exposure and activity within that index.