As a white label website reseller, we understand that digital agencies want to align with SEO best practices so they can deliver a quality service to their clients. Different search engines use different ranking algorithms, meaning that your client's content will rank differently in each engine's results. Bing holds the second-largest search engine market share, behind only Google. Recently, Microsoft shared two easy-to-implement SEO steps intended to help search engines index subscription-based and paywalled content. Used correctly, they should encourage search engines to send more visitors to a website without compromising the publisher's economic model. As a digital agency, you can use this information to drive traffic for your clients and deliver exceptional results.
These steps help your client's website detect that the crawler requesting its subscription-based or paywalled content is BingBot (Microsoft's web-crawling robot, which collects documents from around the web to build Bing's searchable index). Once BingBot is detected, the website should grant it full access to that content. Put simply, the website reveals the full content to BingBot while showing ordinary users only snippets until they have paid for access. Revealing the content to BingBot reassures Bing that the content is safe, permitted, and not a form of cloaking (presenting the search engine's crawler with content that differs from what users see).
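The serving logic described above can be sketched as a simple decision: full content for a verified BingBot or a paying subscriber, a snippet for everyone else. This is an illustrative sketch, not production code; the function and parameter names are our own, and it assumes the site has already verified the crawler (as described in Step 1 below) and knows the visitor's subscription status.

```python
# Sketch: decide what to serve, assuming the site can already tell whether a
# request comes from a verified BingBot and whether the visitor has paid.
# All names here are illustrative.

SNIPPET_LENGTH = 200  # illustrative cut-off for the free preview


def content_for_request(full_text: str, is_verified_bingbot: bool,
                        is_paying_user: bool) -> str:
    """Return the full article for BingBot and subscribers, a snippet otherwise."""
    if is_verified_bingbot or is_paying_user:
        return full_text  # full content: indexable by Bing, or paid access
    return full_text[:SNIPPET_LENGTH]  # teaser shown to non-subscribers
```

In practice this check would sit in the web server or application middleware, before the page template is rendered.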
Step 1: Enable Paywall Or Subscription-Based Content To Be Crawled
First, search engines like Bing need to be allowed to see the full content that users find behind a subscription or paywall. For search engines to be able to index more text that would match up with customer queries, they need to have access to full content.
Most publishers want to be sure that a crawler claiming to be a search engine crawler really is one. Bing offers publishers a simple solution: BingBot operates only from within a limited set of IP ranges. By checking the crawler's IP address against the publicly available list of BingBot IP addresses, publishers can easily confirm that it is truly BingBot requesting access. It's important to re-check that list regularly, since the IP ranges can change from time to time.
Step 2: Be Sure Not To Leak Paywall Or Subscription-Based Content in Search Results Cache Pages
Publishers should also ensure that search engines don't expose or leak paywalled or subscription-based content through the search engine's cached pages. Using one of two methods suggested by Microsoft, publishers can control whether search engines show a cached copy of any document.
- Robots Meta Tag
A special robots meta tag can be placed in the <head> section of any page that shouldn’t be cached:
<meta name="robots" content="noarchive">
or
<meta name="robots" content="nocache">
- X-Robots Tag
A custom HTTP response header can do the same job as a robots meta tag. For non-standard web pages like Microsoft Office documents or PDFs, setting an HTTP response header is the only way to prevent caching:
X-Robots-Tag: noarchive
or
X-Robots-Tag: nocache
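As a sketch of how the header method might be wired up, the function below builds response headers and attaches `X-Robots-Tag: noarchive` for file types that cannot carry a robots meta tag. The function name, the extension list, and the placeholder content type are our own illustrative choices; in a real deployment this logic would live in the web server configuration or framework middleware.

```python
# Sketch: attach X-Robots-Tag for documents that cannot carry a robots meta
# tag (PDFs, Office files). Names and the extension list are illustrative.

NON_HTML_EXTENSIONS = (".pdf", ".doc", ".docx", ".xls", ".xlsx", ".ppt", ".pptx")


def response_headers(path: str) -> dict:
    """Build response headers, adding X-Robots-Tag for non-HTML documents."""
    headers = {"Content-Type": "application/octet-stream"}  # placeholder type
    if path.lower().endswith(NON_HTML_EXTENSIONS):
        # Prevent search engines from showing a cached copy of this document.
        headers["X-Robots-Tag"] = "noarchive"
    return headers
```

Regular HTML pages can use either the meta tag or the header; for PDFs and Office documents, the header is the only option.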
Are you looking for an outsourced web design company that can help your clients' paywalled or subscription-based content rank higher using SEO best practices? Get in touch with support@globitalmarketing.com, and we will be happy to help you out!