The Expert SEO Guide To URL Parameter Handling

Pros:

Relatively easy technical implementation.
Very likely to safeguard against duplicate content issues.
Consolidates ranking signals to the canonical URL.

Cons:

Wastes crawling on parameter pages.
Not suitable for all parameter types.
Interpreted by search engines as a strong hint, not a directive.

Meta Robots Noindex Tag

Set a noindex directive for any parameter-based page that doesn’t add SEO value. This tag will prevent search engines from indexing the page.

URLs with a “noindex” tag are also likely to be crawled less frequently, and if the tag is present for a long time, it will eventually lead Google to nofollow the page’s links.
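For example, a parameter-based page you want kept out of the index would include the following tag in its <head> (the same rule can also be sent as an X-Robots-Tag HTTP header for non-HTML files):

<meta name="robots" content="noindex">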

Pros:

Relatively easy technical implementation.
Very likely to safeguard against duplicate content issues.
Suitable for all parameter types you do not wish to be indexed.
Removes existing parameter-based URLs from the index.

Cons:

Won’t prevent search engines from crawling URLs, but will encourage them to do so less frequently.
Doesn’t consolidate ranking signals.
Interpreted by search engines as a strong hint, not a directive.

Robots.txt Disallow

The robots.txt file is the first thing search engines check before crawling your site. If a URL is disallowed there, compliant crawlers won’t request it at all.

You can use this file to block crawler access to every parameter-based URL (with Disallow: /*?*) or only to specific query strings you don’t want search engines to crawl.
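For example, a minimal robots.txt sketch that blocks every parameterized URL for all crawlers (the sort= parameter name below is purely illustrative):

User-agent: *
Disallow: /*?*

Or, to block only a specific parameter whether it appears first or later in the query string:

User-agent: *
Disallow: /*?sort=
Disallow: /*&sort=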

Pros:

Simple technical implementation.
Allows more efficient crawling.
Avoids duplicate content issues.
Suitable for all parameter types you do not wish to be crawled.

Cons:

Doesn’t consolidate ranking signals.
Doesn’t remove existing URLs from the index.

Move From Dynamic To Static URLs

Many people think the optimal way to handle URL parameters is to simply avoid them in the first place.

After all, subfolders do more than parameters to help Google understand site structure, and static, keyword-based URLs have always been a cornerstone of on-page SEO.

To achieve this, you can use server-side URL rewrites to convert parameters into subfolder URLs.

For example, the URL:

www.example.com/view-product?id=482794

Would become:

www.example.com/widgets/purple
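Behind the scenes, a server-side rewrite rule maps the static path back onto the dynamic script. A minimal Apache mod_rewrite sketch, assuming the application can look products up by a slug parameter rather than the numeric id (the rule, path, and slug parameter are all illustrative):

RewriteEngine On
RewriteRule ^widgets/([a-z0-9-]+)/?$ /view-product?slug=$1 [L,QSA]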

This approach works well for descriptive keyword-based parameters, such as those that identify categories, products, or filters for search engine-relevant attributes. It is also effective for translated content.

But it becomes problematic for non-keyword-relevant elements of faceted navigation, such as an exact price. Having such a filter as a static, indexable URL offers no SEO value.

It’s also an issue for search parameters, as every user-generated query would create a static page that vies for ranking against the canonical. Worse, it could present crawlers with low-quality content pages whenever a user searches for an item you don’t offer.

It’s somewhat odd when applied to pagination (although not uncommon due to WordPress), which would give a URL such as

www.example.com/widgets/purple/page2

Very odd for reordering, which would give a URL such as

www.example.com/widgets/purple/lowest-price

And it is often not a viable option for tracking. Google Analytics will not recognize a static version of a UTM parameter.
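For instance, a tagged URL such as

www.example.com/sale?utm_source=newsletter&utm_medium=email

would have to become something like

www.example.com/sale/newsletter/email

and Google Analytics would no longer pick up the campaign data from those path segments.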

More to the point: Replacing dynamic parameters with static URLs for things like pagination, on-site search box results, or sorting does not address duplicate content, crawl budget, or internal link equity dilution.

Having every combination of filters from your faceted navigation as indexable URLs often results in thin content issues, especially if you offer multi-select filters.

Many SEO pros argue it’s possible to provide the same user experience without impacting the URL, for example by using POST rather than GET requests to modify the page content. This preserves the user experience and avoids SEO problems.
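A minimal client-side sketch of that approach, assuming a hypothetical /api/filter-products endpoint that returns the re-rendered product grid as HTML (every name here is illustrative):

// Send the selected facets in a POST body instead of the query string,
// so the visible URL never changes and no parameter-based URL is created.
async function applyFilter(filters: Record<string, string>): Promise<void> {
  const response = await fetch("/api/filter-products", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(filters),
  });
  // Swap the updated results into the page in place.
  document.querySelector("#product-grid")!.innerHTML = await response.text();
}

// Example: narrow the listing to purple widgets without generating a new URL.
applyFilter({ color: "purple" });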

But stripping out parameters in this manner would remove the possibility for your audience to bookmark or share a link to that specific page – and is obviously not feasible for tracking parameters and not optimal for pagination.

The crux of the matter is that for many websites, completely avoiding parameters is simply not possible if you want to provide the ideal user experience. Nor would it be best practice SEO.

So we are left with this: for parameters that you don’t want to be indexed in search results (pagination, reordering, tracking, etc.), implement them as query strings. For parameters that you do want to be indexed, use static URL paths.
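For example, a single hypothetical category page could combine both approaches:

www.example.com/widgets/purple?sort=lowest-price&page=2

Here, /widgets/purple is the static, indexable path, while sorting and pagination stay in query strings.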

Pros:

Shifts crawler focus from parameter-based URLs to static URLs, which have a higher likelihood of ranking.

Cons:

Significant investment of development time for URL rewrites and 301 redirects.
Doesn’t prevent duplicate content issues.
Doesn’t consolidate ranking signals.
Not suitable for all parameter types.
May lead to thin content issues.
Doesn’t always provide a linkable or bookmarkable URL.

Best Practices For URL Parameter Handling For SEO

So which of these six SEO tactics should you implement?

The answer can’t be all of them.

Not only would that create unnecessary complexity, but often, the SEO solutions actively conflict with one another.

For example, if you implement robots.txt disallow, Google would not be able to see any meta noindex tags. You also shouldn’t combine a meta noindex tag with a rel=canonical link attribute.

Google’s John Mueller, Gary Illyes, and Lizzi Sassman couldn’t even decide on an approach. In a Search Off The Record episode, they discussed the challenges that parameters present for crawling.

They even suggested bringing back a parameter handling tool in Google Search Console. Google, if you are reading this, please do bring it back!

What becomes clear is there isn’t one perfect solution. There are occasions when crawling efficiency is more important than consolidating authority signals.

Ultimately, what’s right for your website will depend on your priorities.

Personally, I take the following plan of attack for SEO-friendly parameter handling:

Research user intents to understand which parameters should be search engine friendly, static URLs.
Implement effective pagination handling using a ?page= parameter.
For all remaining parameter-based URLs, block crawling with a robots.txt disallow and add a noindex tag as backup (a combined robots.txt sketch follows this list).
Double-check that no parameter-based URLs are being submitted in the XML sitemap.
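As a rough illustration of steps 2 and 3, a combined robots.txt sketch, assuming ?page= is always the first (or only) parameter on paginated URLs; test any combined rules before deploying them:

User-agent: *
Allow: /*?page=
Disallow: /*?*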

No matter what parameter handling strategy you choose to implement, be sure to document the impact of your efforts on KPIs.

