Google Algorithm Changes Canonical Declaration

Andrew,

In light of the thousands of other SEO issues with Dolphin, Google rolls out a new challenge for you. What are you preparing for Dolphin 7.x.x and 8.x.x with regard to the new algorithm?

 

Using the new canonical tag

Specify the canonical version using a tag in the head section of the page as follows:

<link rel="canonical" href="http://www.example.com/product.php?item=swedish-fish"/>

That’s it!

  • You can only use the tag on pages within a single site (subdomains and subfolders are fine).
  • You can use relative or absolute links, but the search engines recommend absolute links.

This tag will operate in a similar way to a 301 redirect for all URLs that display the page with this tag.

  • Links to all URLs will be consolidated to the one specified as canonical.
  • Search engines will consider this URL a “strong hint” as to the one to crawl and index.

Canonical URL best practices

The search engines use this as a hint, not a directive (Google calls it a “suggestion that we honor strongly”), but they are more likely to use it if the URLs follow best practices, such as:

  • The content rendered for each URL is very similar or identical
  • The canonical URL is the shortest version
  • The URL uses easy-to-understand parameter patterns (such as using ? and %)
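The "shortest version" practice above can be sketched in a few lines. This is a minimal illustration, not anything the search engines ship: given a set of URLs assumed to render the same content, pick the shortest as the canonical candidate (the example URLs and the `sessionid` parameter are hypothetical).

```python
def pick_canonical(urls):
    """Pick the shortest URL as the canonical candidate, mirroring
    the 'shortest version' best practice. All input URLs are assumed
    to render the same (or very similar) content."""
    return min(urls, key=len)

# Hypothetical duplicates of one product page.
duplicates = [
    "http://www.example.com/product.php?item=swedish-fish",
    "http://www.example.com/product.php?item=swedish-fish&sort=price",
    "http://www.example.com/product.php?item=swedish-fish&sessionid=1234",
]
print(pick_canonical(duplicates))
# prints http://www.example.com/product.php?item=swedish-fish
```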

Can this be abused by spammers? They might try, but Matt Cutts of Google told me that the same safeguards that prevent abuse by other methods (such as redirects) are in place here as well, and that Google reserves the right to take action on sites that use the tag to manipulate search engines and violate search engine guidelines.

For instance, this tag will only work with very similar or identical content, so you can’t use it to send all of the link value from the less important pages of your site to the more important ones.

If tags conflict (for example, when pages point to each other as canonical, when the URL specified as canonical redirects to a non-canonical version, or when the page specified as canonical doesn’t exist), search engines will sort things out just as they do now and determine which URL they think is the best canonical version.
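One of those conflicts, reciprocal canonical declarations, is easy to detect in your own templates before search engines have to guess. A small hypothetical checker (the URLs are made up for illustration):

```python
def find_canonical_conflicts(canonical_map):
    """Given a mapping {url: declared_canonical_url}, report pairs of
    pages that declare each other canonical -- one of the conflicts
    that search engines would otherwise have to resolve on their own."""
    conflicts = []
    for url, target in canonical_map.items():
        # Reciprocal declaration: url says target is canonical,
        # and target says url is canonical (url < target avoids
        # reporting the same pair twice).
        if canonical_map.get(target) == url and url < target:
            conflicts.append((url, target))
    return conflicts

declarations = {
    "http://www.example.com/a": "http://www.example.com/b",
    "http://www.example.com/b": "http://www.example.com/a",
    "http://www.example.com/c": "http://www.example.com/a",
}
print(find_canonical_conflicts(declarations))
# prints [('http://www.example.com/a', 'http://www.example.com/b')]
```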

The tag in action

This tag will most often be useful in the case of multiple URLs pointing at the same page, but might also be used when multiple versions of a page exist. For instance, wikia.com is using the tag for previous revisions of a page. Both http://watchmen.wikia.com/index.php?title=Comedian%27s_badge&diff=4901&oldid=4819 and http://watchmen.wikia.com/index.php?title=Comedian%27s_badge&diff=5401&oldid=4901 reference the latest version of the article (http://watchmen.wikia.com/wiki/Comedian%27s_badge) as the canonical.

The search engines stress that it’s still important to build good URL structure, and they note that if you aren’t able to implement this tag, they’ll keep the processes they have now for determining the canonical. At SMX West on Tuesday, Maile Ohye of Google explained how Google can detect patterns in URLs that use standard parameters, as with these URLs:

Maile explained that Google can detect (particularly when looking at patterns across the site) that the sort parameter may order the page differently, but that the URLs with the sort parameter display the same content as the shorter URL (http://www.example.com/buffy?cat=spike).
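The idea Maile describes, collapsing URL variants that differ only by a presentation parameter, can be sketched with the standard library. Treating `sort` as a presentation-only parameter is an assumption taken from her example; a real site would maintain its own list.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# "sort" is the presentation-only parameter from Maile Ohye's
# example; a real site would list its own such parameters.
PRESENTATION_PARAMS = {"sort"}

def strip_presentation_params(url):
    """Remove parameters that change only the ordering or display of
    a page, so URL variants collapse to one canonical candidate."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in PRESENTATION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(strip_presentation_params("http://www.example.com/buffy?cat=spike&sort=date"))
# prints http://www.example.com/buffy?cat=spike
```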

While it’s rare for the search engines to join forces, this isn’t the first time they’ve come together on a standard. In November 2006, they came together to support sitemaps.org. And in June 2008 they announced a standard set of robots.txt directives. Matt Cutts of Google and Nathan Buggia of Microsoft told me that they want to help reduce the clutter on the web, and make things easier for searchers as well as site owners.

This new tag won’t completely solve duplicate-content issues on the web, but it should make things quite a bit easier, particularly for e-commerce sites, which likely need all the help they can get in the current economic conditions. Site owners have been asking for help with these issues for a long time, so this should be a very welcome addition.

When a GIG is not enough --> Terabyte Dolphin Technical Support - Server Management and Support
Quote · 2 Dec 2011

This should be a help for Dolphin users organising content. It is good that Google and Microsoft are pushing this system.

Quote · 2 Dec 2011

Thanks for the info...:)

so much to do....
Quote · 2 Dec 2011

Hi Dos. Which page do I put this code on? The index.php page or the header.inc.php page? Also, where on the page does it need to go?


Quote · 27 Jun 2013

 

Hi Dos.

He no longer plays here. He was banished a while back.

You have mail.

ManOfTeal.COM a Proud UNA site, six years running strong!
Quote · 27 Jun 2013

oh! people revive old topics.

so much to do....
Quote · 27 Jun 2013
 
 