In our business, everyone recites the familiar refrain that duplicate content should be avoided at all costs. The argument is that duplicate content can harm a website’s search engine rankings, which is true to a degree, but there is a common misconception that Google will de-list websites found to have duplicate content. This just isn’t true.

Google does not punish website owners for publishing syndicated content or for syndicating content to others.

When it comes to duplicating your own content within your own site, there can be negative effects, but syndicating your content may simply result in losing your position to another website, which is not the end of the world.

Here are some of the ways that Google deals with duplicate content:

Republishing articles that have been syndicated by other websites:

When you take syndicated articles and publish them on your website, Google will see two closely matching pages on two separate domain names. One of the two articles may become more relevant in search results than the other, depending on the search term used to find the page and what surrounds the article on the site.

Google will decide, based on a wide variety of factors, which article is the most relevant one. In most cases, Google will show the original, unless your copy is more relevant to the searcher’s needs.

Having articles republished by other websites:

Google will ultimately make the call about which page to show.

The fact is, Google will not punish you for syndicating your content, because doing so would undermine the entire premise of press releases. Press releases are written for the sole purpose of being republished, either in part or in full.

Having multiple copies of content on the same site:

Sometimes a website will have multiple URLs for the same content. If Google indexes both pages, it will most likely show only the more relevant one.
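
Where the duplication comes from predictable URL variations, one common remedy is to normalize every address to a single preferred form and redirect the rest to it. Below is a minimal Python sketch of that idea using only the standard library; the specific normalization rules (stripping "www.", trailing slashes, and query strings) are assumptions to adapt to your own URL scheme:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str) -> str:
    """Collapse common duplicate-URL variants into one preferred form."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    # Treat "www.example.com" and "example.com" as the same site.
    if host.startswith("www."):
        host = host[len("www."):]
    # Drop trailing slashes and query strings, which often create
    # several URLs for a single page.
    path = parts.path.rstrip("/") or "/"
    return urlunsplit(("https", host, path, "", ""))

# Both variants collapse to https://example.com/pricing
print(canonical_url("http://www.example.com/pricing/?utm_source=mail"))
print(canonical_url("https://example.com/pricing"))
```

Redirecting the non-preferred variants to the canonical address keeps Google from indexing the same page twice.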

Duplicate Content and SEO

The concern about duplicate content stems from the fact that it may cause Google to skip crawling some pages of a website. If there is an inordinate amount of duplication on your site, Google may deem crawling it a waste of time, and your rankings could suffer.

The other concern among marketing experts is that other sites running your syndicated content may rank higher than you do for those articles if they have stronger domain authority.

The best way to keep everything in check is to ensure that all of your pages have enough differentiated text to avoid accidental duplicate content.
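
One rough way to verify that is to compare pages as sets of overlapping three-word sequences ("shingles") and compute their Jaccard similarity. The Python sketch below uses that standard technique; the score you decide to act on is a judgment call, not a Google-published number:

```python
def shingles(text: str, k: int = 3) -> set:
    """Break text into overlapping k-word sequences ('shingles')."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity: shared shingles over all distinct shingles."""
    sa, sb = shingles(a), shingles(b)
    if not (sa or sb):
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "Our widget ships worldwide and is backed by a lifetime warranty."
page_b = "Our widget ships worldwide and is covered by a lifetime warranty."
print(f"{similarity(page_a, page_b):.2f}")  # high score = near-duplicate
```

Pairs of pages that score close to 1.0 are good candidates for rewriting or consolidation.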

The only time duplicate content can have a measurable negative effect on a website's success is when a business creates ten websites, on ten domain names, featuring the same content across the board.

Google will soon tag the sites as spam. It is better to put real effort into optimizing one domain properly, instead of trying to gain authority from ten.

Google’s algorithm, the ever-changing formula that determines how websites are ranked and the lifeblood of every successful search engine campaign, is changing once again. As a result, website owners need to be ready to make adjustments or risk being left out in the cold.

Like previous algorithm changes, Google’s newest push can result in high-ranking sites being downgraded and sites that no longer meet certain criteria being dropped from the index altogether. In fact, recent reports state that this algorithm change will be the most significant in Google’s history.

Here are the steps necessary to ensure that your website, and your ongoing search engine optimization (SEO) campaign, is ready for the big change:

1. Reconsider Keywords: Think Synonyms

Google has announced that it is loosening its dependence on exact keyword matching to include comparable “like words.” As a result, webmasters should reevaluate their copy and brand message. Consider using similar words in context and abandoning the over-reliance on repeating the same keywords throughout a page.
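
As a quick self-audit, you can count how usage is spread across a phrase and its variants. The phrase list in the Python sketch below is hypothetical; in practice it would come from your own brand vocabulary:

```python
import re
from collections import Counter

# Hypothetical target phrase plus hand-picked synonyms for a shoe retailer.
VARIANTS = ["running shoes", "running sneakers", "trainers", "jogging shoes"]

def variant_counts(copy: str) -> Counter:
    """Count how often each phrasing appears in the page copy."""
    text = copy.lower()
    return Counter({v: len(re.findall(re.escape(v), text)) for v in VARIANTS})

copy = ("Our running shoes are light. These trainers breathe well, "
        "and our jogging shoes suit any distance.")
print(variant_counts(copy))
```

Copy where one exact phrase dominates every mention is exactly the over-dependency this change targets.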

2. Concentrate on Your FAQ Page

Under its new algorithm, Google will place increased relevance on definitions and direct answers in an effort to return highly authoritative responses to end-users based on their queries. Website owners can leverage this by beefing up their FAQ pages and answering direct questions related to their products, services, company and industry. This will help your pages get indexed for a wider range of keywords.
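
One concrete way to expose question-and-answer content in machine-readable form is schema.org FAQ markup, which search engines can read alongside the visible page. The Python sketch below generates it; the question and answer are placeholders, and treating this markup as the right fit for your pages is an assumption:

```python
import json

def faq_jsonld(pairs: list[tuple[str, str]]) -> str:
    """Render question/answer pairs as schema.org FAQPage JSON-LD."""
    data = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }
    return json.dumps(data, indent=2)

# Placeholder pair for illustration; embed the output in a <script> tag.
print(faq_jsonld([("Do you ship internationally?",
                   "Yes, we ship to over 40 countries.")]))
```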

3. Educate Your Users

Focus less on selling and more on educating your users about the product or service that you offer. This approach reinforces steps one and two.

Many aspects of Google’s planned algorithm change are being implemented with the human user in mind, which means that the heart of the searcher’s interest will more readily and organically rise to the top of the results pages.

Shady SEO tactics can result in severe penalties for a website. Punishments can range from losing organic search engine traffic for a few days to losing it permanently.

While many “best practice” search engine optimization (SEO) techniques exist, there are a number of “worst practice” techniques that website owners must avoid to stay above board with the search engine giants.

Link Buying

Quite simply, link buying is the practice of purchasing links in an attempt to make a site seem more authoritative.

A common element of most search engine ranking algorithms is link popularity. To the search engines, the number of links that point to a domain is a measure of its authority and trustworthiness.
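
The intuition can be seen in a toy version of the classic PageRank calculation, where each page repeatedly passes a share of its score to the pages it links to. The three-page graph and iteration count in the Python sketch below are arbitrary choices for illustration:

```python
# Toy link graph: each page lists the pages it links to (hypothetical).
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}

def pagerank(links: dict, damping: float = 0.85, rounds: int = 50) -> dict:
    """Iteratively share each page's score across its outbound links."""
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(rounds):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            for out in outs:
                new[out] += damping * rank[page] / len(outs)
        rank = new
    return rank

# Page "c", with two inbound links, earns the highest score.
print(pagerank(links))
```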

It is common for websites to try to cheat this system through link purchases.

Buying links should be avoided at all costs, because most search engines consider it a violation of their guidelines, and it can get a site banned from an index.

Cloaking

Cloaking refers to offering different content to search engines than to humans.

With cloaking, a web server will deliver different content based on whether the request is coming from a search engine or a Web browser. Search engines do not look favorably on cloaking.
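
For clarity, the sketch below shows roughly what the mechanism looks like, written as a hypothetical Flask handler. It is included only to illustrate the practice search engines penalize, not as something to deploy:

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def home():
    # Cloaking: sniff the user agent and serve crawlers a different page
    # than human visitors see. This is the behavior that gets sites banned.
    agent = request.headers.get("User-Agent", "")
    if "Googlebot" in agent:
        return "<h1>Keyword-rich copy shown only to the crawler</h1>"
    return "<h1>The page real visitors actually see</h1>"
```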

Keyword Stuffing

Perhaps the most common form of black hat SEO, keyword stuffing crams a page’s content full of certain keywords to make the page appear more relevant for those keywords. Keyword stuffing is a very old-fashioned tactic that stopped working well a long time ago.

Keywords are important to a quality SEO campaign, but stuffing a page to the point where it no longer makes sense to the reader will always land a website owner in hot water.
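
A simple way to sanity-check your own copy is to measure keyword density, the fraction of a page's words that a single keyword accounts for. The Python sketch below is illustrative; there is no official density limit, only the common-sense test of whether the text still reads naturally:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the page's words taken up by one keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words) if words else 0.0

stuffed = "cheap flights cheap flights book cheap flights for cheap flights"
print(f"{keyword_density(stuffed, 'cheap'):.0%}")  # 40% -- nowhere near natural
```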

Hidden Text

The use of hidden text is another attempt to offer one type of content to the search engines and another to humans.

Hidden text and keyword stuffing often appear together. Since keyword-stuffed text reads so poorly, the idea is to hide it from visitors while still having it read by the search engines.
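
Because hidden text usually relies on a handful of CSS tricks, a crude scan can catch accidental cases in your own templates. The pattern list in the Python sketch below covers only the obvious offenders and is illustrative, not exhaustive:

```python
import re

# Obvious CSS tricks used to hide text from visitors (illustrative list).
HIDDEN_PATTERNS = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"font-size\s*:\s*0",
    r"text-indent\s*:\s*-\d{3,}px",  # text pushed far off-screen
]

def find_hidden_text_styles(html: str) -> list[str]:
    """Return the hiding patterns that appear anywhere in the page markup."""
    return [p for p in HIDDEN_PATTERNS if re.search(p, html, re.IGNORECASE)]

page = '<div style="display:none">cheap flights cheap flights</div>'
print(find_hidden_text_styles(page))  # flags the display:none block
```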

Quality content is essential to a website’s success.

There are plenty of legitimate ways to balance content with aesthetics and conversions, and hidden content isn’t worth the risk.

The Internet is an important marketing tool for most companies, which is why it is as important as ever to ensure that marketing tactics are focused on the long-term health and success of a website.

The risks associated with black hat SEO tactics are never worth the reward. 
