Mark Jackson, President and CEO of VIZION Interactive, took the time to do an interview with me about Duplicate Site Content and Multiple Site Issues, which is also the panel he will be speaking on at SES 2008. You can also check out other interviews at Marketing.fm in the SES 2008: San Jose section.
Many companies/marketers are unaware of duplicate content/multiple site penalties. What do you tell someone who has just spent thousands of dollars developing a site that’s actually going to harm their SEO?
There are many ways to deal with the issue. Generally speaking, if there are multiple copies of a site or multiple copies of its content, you’ll want to suggest that while the content can remain open to visitors, the search engines need to “see” only one copy of it. So, you will want to use a robots.txt file or other means of disallowing the search engines from indexing more than one copy of the content.
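As a rough sketch of the robots.txt approach Mark describes, suppose the second copy of the content lives under a `/duplicate/` directory (a hypothetical path used purely for illustration). A rule blocking all crawlers from that copy, while leaving it available to human visitors, might look like:

```
# robots.txt, placed at the site root (e.g. http://www.example.com/robots.txt)
# The /duplicate/ path is a hypothetical location for the second copy of the content.
User-agent: *
Disallow: /duplicate/
```

Note that robots.txt only asks well-behaved crawlers not to fetch those URLs; the pages themselves remain fully accessible to anyone with the link.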
That said, in very general terms, if I see someone about to launch a website that I know will hurt their SEO, I tell them NOT to launch until they’ve addressed anything that could be less than “optimal” for SEO (redirects that are not set up properly, unwieldy URLs, pages that are “invisible” to the search engines, or other issues).
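On the point about redirects being set up properly: the usual fix for a duplicate domain is a permanent (301) redirect to the primary site. A minimal sketch for an Apache server with mod_rewrite, using hypothetical domain names, might look like:

```
# .htaccess (Apache with mod_rewrite enabled)
# Permanently redirect the duplicate domain to the primary one,
# preserving the requested path. Domain names here are examples only.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example-duplicate\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

A 301 tells search engines the move is permanent, so the primary domain, rather than the duplicate, is the one that gets indexed and credited.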
If you already have multiple sites with similar content, what steps (other than taking one down or developing completely new content for one of them) can you take to minimize the negative effects on your SEO?
Again, the “easiest thing” can be to make sure that the search engines can only index one copy of the content. There can be very legitimate reasons why you would want to keep multiple or similar copies available to human visitors, including having different versions for PPC landing pages. The best thing you can do, though, to minimize the negative effects on search engine optimization is to be aware of all of the duplicate content and to make sure that the search engine crawlers can only get to one copy. If it’s absolutely necessary to market each website separately, and there are many (thousands of?) pages of content, I would recommend that they – at a minimum – edit the first paragraph of content to be unique and ensure that Titles and Descriptions are unique.
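To make the last recommendation concrete: unique Titles and Descriptions are set in each page’s `<head>`. The wording below is purely illustrative:

```html
<head>
  <!-- Each site's version of the page gets its own title and meta description -->
  <title>Acme Widgets | Industrial Widget Supplier in Dallas</title>
  <meta name="description"
        content="Dallas-based supplier of industrial widgets for manufacturers.">
</head>
```

Even when the body copy is largely shared, distinct titles and descriptions (plus a rewritten first paragraph) give each page its own footprint in the index.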
As blogs become increasingly popular, many companies are developing unique domains for their company blog. If this content is syndicated through their homepage or elsewhere, will that have negative effects on their homepage’s SEO?
Generally speaking, most companies will tend to have a “shorter version” of the content on their company home pages. It is typical for a home page to include the title (which is a link) and a brief description of the article or blog post. Having an RSS feed or other type of feed on a web page generally is not considered duplicate unless that content is the only content on the page. Most home pages have a lot more content than one syndicated feed. In fact, we have seen that these small snippets of content on the home page, being regularly updated, can have a positive effect on SEO, as search engines do like to see fresh content rather than a home page that never changes.
Do micro-sites with completely original content, but which are affiliated/associated with the parent company/homepage, have a negative effect on SEO?
Whether or not a micro-site has a negative effect on search engine optimization really depends on its content. If the micro-site has original content, then it is a good thing. However, lately we have been recommending that a blog or micro-site be put directly on the main domain rather than developed as a completely separate site with a completely new domain name.
Does duplicating homepage content on social networks/profiles hurt your SEO? (E.G. if a company creates a myspace page and copies much of the content from their homepage onto their myspace page, will that be considered duplicate content)
Generally speaking, if you were to take content and text from your current website and put it on a social network or profile, it typically is not considered duplicate. There is usually enough other text and content on the social networking profile that duplicate content is not an issue.
How does a search engine identify that two sites are related/tied to one another? What if someone copies my webpage content to their page, but I have no association with them, do I get penalized?
In most cases, the page that gets crawled first will be considered the originator of the content. Any other pages that the search engines find (like someone who copied your web page and put it on their site) will be considered duplicates of the original. The same concept applies when you take copy from one of your web pages and put it on another: the first page crawled is the original; all others are duplicates. This illustrates the importance of trying to ensure that your original content, as it exists on your website, is crawled/indexed first.
For more information about Mark, check out his company website http://www.vizioninteractive.com/ or see him at SES 2008: San Jose.
Tags: duplicate content, SEM, SEO, SES 2008, SES 2008: San Jose, SES Interviews, SMM
More and more site owners are concerned that they might get penalized accidentally or overtly because of duplicate content. If you run mirror sites, will search engines ban you? If you have listings that are similar in nature, is that an issue? What happens if you syndicate content through RSS and feeds? Will other sites be considered the “real” site and rob you of a rightful place in the search results? This session looks at the issues and explores solutions.