
April 24, 2014

Given the complexity and breadth of the subject, I'll try to offer a simple guide to help you avoid potential search engine penalties.

So, let’s look at the most common types of thin content according to Google:

Doorway pages, also known as bridge pages, are poor-quality pages that try to rank for a certain keyword or phrase. They then funnel the user to other pages, known as hubs, or even to other websites, using a simple call to action or a redirect. Webmasters typically deploy such pages on different domains that are cross-linked in some way. To avoid duplicate content, the target keywords are padded with meaningless filler words, geo-locators or synonyms intended to mislead search engines. Be careful not to host such spamdexing pages, which strive to rank high, grab the searcher and redirect them shortly afterwards.
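
If you inherit a site and want a quick, rough check for pages that bounce visitors straight elsewhere, the sketch below flags the crudest client-side redirect patterns. It uses Python with the third-party requests library and a hypothetical URL; it is only a heuristic, not how search engines detect doorway pages.

```python
import re
import requests  # third-party: pip install requests

# Crude patterns for client-side redirects often found on doorway pages.
META_REFRESH = re.compile(r'<meta[^>]+http-equiv=["\']?refresh', re.IGNORECASE)
JS_REDIRECT = re.compile(r'window\.location|location\.href\s*=', re.IGNORECASE)

def looks_like_doorway(url: str) -> bool:
    """Return True if the page appears to push visitors elsewhere right away."""
    response = requests.get(url, timeout=10)
    # A chain of server-side redirects is worth reviewing, though not proof by itself.
    if response.history:
        return True
    html = response.text
    return bool(META_REFRESH.search(html) or JS_REDIRECT.search(html))

if __name__ == "__main__":
    # Hypothetical URL, for illustration only.
    print(looks_like_doorway("https://example.com/landing-page"))
```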

Thin affiliates are pages that do little more than funnel visitors to a merchant's website. The visitor is supposed to click through and buy a product, and you benefit as the intermediary. If you want to link to a site where people can buy something, enrich the surrounding content, but be careful not to copy the merchant's entire proposition, or you will end up with duplicate, dull offers.

Thin syndication means taking content directly from article banks or their RSS feeds and pasting it onto your website. This is considered duplicate content with no added value for users. Do not copy-paste huge chunks from Wikipedia either; that can also be treated as thin syndication. Be unique and create your own content.

Scraping is a technique used to lift content from more reputable sites in order to inflate the number of pages on your own. It is essential that your content is original and provides value to users. Copyscape and Siteliner will help you check your site for duplicate content around the web. Content with less than 70% uniqueness is likely to be flagged as duplicate by search engines.
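
For a quick local comparison of two pieces of text, the sketch below uses Python's standard library. This is not how Copyscape or Siteliner compute their scores; it is just an illustrative way to get a rough uniqueness figure before publishing.

```python
from difflib import SequenceMatcher

def uniqueness(original: str, candidate: str) -> float:
    """Rough uniqueness score: 100% minus the similarity ratio of the two texts."""
    similarity = SequenceMatcher(None, original.lower(), candidate.lower()).ratio()
    return (1.0 - similarity) * 100

# Example strings, for illustration only.
source_text = "Baby clothes made from organic cotton, shipped worldwide."
your_text = "Organic cotton baby clothes with worldwide shipping."

score = uniqueness(source_text, your_text)
# By the rule of thumb above, scores well under 70% deserve a rewrite.
print(f"Uniqueness: {score:.0f}%")
```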

Cloaking

Cloaking is classed as a deceptive technique by every search engine. In essence, it means serving one page to human visitors while showing a different page to crawling bots. Say a user searches for “baby clothes”. The search engine returns a list of results, the user clicks one of the top ones and ends up on a porn website. How is this possible? The answer is cloaking.
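
One rough way to check your own pages from the outside is to request the same URL with a browser User-Agent and with a crawler User-Agent and compare the responses. The sketch below does that in Python, using the third-party requests library and a hypothetical URL. Keep in mind that sophisticated cloaking keys on verified crawler IP addresses rather than the User-Agent string, and some sites legitimately vary content per visitor, so treat the result only as a hint.

```python
import requests  # third-party: pip install requests
from difflib import SequenceMatcher

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
CRAWLER_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def cloaking_similarity(url: str) -> float:
    """Return how similar the page looks to a browser vs. a crawler (1.0 = identical)."""
    as_browser = requests.get(url, headers={"User-Agent": BROWSER_UA}, timeout=10).text
    as_crawler = requests.get(url, headers={"User-Agent": CRAWLER_UA}, timeout=10).text
    return SequenceMatcher(None, as_browser, as_crawler).ratio()

if __name__ == "__main__":
    # Hypothetical URL, for illustration only.
    score = cloaking_similarity("https://example.com/")
    print(f"Similarity between browser and crawler views: {score:.2f}")
```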
