White hat link development today and tomorrow.

The rules for link development have evolved considerably over the last ten years. However, there have been almost no changes to what I call white hat link development. Both Google and Bing today offer link disavow tools, where a webmaster can submit a list of the links pointing to their site to tell the search engines these are links the site owner no longer wishes to be associated with, in hopes of removing a penalty the site may carry because of unnatural links. These tools appeared around the time of the Penguin algorithm rollouts.
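For reference, the disavow file submitted through these tools is a plain text file, one entry per line; a whole domain or a single URL can be disavowed, and lines starting with # are comments. The domains below are placeholders, not real examples:

```text
# Paid directory links we could not get removed
domain:spammy-directory.example
# A single unwanted page rather than a whole domain
http://forum.example/profile/12345
```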

How we got to today

To understand the differences, let's go way back in time to the infancy of Google. Links pointed from one site to another as recommendations. These recommendations mirrored the kind one would get by walking up to the counter of, say, a store that sold greeting cards and asking, "Where can I go to print my own card which I made on my computer?" The clerk would make a goodwill recommendation to another establishment that offered that service.

By counting the number of links pointing to a site or web page across the internet, a fairly good index could be built, which was very useful in a database application as a method to search the internet. Infoseek based its core database primarily on counting links; its database application then produced answers to queries by going through the core database and pulling up the pages whose content matched the search, supplementing that content with the words in the links pointing toward the site.
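The approach described above can be sketched as a toy example. All page names, anchor texts, and the `search` helper below are invented for illustration; this is not Infoseek's actual code, just the shape of the idea: count inbound links and let anchor words count toward a page's searchable text.

```python
from collections import defaultdict

# Toy web: each link is (source_page, target_page, anchor_text).
# These pages and anchors are hypothetical.
links = [
    ("blog.example", "cards.example", "print your own greeting cards"),
    ("forum.example", "cards.example", "card printing service"),
    ("news.example", "shoes.example", "discount shoes"),
]

# Each page's own content, keyed by page.
content = {
    "cards.example": "we print custom greeting cards",
    "shoes.example": "buy shoes online",
}

# Build the index: count inbound links and collect anchor words.
inbound_counts = defaultdict(int)
anchor_words = defaultdict(set)
for source, target, anchor in links:
    inbound_counts[target] += 1
    anchor_words[target].update(anchor.lower().split())

def search(query):
    """Return pages whose content or inbound anchor text matches the
    query, ordered by how many links point at them."""
    terms = query.lower().split()
    hits = []
    for page, text in content.items():
        searchable = set(text.split()) | anchor_words[page]
        if any(t in searchable for t in terms):
            hits.append((inbound_counts[page], page))
    return [page for _, page in sorted(hits, reverse=True)]

print(search("greeting cards"))  # cards.example matches via content and anchors
```

Note how the anchor words work: even if a page never used a phrase itself, the words in links pointing at it could still make it findable for that phrase.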

Google improved on the method of using external factors to rank sites by saying that a link from a popular site was worth more than a link from a site which was not as important. Popularity was determined by the number of links pointing toward a site.
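That refinement can be illustrated with a minimal PageRank-style iteration. This is a sketch only; the graph, damping value, and iteration count are assumptions for the example, not Google's actual algorithm:

```python
# Minimal PageRank-style sketch: each page shares its score out along
# its outbound links, so a link from a popular page carries more
# weight than a link from an obscure one.
def rank(graph, iterations=50, damping=0.85):
    pages = list(graph)
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in graph.items():
            if not outlinks:
                continue  # dangling page: its share is simply dropped here
            share = damping * score[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        score = new
    return score

# Hypothetical mini-web: "hub" is linked by several pages, so hub's
# single link lifts "niche" above the pages with no inbound links.
graph = {
    "hub": ["niche"],
    "a": ["hub"],
    "b": ["hub"],
    "niche": [],
}
scores = rank(graph)
print(sorted(scores, key=scores.get, reverse=True))
```

The point of the toy graph is that "niche" has only one inbound link, yet outranks "a" and "b", because that one link comes from a popular page.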

Let me digress: a search engine that always shows the same set of sites has a problem, because once people have seen all those sites they will stop using that search engine. Hence the need for other algorithms to surface new content which has recently become popular, trendy, viral, or fresh. The newest of these algorithms is named Hummingbird, which affects nine out of every ten searches. While external factors play a part in these algorithms, they are outside the scope of this post.

With the rule being whoever has the most links pointing toward their site wins, many marketers began aggressively accumulating links from anybody and everybody, including their own sites, since the patents suggested these links counted, as did the links on all the pages of the site they wanted to market. Some people purchased tens of thousands of domains which could be used to link to their sites. Search engines soon began referring to sites using unnatural methods as search engine spam, set up guidelines for what was acceptable, and a distinction was made between white hat and black hat SEO practices. Those with tens of thousands of single-page sites were labeled thin content and ignored. Link farm schemes, where ten thousand sites all added the same set of links in order to get a link back from the collective of ten thousand sites, were also ignored as soon as they grew big enough to appear above the radar.

There are tens of thousands of forums online; almost all allow a link from the profile page, and many allow signature links. They too have been used aggressively by marketers. Drive-by blog comments have been so bad that most blogs now filter comments and review them before they are published.

Mega menus and site maps on every page are ignored today; navigation content is examined, and site-wide links to other sites are discounted. Note that navigation content should be white hat and point to the important landing pages on a site where people can begin browsing.


Today we have Penguin 2.1, tomorrow Orca 1.0

White hat link development that mirrors the situation of a question being asked of the clerk at an establishment has not been harmed, and it is part of what needs to be done to recover a site from a history of black hat link development.

Penguin has been refined to determine which of the tens of thousands of forums are quality forums. Many sites have expertise and share it in forums; search engines want to list experts, but not aggressive drive-by marketers who just create a profile, say hello, and leave.

White hat link development has been and remains a good practice. The question white hat link developers ask is: if I had a stack of business cards, where would be the best place to put them in hopes of bringing in people who would be interested in the product or service? The people most likely to turn into customers are those who follow white hat links; they are in fact better than search engines for promotion, as one qualified lead is worth more than a hundred unqualified visitors.

Forget search engines while doing link development, and the links you build are natural links which have a value in and of themselves. Yes, it is harder: instead of filling out a form on a forum, a discussion with a prospective partner needs to take place. Some of these partners may not have a popular site in the eyes of search engines today, which is natural. In the long run, when the next algorithm comes out (maybe called Orca, the killer whale), sites with natural goodwill links will grow.

