rants from the dark side of marketing

Links with less value – content text links

There’s an interesting post by Quadszilla on his blog about Google valuing certain kinds of links less. Google is smarter than most people think.

What does this mean in practice?

1. Buying links from text link brokers gives you even less bang for your buck. Paying $50, $100 or $500 for a link from a site with good link popularity and having it end up in a “sponsored links” box might not be what you are looking for.

2. Reciprocal links are probably not worth the time they take. Most reciprocals end up on some “directory” or “links” page. You don’t want to spend that much time for so little.

3. Link networks are much less effective. Digital Point’s COOP and Link Vault usually get you links in footers and other undesirable places (not to mention COOP’s links rotate randomly). Don’t get me wrong, link networks still work fine; they’re one of the main things I use to get indexed fast, and I have even ranked for some good keywords.

It’s not the end of the world; text links still have value and help with SEO. But maybe it’s time to rethink our linking strategies. It’s also a good time to develop a new kind of automated link network: one that places links inside content instead of just throwing 5 links in your footer.

I see two angles to approach this. The first is to automatically generate text and place links inside it. Of course, putting a paragraph in the footer of every page isn’t very appealing to webmasters (at least whitehats). Maybe you could have it generate new pages in a subfolder and put the links there. That would be almost like having a directory of links, plus you get updated content on your site, which is nice.
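A rough sketch of that first idea, as I picture it (Python; the file names, filler sentences and get_network_links() feed are all made up here, standing in for whatever the network would actually serve):

    # Rough sketch only: generate a "links" page in a subfolder and mix the
    # network's links into short paragraphs of text. File names, the filler
    # sentences and get_network_links() are invented for illustration.
    import os
    import random

    def get_network_links():
        # Placeholder for whatever feed of (anchor text, URL) pairs the
        # network would actually serve to member sites.
        return [("blue widgets", "http://example.com/widgets"),
                ("cheap hosting", "http://example.org/hosting")]

    FILLER = [
        "Here are a few sites we have been reading lately.",
        "Some more picks from around the web.",
        "These resources were added this week.",
    ]

    def build_links_page(out_dir="links"):
        os.makedirs(out_dir, exist_ok=True)
        paragraphs = []
        for anchor, url in get_network_links():
            paragraphs.append('<p>%s <a href="%s">%s</a>.</p>'
                              % (random.choice(FILLER), url, anchor))
        page = "<html><body>\n%s\n</body></html>" % "\n".join(paragraphs)
        with open(os.path.join(out_dir, "index.html"), "w") as fh:
            fh.write(page)

    if __name__ == "__main__":
        build_links_page()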

The second is dynamic text linking. You have probably seen IntelliText ads: you put some code in your pages and it turns keywords into advertisements. The link network could do the same thing with text links.
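Roughly, the replacement part could look like this (the keyword map and URLs are invented for the example, and a real version would also need to avoid rewriting text that sits inside tags and attributes):

    # Rough sketch: turn the first occurrence of each rented keyword in the
    # rendered page into a plain text link. Keyword map is made up.
    import re

    KEYWORD_LINKS = {
        "link building": "http://example.com/link-building",
        "web hosting": "http://example.org/hosting",
    }

    def linkify(html_text, keyword_links=KEYWORD_LINKS, per_keyword=1):
        # Replace only the first per_keyword occurrences, keeping the
        # original casing of the matched text.
        for keyword, url in keyword_links.items():
            pattern = re.compile(re.escape(keyword), re.IGNORECASE)
            html_text = pattern.sub(
                lambda m, url=url: '<a href="%s">%s</a>' % (url, m.group(0)),
                html_text,
                count=per_keyword,
            )
        return html_text

    print(linkify("<p>We wrote about Link Building and web hosting.</p>"))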

I don’t know how many webmasters would like something like that. Both of these are quite a bit more obtrusive than putting 5 links in your footer. What do you think about these link network ideas as a webmaster?

Posted on Tuesday, August 30th, 2005 at 3:29 am under Rants.

4 Comments

DanThiesGoogleSuckup Says:

Dan Theis is a Google suckup incapable of independent thought. Like Goofball he censors free speech. The taxpayers of the USA paid for the infrastructure that the censors known as goofballers have made their billions on, scraping websites and putting ads on them. Theis is a greedy turdlet.

Quadzilla is an idiot. As a Google suckup he is fascinated by the minor Google clerk cutts. Duh, Google has two databases: one with pages and scores and one with links. If Mrs. BeatenUpSingleMomRunningAGiftBasketAffiliateSite puts “gift basket” in white, the same as the background, Google pal Big Giftbasket runs to Google and there goes the single mom’s site. The clerk cutts removes her from the score database, which happens immediately, and from the backlink database, which takes months to update.

Anonymous Says:

Rent a word.

The problem with dynamic linking is that they’re using dynamic URLs, which bring in traffic but not recognized backlinks. If webmasters would “rent” words or phrases on their site, a script could hyperlink one or more instances of that word/phrase per page, using the direct URL instead of a dynamic/tracking link. There are problems of course with stats and tracking, so it’s not perfect, but it’s a thought. PostNuke used to have something like this built into their system. I haven’t used it in quite a while though, so I don’t know if it’s still there. Might be an easy plugin for WP coders to make though.

blackhat-seo Says:

The problem with dynamic linking is that they’re using dynamic URLs, which bring in traffic but not recognized backlinks.

I was talking about a system that would pick keywords on each page and make them links (using a direct URL), then store the info in a database so it doesn’t change the link every time the page is loaded.

I don’t think tracking would be a big problem. Just keep a list of how many links are on a site, on which pages they are, and to which sites they point. Link Vault does that already.
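Something along these lines, roughly (sqlite and the table/column names here are just placeholders for whatever the network’s backend would actually use):

    # Rough sketch of the "store it in a database" part: once a keyword on a
    # page has been turned into a link, remember the choice so the same
    # keyword points to the same URL on every page load.
    import sqlite3

    def get_stable_link(db_path, page, keyword, pick_url):
        conn = sqlite3.connect(db_path)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS placed_links "
            "(page TEXT, keyword TEXT, url TEXT, PRIMARY KEY (page, keyword))"
        )
        row = conn.execute(
            "SELECT url FROM placed_links WHERE page = ? AND keyword = ?",
            (page, keyword),
        ).fetchone()
        if row is None:
            url = pick_url(keyword)  # e.g. the next target the network serves
            conn.execute(
                "INSERT INTO placed_links (page, keyword, url) VALUES (?, ?, ?)",
                (page, keyword, url),
            )
            conn.commit()
            row = (url,)
        conn.close()
        return row[0]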

nixies Says:

Google has been rumoured to devalue every type of link in its time: forum links, guestbooks, reciprocal links, blog posts, paid-for links; even dmoz is corrupt. I think they have just established that no one can be trusted.
