So there have been a bunch of theories floating around as to how the G Men are treating sites, namely how they have been slapping down sites that just seem to be getting off the ground. Sure, that could be the sandbox, but it really doesn’t feel like it to me. I think there may be some very important on-page factors that signal to Google that your site is (or may be) a spam site. Anyway, that’s why I’m starting a little experiment to test out four theories that I have.
CS – First, my control site… I started this one about a month ago and it went as I described above… it climbed steadily (as high as 30th position for its KW), but now it is nowhere to be found at all in the SERPs. I built it just like I’ve been building sites for years with pretty decent success, but since Panda… well, really since the HubPages slap just after Christmas… the game seems to have changed.
S1 – On this site I will aim for very low keyword density. I plan on using the keyword only once in the post body, so with the title etc. it might show up on the page 2 or 3 times, giving the site a max of around 3% KW density (compared to my other sites, where I’ll push the KW density higher than I normally would, to 7-10%).
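For anyone who wants to gut-check their own pages, here’s a rough Python sketch of the math I mean by “density.” It’s just an illustration (it counts the keyword phrase in a simplistic way, not the way Google or any particular SEO tool does), and the sample page and keyword are made up:

```python
import re

def keyword_density(page_text, keyword):
    """Very rough density: keyword occurrences per 100 words of page text."""
    words = re.findall(r"[a-z0-9']+", page_text.lower())
    if not words:
        return 0.0
    occurrences = page_text.lower().count(keyword.lower())
    return occurrences * 100.0 / len(words)

# Made-up example: a keyword used 3 times in a 100-word page = 3% density.
page = "google serp slap " * 3 + "filler " * 91
print(round(keyword_density(page, "google serp slap"), 1))  # prints 3.0
```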
S2 – On this site I will slow the backlinking down to a snail’s pace. I’ll leak links to it at a rate of 1 per day (as opposed to the other sites, which will get 3-5 a day each). This site will no doubt climb more slowly than the other sites since it will be getting fewer links, but we’ll see where it ends up.
S3 – On this site I will make sure that all the posts have different titles and URLs (the homepage will be the only page targeting the main keyword). Typically on my sites I use cousins or variations of my main keyword for the post titles. I may even give the posts search-unfriendly titles like “experimenting with making sites rank real well by mixing things up” instead of a keyword title like “Google serp slap experiment.”
S4 – I will be cloaking the affiliate links on this site, just to see whether Google dislikes the fact that I have links to Amazon on pretty much every page I build.
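For what it’s worth, “cloaking” here just means the posts link to a clean local URL that redirects to the raw Amazon affiliate link, so the affiliate URL never sits in the page HTML. On my sites this will be handled by the blog itself (a plugin or .htaccess redirect), but here’s a bare-bones Python sketch of the idea; the /go/ path and the affiliate URL are made up for illustration:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical map of "clean" local paths to raw affiliate URLs.
AFFILIATE_LINKS = {
    "/go/widget": "https://www.amazon.com/dp/EXAMPLE?tag=my-tag-20",
}

class CloakHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = AFFILIATE_LINKS.get(self.path)
        if target:
            # The post links to /go/widget; the visitor gets bounced to Amazon.
            self.send_response(301)
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("", 8000), CloakHandler).serve_forever()
```

Whether Google actually treats pages differently when the raw Amazon links are hidden this way is exactly what this part of the test is supposed to show.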
Things I will control for:
- All the sites will be on the same host
- All the sites will use the same theme
- All the sites will be the same age
- All the sites will have similar content (length, quality)
- All the sites will target the same keyword
- All the sites will receive BMR backlinks
If anyone has any other suggestions for my little experiment, I’d love to hear them. It would be great if other people would play along and run their own similar tests, maybe varying different factors than the ones I’m messing with. I’ll keep everyone updated on the results of my little experiment over the next few months, and we’ll see if we can’t crack this tricky little nut called the Google algorithm.
–Best of Luck
Justin DeMerchant