The need for powerful web content post Penguin and Panda

Since the internet began, and since search engines became the key to your website being found, webmasters and website owners alike have tried to make sure their site appears on page 1, if not in the top three. By understanding how the search engines are programmed to rank websites, SEOs and internet marketing strategists use techniques and tools to ensure your website has the attributes to be seen in as high a profile as possible.


Plugging up the holes in the ice


However, the search engines (namely Google) continue to close down the loopholes and stop the bad SEOs using dirty tricks to get rubbish websites to the top of the search pages. There have been two changes to Google's programming in the last year or so: Google Panda and Google Penguin. Not sure why there should be arctic symbolism here, but it certainly sent icy chills down some SEOs' backs, and rightly so. For the white hat SEO, and especially the internet copywriter proud of his work, this is just the right direction.


Why does Google care?


This made me wonder at first. After all, it doesn't make any difference to Google what is thrown up in the search results, does it? Well, yes it does. Google has competition just like any other company (Bing and Yahoo, for example). It wants to be everyone's first choice of search engine, or it stops making money (from ads, etc.). Therefore, when someone types a phrase into the search box, they should get back the most informative, popular sites for that search phrase. If they just get back rubbish, they will simply go to another search engine.


So what difference do Google Panda and Google Penguin make?


Google was getting tired of webmasters putting up rubbish content just to show the site was being updated regularly, and of them adding poor links. Panda in particular saw new programming in Google to ensure that websites had to be putting up quality content of a certain length and substance. OK, so it could be programmed to detect content of a certain length (at least 400 words for blogs and articles), but how could it tell whether that content was informative and substantial? Simply by whether the visitor actually read the page all the way through. And how could it tell that? Easy. If the visitor uses the scroll bar all the way to the bottom of the page (an event, in Google Analytics terms), there is a high chance it is all being digested.
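To make that scroll-to-the-bottom signal concrete, here is a minimal sketch of how a site owner might record such an event in Google Analytics using the gtag.js event call. The event name scroll_to_bottom, the 50px threshold and the page_path parameter are illustrative assumptions of mine rather than anything the article or Google Panda prescribes, and the snippet assumes the standard gtag.js tag is already loaded on the page.

```typescript
// Sketch: fire a Google Analytics event once the visitor reaches the bottom
// of the page. Assumes the standard gtag.js snippet is already loaded, so a
// global `gtag` function exists. The event name "scroll_to_bottom" is
// illustrative, not something Panda requires.
declare function gtag(
  command: "event",
  eventName: string,
  params?: Record<string, unknown>
): void;

let bottomReported = false;

window.addEventListener("scroll", () => {
  if (bottomReported) return;

  // How far the visitor has scrolled versus how far they can scroll.
  const scrolled = window.scrollY + window.innerHeight;
  const pageHeight = document.documentElement.scrollHeight;

  // Treat "within 50px of the bottom" as having read the whole page.
  if (scrolled >= pageHeight - 50) {
    bottomReported = true;
    gtag("event", "scroll_to_bottom", {
      page_path: window.location.pathname,
    });
  }
});
```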


So what does this mean for web content in the future?


People can't get by any more thinking their web text just fills up the web design gaps between the pretty pictures. All websites have always needed excellent marketing text, optimised so the search engines pick it up, but now that is intensified even more: if you want to see your website on page 1 of Google, you need to start with quality text.
