
Why Your Site Needs Fresh, Relevant Content


It is said that content is king, but today 'fresh, relevant content' is the master - or is it?

Every owner of a commercial web site knows that frequent fresh content is needed on their pages in order to achieve and maintain a high listing on search engines, which actively seek fresh content. Google sends out its 'Freshbot' spider to gather and index new material from all the sites which offer it. MSN Search seeks it too; I've noticed that MSN Search's spider pays a daily visit to a site of mine which has proper fresh content every day.

By incorporating fresh content, commercial web sites will remain competitive, for without it they will certainly fall down the search engine listings and lose business. Besides, having something new keeps visitors coming back and attracts potential customers.

But creating and then manually uploading fresh content onto our web sites each day is hard, time-consuming work, isn't it? What we want is a way of putting daily fresh content onto our web sites easily and efficiently. Let's look at the techniques currently available to achieve this goal and see which one offers a global solution to the fresh content problem:

1) Server Side Includes (SSIs): These are directives written by the webmaster into a page's HTML and uploaded onto the server. An SSI directive tells the server to insert a specific block of content whenever that page is served to a browser or a search engine spider.

Because these includes are processed 'before' the page is served, the inserted text remains 'visible' to search engine spiders and will therefore be seen as fresh content (a minimal example appears at the end of this item). Unfortunately, not all web hosts support SSIs; this is because the server must 'read every page' on the web site as it looks for include statements, a process which clearly reduces server performance.

How many web site owners have the time to manually upload fresh HTML content onto their servers every day? Probably very few, which is why SSIs are not a global solution to the fresh content problem.
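
For reference, the directive itself is simple. Below is a minimal sketch, not a recommendation: the file names are placeholders, and the host must have includes enabled for it to work at all.

    <!-- index.shtml: before sending the page, the server replaces the
         directive below with the contents of the named file, so the
         fresh text appears in the HTML that spiders actually read. -->
    <!--#include virtual="/includes/todays-content.html" -->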

2) Blogging: Google's Freshbot spider is so voracious for fresh content that it eagerly devours the contents of common weblogs. But can a daily blog be used to influence the listing of a web page under specific keywords or phrases?

It can, but for the vast majority of web site owners, blogging is out of the question. Putting up a daily keyword-rich business blog onto a web site is hard, time-consuming work, and it requires the blogger to be a competent writer, too. Few business owners have the time available or the competence to write something new about their products or services every day.

Blogging is therefore not a global solution to the fresh content problem.

3) RSS Newsfeeds: Having newsfeeds placed on a web site is certainly an easy way of getting fresh material to appear each day. 'Really Simple Syndication', or RSS, is a fast-growing method of content distribution. Newsfeed creation is an uncomplicated procedure and therefore appears to be an easy solution to the fresh content problem.

Many owners of commercial web sites believe that by incorporating newsfeeds on their sites they will improve their search engine rankings through the links appearing within those feeds, which are given relevance by Google. This belief is wrong, because a newsfeed is usually placed on a page by a snippet of JavaScript or VBScript.

Those scripts must be executed for the fresh content to appear, and search engine spiders take a simplistic approach when reading web pages: they do not execute scripts at all. The script runs in the visitor's browser 'after' the page has been served, not before, so the headlines never make it into the HTML that a spider reads (a typical embed of this kind is shown at the end of this item).

There are also a couple of growing menaces associated with RSS newsfeeds:

o Since the popularity of RSS is growing exponentially, the idea of monetizing syndication with ads is gaining ground. Indeed, Yahoo has announced that it will begin displaying ads from Overture's service within RSS feeds. Now, who wants other people's ads on their web site? I don't.

o There are rumors of newsfeeds being used to deliver spam. If this gets out of control then newsfeeds will quickly become history. Who wants spam messages appearing on their web site? I don't.

RSS is therefore not a global solution to the fresh content problem.
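
To make the mechanics concrete, a hosted newsfeed widget is typically dropped onto a page with nothing more than a script tag like the hypothetical one below. The page's own HTML contains only that tag; the headlines are written in by the visitor's browser when the script runs, which is why a spider reading the raw HTML sees none of them.

    <!-- All a spider sees is this tag; the feed provider's script writes
         the headlines into the page only when a browser executes it. -->
    <script type="text/javascript"
            src="http://feeds.example.com/widget.js?feed=headlines"></script>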

4) Newsfeed Scripting Solutions: A software solution can be rigged up to 'extract' the HTML from newsfeeds. The extracted HTML is then placed onto the web pages so that the fresh content will be seen by search engine spiders. This, however, involves the use of PHP and MySQL, which tends to put many business owners off. And if there is spam or ads in the feed, they will get extracted, too! (A rough sketch of the idea follows at the end of this item.)

Newsfeed scripting solutions are therefore not a global solution to the fresh content problem.
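
For the curious, the core of such a script is not long. The sketch below is only an illustration of the idea, not a recommendation: it assumes PHP with the SimpleXML extension, uses a placeholder feed URL, and leaves out the caching (for example in MySQL) that a real site would need. The point is simply that the feed items end up as plain HTML in the served page, where a spider can see them, along with any ads or spam the feed happens to contain.

    <?php
    // Sketch only: fetch an RSS feed server-side and print its items as
    // plain HTML, so the 'fresh' text is part of the page a spider reads.
    // The feed URL is a placeholder; a real site would also cache the
    // result (e.g. in MySQL) rather than fetch the feed on every request.
    $feed = @simplexml_load_file('http://www.example.com/news/rss.xml');

    if ($feed !== false) {
        echo "<ul>\n";
        foreach ($feed->channel->item as $item) {
            $title = htmlspecialchars((string) $item->title);
            $link  = htmlspecialchars((string) $item->link);
            echo "  <li><a href=\"$link\">$title</a></li>\n";
        }
        echo "</ul>\n";
    }
    ?>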

5) Creating Original Content: As mentioned above under SSIs and blogging, creating and manually uploading your own fresh content every day is a time-consuming chore. And what if you have a number of web sites, each of which requires frequent fresh content in order to remain competitive? Yet we all know that there is nothing better than our own proper keyword-rich fresh content.

In summary, getting frequent, proper fresh content onto our web sites is not straightforward at all. HTML extracted from RSS feeds appears to offer a partial solution, but it is too complicated for most businesses and brings the menaces described above along with it.

The e-commerce industry is clearly in need of a genuine solution to the fresh content problem. The way to achieve it is to have our web pages updated automatically every day with 'our own' content, not anyone else's. Only then will we be able to say that fresh content is truly the master!

About the author: Victor George is a "fresh, relevant content" crusader whose web site can be found at:

http://www.autopageupdate.com

Easily control your web content to suit your clients and to keep the search engines well fed with your new and relevant content.

© Athifea Distribution LLC - 2013