How Do I Improve My Web Site Conversion Rate? Part 2
Does it help to track visitor behavior on websites through software?
Yes is the simple answer. No debate is required, but I'll offer a simple explanation: if you don't measure, how do you expect to know what to improve? You can guess and hope you get it right, but with effective tracking software you simply have the facts in front of you.
Effective measurement is more than simply having good software, though; it's analyzing why things happen. One metric we track is bounce rate: the percentage of visitors who arrive at a page and then leave without doing anything else. The lower the bounce rate the better, because it means people are exploring the site rather than leaving immediately.
One perfect example comes from a recent client. Her site carried a number of article pages, all with exactly the same navigation on the left and in the centre. Most articles had a bounce rate of about 53%, but one did better at about 50% and another did much worse at around 90%. We compared the two outliers and found that the 50% page was far more relevant to the reader arriving on it: it had better, more relevant links at the bottom of the article than the 90% page. We concluded that making the poor page relevant in the same way would reduce its bounce rate. Without tracking software we simply would not have known this was happening. So yes, it most definitely helps to track visitor behavior.
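To illustrate the kind of calculation tracking software performs behind the scenes, here is a minimal sketch of computing per-page bounce rates from session data. The page names and counts are invented for the example, not taken from the client above:

```python
from collections import defaultdict

# Each session is (landing_page, pages_viewed_in_session).
# These page names and numbers are hypothetical.
sessions = [
    ("article-a", 1), ("article-a", 3), ("article-a", 2), ("article-a", 1),
    ("article-b", 1), ("article-b", 1), ("article-b", 1), ("article-b", 4),
]

entries = defaultdict(int)   # sessions that entered on this page
bounces = defaultdict(int)   # sessions that left after that single page

for landing_page, pages_viewed in sessions:
    entries[landing_page] += 1
    if pages_viewed == 1:    # one page viewed, then gone = a bounce
        bounces[landing_page] += 1

for page in sorted(entries):
    rate = 100.0 * bounces[page] / entries[page]
    print(f"{page}: {rate:.0f}% bounce")
```

The point of grouping by landing page, as in the client example, is that an unusually high per-page rate flags exactly which page is failing its arriving readers.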
What measurement software tools would you recommend?
We use IRIS Metrics. Apart from IRIS, I would also recommend browser-based software such as HitBox, WebTrends Live, RedSheriff, and Omniture. Generally, you get what you pay for: while these systems are not cheap, they do provide the level of detail required to run an effective web campaign.
People have asked me whether it's possible to run an effective web measurement campaign with Webalizer (free log-analysis software). While you can get a lot of useful information from free and cheap systems, you don't get path tracking, bounce rates, repeat-visitor information, accurate visitor counts, accurate page counts, and much more information that is critical if you want to base business decisions on your measurements.
What is the difference between log-based and browser-based measurement?
Log-based (server-based) tools are programs installed on your web server (by your ISP if your site is hosted) or run locally on your PC using the log files taken from the server. They measure activity from the text files held on the web server (referred to as log files). Browser-based tools, typically delivered as a hosted ASP service, instead embed a small tag in each page, so they record only activity generated by people using a web browser.
I recommend ASP measurement for exactly that reason: it measures only how real people using a web browser use your website.
The log files record everything that visits your pages. They need a number of added filters to stop email harvesters, search engines, and a variety of other software-generated crawlers or bots from being counted as 'visitors'; without those filters, you can get seriously skewed results. Getting log-file filtering right often requires server access; otherwise, you're relying on your ISP to report your tracking correctly. The log files for one of our clients recorded ten times as many page counts and visits as the ASP system showed. That's a tenfold discrepancy!
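A sketch of the kind of filtering log-based tools need before their visitor counts mean anything. The user-agent substrings and log entries below are illustrative only; a real bot list is far longer and needs constant maintenance:

```python
# Substrings that identify common crawlers in a user-agent string.
# Deliberately short and illustrative, not a complete bot list.
BOT_SIGNATURES = ("googlebot", "slurp", "msnbot", "crawler", "spider", "bot")

def is_bot(user_agent: str) -> bool:
    """Return True if the user-agent looks like an automated crawler."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

# Hypothetical user-agent strings from an access log.
hits = [
    "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "Mozilla/5.0 (Slurp; slurp@inktomi.com)",
    "Mozilla/4.0 (compatible; MSIE 5.5; Windows 98)",
]

human_hits = [ua for ua in hits if not is_bot(ua)]
print(f"{len(human_hits)} of {len(hits)} hits were human")
```

Even this toy example shows how half the raw log lines can be non-human traffic, which is exactly how unfiltered logs end up reporting many times the real visitor count.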
What is an average conversion rate?
This is a very good question and the topic of serious debate. Other marketing industries don't guess; they have standards that everyone follows, and that's what online marketing needs before any real answer can be given. Analytics companies, the big research companies, and the digital media associations will have to come together to define these standards, and then people will have to follow what is agreed before accurate numbers can be delivered consistently.
Currently, we're in the process of trying to establish a worldwide benchmark with a number of other prominent people (The Web Analytics Association and the IAB to mention two) in the industry who also want to know the answer to this question. But meanwhile, here are some statistics we've gathered from different sources published both recently and over the last few years. I have figures for 3 types of websites: sales (e-commerce), lead generation, and subscription-based websites.
Generally, sales sites seem to range between 0.5% and 8%, with the average being 2.3% according to FireClick statistics published this year and figures published in 2003 by e-consultancy.com. In 2000, the average sales conversion figure published by shop.org was 1.8%. The high-end figures, I hasten to add, come from the top e-tailers according to all sources. My own experience shows sites hitting between 0.5% and 5.3%, which correlates with the published figures. Of course, since there is no defined standard, these numbers have to be taken as a rule of thumb.
The only source we have for lead generation sites is e-consultancy.com. They quote 2-3% of users completing an optional or free registration process, with 5% being best in class. Our own experience again falls within the same ballpark.
Subscription-to-sale conversion is typically between 1% and 7%; again, the source is e-consultancy.com.
We don't have published figures for visitor-to-subscription conversion, but our own experience with clients has been between 1% and 8%. Our own site has consistently hit 15% for six months, though its traffic is very well targeted and our methods very well tested.
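For clarity, the rates quoted above are simply conversions divided by unique visitors, expressed as a percentage. A one-line sketch with made-up visitor and order numbers:

```python
def conversion_rate(conversions: int, visitors: int) -> float:
    """Conversion rate as a percentage of unique visitors."""
    return 100.0 * conversions / visitors

# Hypothetical month: 12,000 unique visitors, 276 sales.
print(f"{conversion_rate(276, 12000):.1f}%")  # prints "2.3%"
```

Remember that with no agreed industry standard, two tools can disagree on what counts as a "visitor" in the denominator, which is why the benchmark figures above are only a rule of thumb.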
How do you go about consistently improving conversion?
This is the million-dollar question. What it really boils down to is treating web marketing as a science. We do it by consistently measuring how people use a website. Over time you learn what works and what doesn't, and you stop wasting your time on the things that don't.
First we look at the technical side of the website. It's amazing how many sites ignore the thousands of people who don't use Windows XP with Internet Explorer at a screen resolution of 1024x768. Make sure you develop something that works for everyone.
Next we look at where the traffic comes from, which lets you concentrate your efforts where you have the best chance of generating converting traffic. Then we work on reducing the average website bounce rate: the lower the average bounce, the more people are actually surfing your website and seeing the value of your offer, and the more people who see your offer, the better the chance of a sale. Checking bounce rates also usually turns up some juicy problems to solve.
Then we test and improve copy and graphical content: running split tests, measuring bounce rates on copy, or simply testing the click-through on links. We do much more, but the basic premise is this: test and measure, follow up with experimentation, then test and measure again. Sounds like science class, doesn't it?
In part three of this series we'll look at where traffic arrives from and how that affects conversion, specific search engine queries, PPC issues, and other general topics. To summarize: if you begin to scientifically measure and improve your websites based on facts and findings, not guesswork and theory, you will begin to improve your conversion rates.
Steve Jackson is CEO of Aboavista, editor of The Conversion Chronicles and a published writer. You can get a free copy of his e-book sent to you upon subscription to the Chronicles web site (http://www.conversionchronicles.com).