My department has installed Google Analytics on our company's website, and none of us are exactly experts at understanding why the data is the way it is.
Our company is fairly large, but I wouldn't say we are exactly a well-known company: we provide internet and Video on Demand to hotels worldwide. As of right now, since I installed our tracking code last month, we have logged over 78,000 sessions. Our average session duration is only 24 seconds, with an average of 1.18 page views per visitor and a bounce rate of 91%.
I don't doubt the average session time. My co-workers and I are just a little confused as to how, with that many visitors, we are consistently getting such a short session duration and such a high bounce rate. Could visitors possibly just come to our website, look for our phone number, and then leave the site? I'm just trying to find a way to reduce the bounce rate and hopefully increase the average session duration. Or is it possible to add a filter that will exclude visits to the site that are shorter than 30 seconds, or something like that? I apologize for asking such basic questions; I am trying to get up to speed and familiarize myself with how this all works. Just thought I'd ask and see if I am missing something important. Any advice would be greatly appreciated. Thank you!
It's hard to tell why your metrics are so poor across the board. Your referral traffic could be to blame here - possibly bad ad copy on AdWords or Bing that makes users think they're going to a different page. You can always segment by acquisition to see where the clicks and sessions are coming from. From there it'll be easier to see which source of traffic is to blame, and also how you can improve your site overall for traffic optimization and user-friendliness. For more info on the matter, reach us at RLCppc.com. Hope that answers your question.
Hi Stack Overflow community,
I'm looking for a way to fix my low bounce rate issue. I'm sure a few other webmasters have, or have had, the same issue, so hopefully this should help them as well.
Since the implementation of event tracking on the website, the analytics bounce rates have dropped. I've tried to implement the fixes suggested here: google analytics - event tracking without affecting bounce rate
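As I understand it, those fixes boil down to flagging the events as non-interaction hits, which is what I have tried (the category/action/label names below are placeholders):

```typescript
// Marking an event as a non-interaction hit, so that firing it does not
// turn a bounced visit into a non-bounce.
declare const _gaq: unknown[][];               // classic ga.js queue
declare function ga(...args: unknown[]): void; // analytics.js global

// ga.js: the fifth argument (opt_noninteraction) set to true keeps the
// event out of the bounce-rate calculation.
_gaq.push(['_trackEvent', 'video', 'play', 'intro', 1, true]);

// analytics.js equivalent:
ga('send', 'event', 'video', 'play', 'intro', { nonInteraction: true });
```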
But I still have had no luck getting proper stats on the bounce rate.
This is the website's source: view-source:http://www.quatermain.co.za/
Are you able to see what could be causing the issue?
When it comes to Google Analytics, a low bounce rate is generally a good thing for most websites.
However, in my personal opinion - bounce rate on its own is a rather pointless metric.
A high bounce rate might mean the visitors find the information they need on the landing page.
A low bounce rate might mean the visitors cannot find the information easily and navigate around a lot.
So what you need to do is set up goals for your website and track those, to figure out what visitors are doing on your site and whether they find what they need - or whether they wander aimlessly and ultimately leave before triggering the "goal", aka the "point of their visit", on your website.
So you need to analyze the visitors' navigation: what their landing pages are, where they go from there, and ultimately whether they exit from where you want them to exit.
For example: a webshop wants a low bounce rate. However, it wants that in combination with visitors triggering the goal of a sale.
Based on the website linked, I would think it is the "book now" button that's the important part of the website - the "point" so to speak.
So you'll need to measure the number of visits that trigger a booking and react to that number.
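A minimal sketch of wiring that up, assuming analytics.js is loaded and the button has a book-now class (both are assumptions about the site, not taken from its source):

```typescript
declare function ga(...args: unknown[]): void; // analytics.js global

// Fire a trackable event on every "book now" click; in GA you can then
// define a goal that matches this event and watch its completion rate.
document.querySelectorAll('.book-now').forEach(btn =>
  btn.addEventListener('click', () => {
    ga('send', 'event', 'booking', 'click', 'book-now-button');
  })
);
```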
But again, generally speaking: if your bounce rate is low, that's normally a good thing.
Take an example of a question/answer site with a 'browse' slideshow that will show one question/answer page at a time. The user clicks the 'next' button and a new question/answer is presented to him.
I need to decide which pages should be returned each time the user clicks 'next'. Some things I don't want and reasons why:
Showing 'newest' questions in descending order:
Say 100 questions get entered; no user is going to click through to the 100th item, so it'll never get any responses. It also means that if no new questions were asked recently, every time the user visits the site he'll see the same repeated stale data.
Showing 'most active' questions, judged by a lot of suggested answers/comments:
This won't return those questions that have low activity, which are exactly the ones that need more visibility.
Showing 'low activity' questions, judged by not a lot of answers/comments:
Once a question starts getting activity, it'll stop being shown. This will stymie the activity on a question, when I'd really like to encourage discussion.
I feel that a mix of these would work well, but I'm unsure of how to judge which pages should be returned. I'll stress that I don't want the user to have to choose which category of items to view (like how SO has the unanswered/active/newest filters).
Are there any common practices for doing this, or any ideas for how it might be done?
Thanks!
Edit:
Here's what I'm leaning towards so far, with much thanks to Tim's comment:
So far I'm thinking of ranking pages by Activity Count / View Count, where the activity count is incremented each time a user performs an action on a page (a vote, comment, answer, etc.), and the view count is incremented every time a person views the page.
I'll then rank all pages by their activity/view ratio and show pages with a high ratio more often. This way pages with low activity and high views will be shown the least, while ones with high activity and low views will be shown most frequently. Low activity/low views and high activity/high views will be somewhere in the middle I imagine, but I'll have to keep a close eye on this in the beta release. I also plan on storing which pages the user has viewed in the past 24 hours so they won't see any repeats in the slideshow in a given day.
Some ideas for preventing 'stale' data (if all the above doesn't seem to prevent it): Perhaps run a cron job which will periodically check for pages that haven't been viewed recently and boost their ratio to put them at the top.
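A rough sketch of the ranking, to make the idea concrete (the Page shape, the seenToday set, and the +1 smoothing are just my working assumptions):

```typescript
// Rank pages by activity per view, skipping pages seen in the last day.
interface Page {
  id: number;
  activityCount: number; // bumped on each vote, comment, answer, etc.
  viewCount: number;     // bumped on every page view
}

// Highest activity-per-view first; +1 on both sides smooths brand-new
// pages so a single early action doesn't produce an extreme ratio.
function rankPages(pages: Page[], seenToday: Set<number>): Page[] {
  const ratio = (p: Page) => (p.activityCount + 1) / (p.viewCount + 1);
  return pages
    .filter(p => !seenToday.has(p.id)) // no repeats within 24 hours
    .sort((a, b) => ratio(b) - ratio(a));
}
```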
As I see it, you are touching upon two interesting questions:
How to define whether a post is interesting to a user: here you could take a weighted combination of various factors that contribute to the interestingness of a post - the amount of activity, how fresh the entry is, whether the item matches the user's interests, and so on (a weighted score is sketched after the next point). You could pick the weights based on intuition and see how well the result matches your expectations. If you have the time and inclination, you could collect data on how well your users respond to the entries and try to learn the optimum weights for each factor using machine learning techniques.
How to give new posts a chance, otherwise known as the exploration-exploitation tradeoff.
Basically, if you just keep serving known interesting entries, you will maximize instantaneous user happiness, but you will never learn about new interesting stuff; hence, overall, your users are unhappy.
This is a very well studied problem, and depending on how deep you want to get into it, you can read up on the literature on things like k-armed bandit problems.
But a simple solution would be to not pick the entry with the highest score, but to pick entries based on a probability distribution such that high-score entries have a higher probability of showing up. This way you show interesting stuff most of the time, but every post has a chance to show up occasionally.
I'm planning on developing my own plugin for showing the most popular posts, as well as counting how many times a post has been read.
But I need a good algorithm for figuring out the most popular blog post, and a way of counting the number of times a post has been viewed.
A problem I see when it comes to counting the number of times a post has been read is how to avoid double-counting when the same person opens the same post many times in a row, and how to exclude web crawlers.
http://wordpress.org/extend/plugins/wordpress-popular-posts/
Comes in the form of a plugin. No muss, no fuss.
'Live' counters are easily implementable and a dime a dozen. If they become too cumbersome on high traffic blogs, the usual way is to parse webserver access logs on another server periodically and update the database. The period can vary from a few minutes to a day, depending on how much lag you deem acceptable.
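A sketch of what that periodic parse might look like, assuming a simplified log line format of `<ip> <epoch-ms> <path> <user-agent>` (a real Apache/IIS combined log would need a proper parser):

```typescript
// Periodic log-parse sketch: count a view once per (IP, post) within a
// 30-minute window and skip obvious crawlers.
import * as fs from "fs";

const WINDOW_MS = 30 * 60 * 1000; // repeat hits within 30 min count once
const BOT_PATTERN = /bot|crawler|spider|slurp/i;

function countViews(logPath: string): Map<string, number> {
  const lastSeen = new Map<string, number>(); // "ip|path" -> last hit time
  const views = new Map<string, number>();    // path -> deduped view count
  for (const line of fs.readFileSync(logPath, "utf8").split("\n")) {
    const [ip, ts, path, ...ua] = line.split(" ");
    if (!path || BOT_PATTERN.test(ua.join(" "))) continue; // skip bots/junk
    const key = `${ip}|${path}`;
    const t = Number(ts);
    const prev = lastSeen.get(key);
    if (prev !== undefined && t - prev < WINDOW_MS) continue; // repeat hit
    lastSeen.set(key, t);
    views.set(path, (views.get(path) ?? 0) + 1);
  }
  return views;
}
```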
There are two ways of going about this:
You could consider the individual page hits [through the Apache/IIS logs] and use that
Use Google PageRank to emphasize pages that are strongly linked to [popular posts would then be based not on visits but on the number of pages that link to them]
Using Alexa.com I can find out that 0.05% of all internet users visit some site, but how many people does that 0.05% equal?
Are there any figures like: in the US, 1% in Alexa's statistics equals nearly 15 million people, and in France 1% is about 3 million people, for example?
Compete.com reckons that Google has something like 147m monthly users, and Alexa says Google reaches about 34% of users monthly. Ergo, you could estimate the total at 147m / 0.34, i.e. roughly 450 million. That's one way of estimating...
Of course the data from both Compete and Alexa gets progressively more rubbish the smaller the site gets. Data for the biggest sites is likely to be the least skewed, but I still wouldn't trust it for anything serious.
InternetWorldStats.com puts the number of internet users worldwide at 1.6 billion.
You can get world population statistics online - estimates are available here:
Wikipedia World Population
This will help you rough out some statistics, but you need to remember...
Population is not equal to "has an internet connection"
0.5% does not really equate to 0.5% of "internet users" - it's more like 0.5% of the kind of people who would install a random toolbar that offers them very little. So you need to bear in mind that it's a certain "type" of person, and that the statistics will be skewed (which is why www.alexa.com itself isn't ranked: EVERYONE with the Alexa toolbar is going to visit that website at some point).
The smaller your website, the less accurate the statistics are. If you aren't in the top 100,000 websites in the world, the statistics become largely an anomaly, as they "estimate up" the numbers from the toolbar users into an "average as if everyone had a toolbar".
Hope this helps.
Alexa doesn't show "X% of French users use this site". Instead it shows "X% of worldwide users use this site". So you don't have that information, except in the marginal cases where 100% of a site's users are from one country.
Also, most toolbars show just the Alexa Rank. You can find an online "Alexa Rank -> Monthly Traffic" converter here - http://netberry.co.uk/alexa-rank-explained.htm
Well, here (http://netberry.co.uk/alexa-rank-explained.htm) a way to estimate traffic from the Alexa rank is described. Basically, the author has fitted an exponential function, rather than a linear or polynomial one.
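I don't know the article's actual coefficients, so treat this as the general shape only: traffic decaying exponentially in the log of the rank, i.e. a power law (the numbers below are made-up, NOT the article's fit):

```typescript
// Illustrative Alexa-rank-to-traffic estimator; both constants are
// placeholder assumptions, not values from the netberry.co.uk article.
function estimateDailyVisits(alexaRank: number): number {
  const a = 9_000_000_000; // illustrative scale factor (assumption)
  const b = 1.05;          // illustrative decay exponent (assumption)
  return a * Math.exp(-b * Math.log(alexaRank)); // = a * rank^(-b)
}

console.log(estimateDailyVisits(1_000));     // a popular site
console.log(estimateDailyVisits(1_000_000)); // a long-tail site
```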
There is also a web service which has aggregated Alexa rank information and already performed all the calculations: http://www.rank2traffic.com/
I checked it, and for 80% of websites the results are very satisfying. Still, 20% of the data is incorrect (possibly manipulated by webmasters), with estimated traffic much higher than in reality.
Of course, the best metric would be the happiness of your users.
But what metrics do you know of for measuring GUI usability?
For example, one common metric is the average number of clicks needed to perform an action.
What other metrics do you know of?
Jakob Nielsen has several articles regarding usability metrics, including one that is entitled, well, Usability Metrics:
The most basic measures are based on the definition of usability as a quality metric:
success rate (whether users can perform the task at all),
the time a task requires,
the error rate, and
users' subjective satisfaction.
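For example, if you log task attempts, all four measures fall out of simple averaging (the record shape here is made up for illustration, not from any particular tool):

```typescript
// Illustrative task-attempt record; the field names are assumptions.
interface TaskAttempt {
  succeeded: boolean;
  seconds: number;      // time to finish the task (or give up)
  errors: number;       // wrong clicks, invalid inputs, etc.
  satisfaction: number; // e.g. a 1-5 post-task survey score
}

// Nielsen's four basic measures, averaged over a batch of attempts.
function usabilityMetrics(attempts: TaskAttempt[]) {
  const avg = (f: (a: TaskAttempt) => number) =>
    attempts.reduce((sum, a) => sum + f(a), 0) / attempts.length;
  return {
    successRate: avg(a => (a.succeeded ? 1 : 0)),
    avgTaskTime: avg(a => a.seconds),
    avgErrorCount: avg(a => a.errors),
    avgSatisfaction: avg(a => a.satisfaction),
  };
}
```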
I just look at where I want users to go and where (physically) they are actually going on screen; I do this with data from Google Analytics.
Not strictly usability, but we sometimes measure the ratio of GUI code to backend code. This is for the managers, to remind them that while functionality is important, the GUI should get a proportional budget for user testing and study too.
Check: http://www.iqcontent.com/blog/2007/05/a-really-simple-metric-for-measuring-user-interfaces/
Here is a simple pre-launch check you should do on all your web applications. It only takes about 5 seconds and one screenshot.
Q: "What percentage of your interface contains stuff that your customers want to see?"
a) 10%
b) 25%
c) 100%
If you answer a) or b) then you might do well, but you'll probably get blown out of the water once someone decides to enter the market with option c).