Get website metrics with Alexa and measure the popularity and engagement of websites.
You may be familiar with Alexa’s free browser app and free web metrics, but there is much more. Alexa gives you information that you simply can’t find in Google Analytics. Granted, Google Analytics is a free service while Alexa is a paid subscription.
This article will highlight types of data provided by Alexa. If you are familiar with the information given in the free version of Alexa, use the menu below to jump to Premium Service information.
Measure Websites with Free and Premium Alexa Tools Menu
About Alexa and Measuring Websites with the Free Tool
Alexa started out in 1996 and was acquired by Amazon in 1999. You may have heard of the “Alexa Rank.” This number represents the popularity of a website compared to all other sites online. If your site has an Alexa Rank of 1,345, then 1,344 sites get more traffic. Don’t focus on that; focus on the fact that you worked hard to get where you are now.
According to Netcraft, an Internet research organization, there are currently 644 million active websites. Take that into account when assessing your site’s popularity.
Also, I have to clarify that getting to the top 10,000 is not easy and takes a lot of work (I haven’t done it yet). Alexa reports data on approximately 30 million websites. Most of them aren’t very good (Sturgeon’s Law).
Quick Access to Alexa Rank
Wouldn’t it be nice to know the popularity of a site at first glance? Thankfully, there is a Chrome Web Store extension by Alexa that does exactly that. This extension runs on over 400 thousand browsers, a number that demonstrates the magnitude of data Alexa is aggregating.
That said, the count of individual users could be much smaller, since many people have multiple computers with various browsers installed.
Before the extension can be used, you have to accept the terms and enable tracking. Now, your data will be factored into Alexa’s immense data set.
If you pay for Alexa service, you can add a tracking code to your site to get accurate metrics. Additionally, you can choose whether or not to provide your information to the public data set. As you may have guessed, sharing is the default option. The result is a gigantic sample size of the Internet community, which is why many people respect the authority of Alexa Rank. Granted, it’s just a subset of the web community, and some people question its validity. What I say to those people — It’s better than nothing.
Back to Measure Websites with Alexa Menu
Free Alexa Tools
If you decide to try out the Alexa Chrome Extension, you’ll notice that it gives you more than just a solitary metric. Visit any site and click on the extension icon in the upper-right part of your browser. The icon looks like this.
The above icon has a full bar because I was visiting Facebook, which has an Alexa rank of 2, second to who else but Google. If I visit a site with a rank closer to 150K, the bar will only be partially filled. Alexa rank changes over time, as do many other metrics that Alexa provides. Every metric in this article represents data from mid-April 2015.
When you click on the Alexa icon, you will see additional information:
- The Global Alexa Rank
- Country Traffic Rank – Country with the Highest Amount of Traffic – You might see the United States, while at other times India or maybe Canada; it depends on which site you’re visiting.
- If you click on this metric, you’ll see the Alexa site with a graph that shows popularity over time. Here is Huffington Post’s popularity graph.
- Sites Linking In – Number of domains with links to the site in question.
- Note – Each listed domain may have more than one link
- They have a large number of backlinks, and this helps Huffington Post get many referral visits.
- Google’s page rank mainly considers the number and quality of a site’s backlinks, which helps HuffPo get organic traffic.
- The free version shows you backlinks from the top 5 Alexa Ranked websites.
- Updates happen on a weekly basis, compared to daily for those that have Premium and have certified their websites’ metrics.
- They only count links from sites that have “measurable traffic”. Additionally, the less traffic a site has, the less likely Alexa is to crawl it.
- They also don’t show URLs from “adult sites” but these URLs do factor into the “total sites linking in” count.
- Site Speed – Measured Page Load Time with a Relative Percentage Grade
- The Huffington Post is relatively slow, at 2.54 seconds (81% of sites are faster).
- Wayback machine – Link to see how this site looked on any day during its existence
- Alexa’s Extension has a link to this service, so we won’t go in-depth here. I will state that this is an Archive.org project that keeps a record of how the internet looks and changes through its evolution.
- Search Analytics – Link to Alexa’s website providing top 5 organic queries and the percentage of total organic traffic that they represent
- To see more of them, you’ll have to upgrade to Premium.
- Audience Geography – Accessed when you click on the global or country Alexa rank. The link leads to Alexa.com where you can scroll down and see the top 5 countries for this site ranked by percent of visitors. Next to that you will also see the popularity rank in that country. Here is the profile for HuffPo:
- Search Traffic Chart displays the change in organic search traffic to the site over time. Once again here is what the free version will show you.
- Upstream Sites gives you an idea where people came from before visiting this site:
- Audience Demographics can illuminate what kinds of people visit this site
- Additional Free Data (if collected)
- A list of the subdomains related to that domain as well as the percent of visitors that go there
- A Logo, Site Description, and a Contact if that info exists
- List of related sites and categories with related sites
- Here is a list of the sites that are similar to HuffPo: Talking Points Memo, Daily Kos, WSJ, Washington Post, TMZ
Back to the Measure Websites with Alexa Menu
Measuring Websites with Alexa Premium
I happen to subscribe to Alexa premium, partly to provide consulting to various clients and also to understand more about the websites that I visit. There is another section in this article where I discuss how I use Alexa to learn about my traffic. More on that later. You don’t need to have an Ad Agency or a full-service SEO company to discover more about your competition.
There are tiers of service and the two features that I value the most happen to be in the second highest tier. EDIT (Alexa recently changed some of the plan features)
If you’re interested in expanding your knowledge of available tools, there is an opportunity to try this out without a financial obligation. You do have to provide your credit card info, and they will charge you if you fail to cancel within the first seven days.
Let’s imagine that we are doing a competitor analysis on HubSpot because I own Pardot (acquired by Salesforce in mid-2013). Pardot (marketing automation/CRM tools) has an Alexa rank of about 2,500, while HubSpot commands a rank of 624. First, let’s learn more about HubSpot.
Back to the Measure Websites with Alexa Menu
Using Alexa to Analyze Your Competitor
When first looking at Hubspot, I can quickly ascertain several things. I will avoid putting screen captures of items you have seen already.
Global Rank 624 and US Rank of 393
Estimated Unique Visitors by Country
- Hubspot had an estimated 2 million visitors in the United States over the last 30 days (updated on a daily basis)
- Followed by the UK at 133K
- Then Canada at 124K
- Then Spain with 42K
- Unique Visitors by Country is a data lab feature, which means it is currently in the process of improvement by Alexa’s data scientists.
- NOTE: Not all countries are available for this metric
Audience Geography – Alexa Rank by Country sorted by percent of total traffic:
- The US at 39% of visitors with a country rank of 393
- India at 18% with 422
- The UK at 4% with 484
- Canada 3.7% with 349
- Brazil at 3.4% with 780
Engagement Metrics – The rate of change is highlighted in green or red, representing the variation from the previous three-month period.
- Bounce Rate – 38%
- Daily Pageviews per Visitor – 5.4
Organic Search – You can export this list of keywords and sort it in Excel. I often do this for sites with very long lists of keywords. In this example, there are 260,319 keywords, with only 50 shown per page, sorted by percent of search traffic by default. However, you can sort by popularity or competition once you have the list in Excel.
Top 15 Keywords Driving the Largest Portion of Search Traffic to Hubspot – Knowing the organic keyword profile for a competitor is akin to understanding part of their communication with potential clients. It’s even better because it is the essence of how people find their service (applies to search engine traffic only). Hubspot has word-of-mouth working hard for them.
It’s important to realize that Hubspot is well-known for the quality content of their blog. They acquired this status by creating content that can benefit the reader even if they don’t become a client. Additionally, they are leveraging their landing page service as a hot keyword item.
- Inbound Marketing is 2% of Search Traffic
- Hubspot Blog – 1.3%
- Hubspot CRM – 1.2%
- Hubspot Pricing – 1.1%
- Hubspot – 0.87%
- Hubspot Case Studies – 0.73%
- Marketing Automation – 0.71%
- What is Hubspot – 0.71%
- Landing Page Examples – 0.7%
- Hubspot Inbound Marketing – 0.6%
- Best Landing Pages – 0.52%
- Hubspot Support – 0.5%
- Hubspot Academy – 0.48%
- Sidekick Hubspot – 0.47%
- Hubspot Certification – 0.47%
Top 5 Keywords with the Highest Competition – These high competition keywords tend to drive a smaller portion of traffic
- Marketing Automation Software is 0.25% of Search Traffic
- Hubspot E-commerce is 0.21%
- Hubspot Integration is 0.16%
- Inbound Marketing Strategy is 0.15%
- Hubspot SEO is 0.11%
Back to the Measure Websites with Alexa Menu
Using Alexa Site Audits to Analyze Your Website
After you sign up for a premium account and certify your site, you will have access to “My Dashboard” in the Alexa menu. This dashboard displays the critical metrics that quickly highlight the health of your site:
- Global Rank – Your relative Global Popularity
- Rank in the US or whichever country your visitors mainly come from
- Unique Visitors with percent change from previous day
- Pageviews with percent change from the last day
- Uptime – This is a good place to verify that your host is delivering on their promise. If you are managing a physical server, you can quickly get a view of how you’re doing.
This section is useful to check against Google Analytics. Use it to figure out how many unique visitors you got and how many pages they saw on average. Alexa claims to provide 100% accurate results as long as the certification code is on all the pages. Go to “view status” in your dashboard to verify correct code implementation.
Site audits can reveal easy tweaks that may improve SEO as well as reveal serious problems that need immediate attention. Find your site’s grade at the top of your audit report.
SEO Site Audit
The periodic SEO audits are broken up into topics and sections. First we’ll cover the topics, each of which has a colored circle next to it.
A red circle tags an issue with your site. The following are potential issues, each of which may be marked with a red or green circle.
- Duplicate Title Tags – Search engines use the <title> tag as the search engine result title. This is the first impression you get to make on your potential organic search visitor. Make sure that it accurately reflects what the visitor will see if they click the link. These titles must be descriptive and unique. Having duplicate title tags tells search engines that your site isn’t well put together. Ask yourself – “If I saw this title on a SERP (search engine results page), would I click it?”
- Missing Title Tags – Based on the description above, you probably realize that this tag should never be blank.
- Long Title Tags – The golden rule is 65 characters or fewer. Every character in a title tag past the 65th will not be shown on the SERP anyway; what you will see instead is “…” Studies have shown that longer title tags and URLs have lower CTR (Click-Through Rates) on the SERP.
- Multiple Title Tags – Each individual URL on your site should have one unique title tag. Simple as that.
- Hostname – An example of different hostnames involves how the URL is typed.
- http://www.yoursite.com is a different hostname from http://yoursite.com
- Both of these hostnames should point to your site.
- If you don’t have all your hostnames pointing to the same place, you end up sending your traffic to different versions of your site. This is very bad for SEO.
- Reachability – This is a user experience issue. If it takes me more than 4 clicks to get to an important page on your site, Alexa will suggest doing something about this. The key takeaway here is that you don’t want visitors struggling to find what they’re looking for.
- Additionally, if a link to one of your pages is buried more than 4 pages in, there is a chance search engine bots won’t crawl that far and the page won’t get any organic traffic.
- Redirects – Imagine that you published an article with the following URL — http://www.example.com/using-alexa-like-a-boss — and a week later you changed it. Well, if Google already indexed that URL, you have just created a broken link. The solution is to redirect — http://www.example.com/using-alexa-like-a-boss — to the new URL. That way you’ll continue getting organic traffic from the first listing.
- If you use too many redirects, search engines (especially Google) may view your site as spammy. You definitely don’t want this because it may result in some or all of your pages getting dropped from the SERP. The best practice is to keep URLs the same after publishing; if you must change one, use a 301 permanent redirect. Take a look at this Moz article for more on types of redirects.
- Anchor Text – Anchor text is viewable by the site visitor while a URL hides beneath. For example, in the redirect section, I provided a link to types of redirects which is a link to https://moz.com/learn/seo/redirection.
- The HTML for this looks like this:
- <a href="https://moz.com/learn/seo/redirection" target="_blank">types of redirects</a>
- Here, “types of redirects” is the anchor text and https://moz.com/learn/seo/redirection is the destination URL.
- Interlinking – You want to use semantically relevant language when linking externally and internally. The anchor text provides users and search engines more context about the destination. A well thought out interlinking structure can boost your organic traffic by helping you rank for more keywords.
- Don’t be an Internet Noob – Never use vague or generic anchor text like “click here” or “link”. Instead, find a good way to describe the destination.
- Broken Links – You shouldn’t have links that lead to a 404 not found error message. This happens when you change published URLs, delete media from your media library, or delete files from your server. Any links to these deleted resources will be broken links. There are tools that help manage this like the Yoast WordPress Plugin integration with Google Search Console.
- Dead End Pages – Just like an alley with no exit, you leave your visitor only one choice: turn around and leave the way they came, in this case via the browser’s back button.
- In general, modern sites usually have headers and footers with navigation menus, so this shouldn’t happen to you. You might decide to use a “squeeze page” with a form and no other links, which may raise this red flag.
- Page Not Found – When someone tries to visit a page on your site that doesn’t exist you want to have a page to tell them that, instead of a generic error. Some sites customize them to be cute/quirky or clever.
- For example, if I type moz.com/alphabet into my browser’s URL bar, I will see the Roger is totally lost page.
- I still need to create a better one, but here is my page not found message. I like having a search bar on the page not found URL so that the visitor can search for what they were looking for.
- Long URLs – URLs shouldn’t be longer than 128 characters because the rest may get trimmed by Google on their SERP. This section will identify any offending URLs.
- Duplicate Content – Simply put, publishing the same content on multiple URLs is asking for poor organic traffic. If you must have duplicate content pages, add a <link rel="canonical" href="…"> tag pointing to the authentic original. This way Google knows which version to index; Google doesn’t want to show multiple results leading to the same content.
- Duplicate Meta Description – The meta description is the snippet of text that shows below your title in the SERP. Duplicate meta descriptions confuse search engines and should be avoided. Search engines use this metadata to figure out what your site is about.
- Too Many Links – Having too many links on one page is a red flag for Google, as it is reminiscent of link farms and other black-hat techniques. If you must have pages with more than 100 links, keep them to a minimum and never go over 1,000 links. There is much controversy about the specifics, but to err on the side of caution, you can use rel="nofollow" to make sure you aren’t giving away all your PageRank (“Google juice”).
- Server Errors – You might face server-side errors, for example — specific pages being requested but not being served. Use this section to identify the offending URLs.
- Robots.txt – This is a file placed in the root directory of your server that tells crawlers which parts of your site they may or may not crawl. It’s important to look at this file if you are having indexing issues. Alternatively, the Google Search Console will alert you about any robots.txt problems.
- Session IDs – Session IDs are usually associated with websites that have user accounts. Search engine crawlers can have trouble understanding URLs with session IDs, which may lead to decreased SEO. If you are having trouble being indexed due to session IDs, look up the documentation for PHP, Java, or ASP.NET, depending on your user account solution.
- Search Engine Marketing – This section shows queries that resulted in your links being displayed on the SERP. Additional metrics shown are popularity and competition of queries. In this case, competition relates to paid ads and how intense the bidding is.
- On-Site Links – Each URL will have a number of links pointing at it. Use this to identify URLs you want to increase the number of links to. Why do this? URLs that have more links may be favored by search engines. The assumption: more important pages will have more links pointing to them.
- Low Word Count – Google is all about providing high quality and relevant search results and ads. Pages with low word counts are less likely to get a high level of organic traffic. Therefore, you should identify the URLs with low word count and expand them. That is if you want them to rank well.
- Image Descriptions – Like the anchor tag, the image description tag provides search engines more insight into what the page is about. Using this tag wisely can boost the organic traffic for that page. Many people search using Google images search and by adding a description you increase the chances that people find your site.
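Several of the checks above (Broken Links, Too Many Links, On-Site Links) start from the same primitive: collecting every link on a page. Here is a minimal offline sketch using only Python’s standard library; the HTTP status check for each collected URL, which is what actually detects a 404, is left out.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag encountered in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all link targets found in an HTML string."""
    parser = LinkCollector()
    parser.feed(html)
    return parser.links
```

From here, `len(extract_links(html)) > 100` is a quick proxy for the too-many-links check, and requesting each collected URL flags the broken ones.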
Report Stats
Below all these attention items is a statistics report about the crawl that was executed to gather this data. The stats include:
- Total Crawler Requests
- Total Files Crawled
- Total HTML Pages Crawled
- Domain.com Pages Crawled
- www.domain.com pages crawled
- Total Off-Site Links Crawled
- Total 4xx or 5xx errors Found
- Unique Hosts Crawled – Check this section to make sure there aren’t any domains you don’t recognize. Link spam in comments can cause some undesirable outbound links from your site.
SEO is a complex and ever-changing field. One of the first documents that I used to become familiar with SEO is the 2010 Google Search Engine Optimization Starter Guide. I recommend saving it to your desktop and referencing it regularly to start ingraining the various concepts. Remember, baby steps turn into a marathon stride with enough patience; it takes time and continuous effort. Apart from the SEO section, the site audit checks additional items of importance, so let’s move on to those other aspects of your web property.
Landing Page Auditor
Enter any URL and a target keyword to assess the quality of the page’s search engine optimization. This tool scans the URL and looks for the target keyword in important places such as:
- Title Tag
- H1 Tag
- Page Text
- In URL
- In Meta Description
- Anchor Text Analysis
- Number of Links
- Number of Unique Anchor Text Phrases
- Percent of Links with Keyword in Anchor Text
By analyzing some of your URLs, you can figure out how to tweak the content so it will rank better in organic search. Just remember, once you publish, don’t change the URL and if you must, use a 301 redirect.
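The placement checks in the list above are mechanical enough to sketch. In this illustration the page is a plain dict whose field names are my own assumption, not Alexa’s format; hyphens are normalized so a keyword can match inside a URL slug.

```python
def _norm(text):
    """Lowercase and treat hyphens as spaces so '/landing-page' matches."""
    return (text or "").lower().replace("-", " ")

def keyword_placement(keyword, page):
    """Check which important places on a page contain the target keyword.

    page: dict with hypothetical keys 'title', 'h1', 'url',
    'meta_description', and 'text' (these field names are assumptions).
    Returns a dict of field -> True/False.
    """
    kw = _norm(keyword)
    fields = ("title", "h1", "url", "meta_description", "text")
    return {field: kw in _norm(page.get(field)) for field in fields}
```

Any field that comes back False is a candidate spot to work the keyword in, subject to it still reading naturally.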
Performance
When it comes to website performance, you should start with the big picture metric. As you may have guessed, it’s the all-important Page Load Time. The speed of your site affects the UX (User Experience) which influences your conversion rates and how many of your pages are indexed. It is wise to make sure your site is fast. Alexa goes a little further than many other tools out there and shows you:
- Median Page Download Time – Example Metric – .32 seconds
- 90% of Pages Loaded Within – Example Metric – .372 seconds
- 99% of Pages Loaded Within – Example Metric – .503 seconds
- 99.9% of Pages Loaded Within – Example Metric – .504 seconds
These speeds aren’t the fastest, but they are quite good. This was before I took advantage of W3 Total Cache for file compression, browser caching, and more.
They also provide a list of all your indexed pages with their respective load times. A smart approach is to pair this report with your most popular pages and start optimizing the ones that are both popular and slow to load. SEO is a work in progress (WIP), and priorities are important. Doing this regularly matters because Google has explicitly stated that page load speed is a factor in its ranking algorithms.
Security
Insecure forms are a significant problem for visitors’ privacy. If you want your visitors to complete a form, you also want to have a secure connection. The standard today is to use HTTPS with Transport Layer Security (TLS), which is often mistakenly called SSL (Secure Sockets Layer). SSL is TLS’s predecessor and still better than nothing. The TLS encryption protocol creates a secure connection so that emails, passwords, and other sensitive information can’t be intercepted. Additionally, in August 2014 Google officially announced that secure sites get an SEO bump.
Meta information can be used to identify the software running on the back end of your website. If these get flagged, you will want to find a good way to hide that information. When a hacker can identify your software, you make their job much easier, and if you are running outdated versions of Apache, PHP, Nginx, and/or WordPress, you make it even easier for someone to find an exploit.
HTML Tags
The Web Analytics section shows what percentage of your web pages have Alexa code on them. If you have partial coverage, download the report of all the URLs and add the code to pages that need it. As an example, this site doesn’t have full coverage because I don’t need to measure engagement on my admin login page.
Social Sharing tags enable visitors to conveniently share the page with their social network. If your site is missing these, you will want to reconsider your decision. Your organic search rankings can be positively influenced when your content is shared.
Reputation
When Larry Page and Sergey Brin created the PageRank algorithm, they mimicked the way academics cite each other’s work. When a paper is cited many times, it speaks to the quality of that work. The same applies to websites. That is why you want to do what you can to get high-quality sites to link to your site.
This section will rate your site in comparison to the internet community. Much like standardized tests your score will be in a certain percentile of performance.
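The citation idea translates directly into code. Below is a minimal power-iteration sketch of PageRank, purely to illustrate the concept; Google’s production algorithm weighs many more signals than raw link counts.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank by power iteration.

    links: dict mapping page -> list of pages it links to.
    Returns a dict of page -> rank; ranks sum to 1.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}
    for _ in range(iterations):
        # Every page starts each round with the "random jump" share.
        new = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            if outs:
                # A page passes its rank evenly to the pages it links to.
                share = damping * rank[page] / len(outs)
                for target in outs:
                    new[target] = new.get(target, 0.0) + share
            else:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank
```

In a three-page graph where both `a` and `b` link to `c`, `c` ends up with the highest rank, exactly the "many citations" effect described above.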
Audit Crawler Report
While the Alexa bot crawls your site, it may encounter problems, and this is where you find out about them. One example of what can go wrong is a connection timeout.
Back to the Measure Websites with Alexa Menu
Alexa Certified Website Metrics for your Website
In addition to all this great data, you can use Alexa as a site metrics platform. This data is essentially the same data that Google Analytics collects, with the difference being:
- Automatic Filter for Robots and other Machine Traffic
- Intuitive User Interface
- Easy Date Range over Date Range Comparison
After you certify your site and give it some time to collect data, you will be able to view visitor engagement stats for your site by clicking the Site Metrics link on your dashboard.
Once there you will have a series of options. The first item in the list is the Overview, which contains a graph for Unique Visitors, Visits, and Pageviews.
Counting Real People vs Bots
The purpose of counting unique visitors is to count real people as opposed to crawlers and other machine generated traffic. Additionally, if other websites use content from your server such as widgets or ads, these visits don’t get counted by Alexa either. Alexa only counts traffic when your URL is in the browser address bar.
Intuitive User Interface
Below the overview section, you will see intuitively titled sections. If you want to see organic search traffic, click on Search. If you want to see social network referrals, click on Social. It is a bit easier for a newbie to process than the detail-oriented Google Analytics dashboard.
Uptime
Great section to verify that your host is coming through on their promise.
Thus far with Bluehost I have had 100% uptime, and you should expect no less from your host. If your site experiences an outage, this section will identify the date and time, which is useful if you need to reach out to your host to handle the problem.
If your site goes “down,” which is different from a plain outage, it often indicates a problem with the software or hardware running the site.
Alternatively, your site could be unreachable which would mean that there were network connection problems at the location where your server resides.
I don’t think I’ll have problems with this since BlueHost is in Provo, Utah, which is a Google Fiber city.
If you are looking to set up your own website and want to use BlueHost please support this site by using our BlueHost affiliate link.
Search, Social, Link, and Direct
These sections identify the type of traffic that you get.
Search consists of visits from search engines, Social is social media referrals, and Links are backlinks on other domains.
The links section also has a list of the referral counts and the percent of the total link referrals. Another useful element is that you can change top link referrals to be listed by the exact URL instead of just the domain.
Engagement
Engagement is essentially made up of three metrics.
- Bounce Rate
- Pageviews per visit
- Minutes per visit
These are intuitive metrics and incredibly useful to track, because you want visitors to stay on your site longer, read more, and visit more pages. This is where you track whether you are improving.
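Given raw session data, all three metrics fall out of a few sums. In this sketch each session is a `(pages_viewed, minutes_on_site)` pair; the input shape is my own choice for illustration, not Alexa’s data model.

```python
def engagement(sessions):
    """Compute the three engagement metrics from session tuples.

    sessions: list of (pages_viewed, minutes_on_site) pairs.
    A bounce is counted as a single-page session, the usual definition.
    """
    visits = len(sessions)
    bounces = sum(1 for pages, _ in sessions if pages == 1)
    return {
        "bounce_rate": bounces / visits,
        "pageviews_per_visit": sum(p for p, _ in sessions) / visits,
        "minutes_per_visit": sum(m for _, m in sessions) / visits,
    }
```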
Subdomains
This section is useful if you have subdomains on your server. Maybe you host an app, a forum, and blog, and you have them on different subdomains. This way you can track pageviews and percent of pageviews for each separately.
Mobile
View your mobile visits by platform or device. The platforms that Alexa tracks are:
- iOS
- Symbian
- Blackberry
- Windows
- Android
- Other – Everything else
The Devices are divided into Phone and Tablet.
Top Content
This is an incredibly useful section because it lists a URL on your domain, shows the bounce rate, and the number of pageviews within your selected Date Range.
The best part is the percent change section which shows you whether a specific URL is becoming more or less popular for the current date range over the previous.
Additionally, you can distinguish between entry pages and viewed pages. When you select entry pages, you will only see pageviews that were the first in a session. In other words: of all the pages that brought in visitors, which brought in the most?
Location
This section shows a world map and a list of countries where your visitors reside. The list is automatically sorted by the number of visits and each country has a listing for its percent of visits.
As of now, over the course of this site’s existence, 80% of the traffic came from the United States while Russia is in second place with 3%.
I have to assume that even Alexa’s algorithms don’t remove all of the referral and ghost spam that occurs.
Date Range over Date Range Comparison
One of my favorite features of the Alexa Site Metrics interface is the Date range selection. Here are the options.
The nice thing with selecting a date range is that the graph will show that date range and the previous, superimposed on top of each other.
This site is relatively new and doesn’t yet have a lot of traffic but nonetheless, I can see progress when I view the last month over the previous. Blue line is the last 28 days while the dotted line is the 28 days before that.
Compare this to Google Analytics’ similar feature, where you have to manually select the comparison range and verify that it starts on the same day of the week as the current one; Alexa does this automatically.
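The alignment trick is simple to state in code: the comparison window is just the preceding period of equal length, and whenever the range spans a whole number of weeks (7 days, 28 days, and so on) it automatically starts on the same weekday. A sketch:

```python
from datetime import date, timedelta

def previous_period(start, end):
    """Given an inclusive [start, end] date range, return the
    immediately preceding range of the same length."""
    length = (end - start).days + 1
    return start - timedelta(days=length), start - timedelta(days=1)
```

For a 28-day view, the previous period is the 28 days immediately before it, which necessarily begins on the same day of the week.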
Don’t get fooled: Alexa has a simpler interface, but Google Analytics is much more versatile in its functionality. When complexity increases, the possibility for intuitive design usually decreases; remedying that tension is the art of UX design.
Back to the Measure Websites with Alexa Menu
Increasing Your Alexa Rank
Viewing Your Site with the Alexa Toolbar
While researching various concepts related to Alexa and the Alexa Toolbar, I found an article about black hat SEO with the Alexa toolbar. This article states that you can pad your rankings by navigating around your own website.
You probably do that already though right? On the other hand, if you spend a bunch of time engaging in routine habits that you think will improve your Alexa rank, I will have to stop you right there. Yes, some black hat techniques work, but most only work temporarily.
This technique is essentially a waste of time from my perspective, because you can use that time to create great content for your site.
You can artificially bump up your ranking by installing Tor and visiting your site from different IP addresses but do you really think that will benefit you in the long run? I believe that you are better off spending your time with forum marketing and social media rather than this boring activity.
It also depends greatly on what goals you have for your visitors. If this article highlighted new and useful concepts in web metrics for you, please click the tweet button below. To get updates for new articles regarding web metrics, please sign up for our newsletter.