The Top 10 Technical SEO Ranking Factors
Not all search engine optimization is keyword research and link building.
The technical aspects of your website, from the structure of your content down to the cold, hard code, are just as important as link building and keyword optimization, if not more so. If your site isn't search engine friendly, or doesn't adhere to Google's Webmaster Guidelines, it will be continually penalized and will never rank to its full potential.
According to Moz's 2015 Ranking Factors Study, these are the top 10 technical SEO ranking factors, in order of importance.
Hreflang Declaration
The hreflang declaration (seen as rel="alternate" hreflang="x" in HTML code) is a tag that tells Google what language your page is written in. It signals to search engines which version of a page to serve, depending on the location and language of the searcher. For example, if you have an English and a Spanish version of a page and the prospect is searching from a Spanish-speaking country, Google will choose the page tagged hreflang="es" (Spanish) over the page tagged hreflang="en", as the tag helps Google infer which version is more appropriate.
This is what the snippet might look like for the English version of a site targeting the United States, alongside the Spanish alternate mentioned above (example.com is a placeholder domain):
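<!-- English (US) version of the page -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<!-- Spanish version of the same page -->
<link rel="alternate" hreflang="es" href="https://www.example.com/es/" />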
Number of Internal Links
Internal links, or links from one page on your site to another, are important for SEO for several reasons. First, they allow you to pass authority from your highest-authority pages to your lower-authority ones. Second, they provide more paths through which Google can crawl your site. The more links from your main pages to your sub-pages, the easier it is for Google to discover these deeper pages and index them.
Use this internal link to check out our blog on backlink tools.
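In HTML, an internal link is just an anchor tag pointing at a path on the same domain. A minimal sketch, assuming a hypothetical /blog/backlink-tools path:

<!-- an internal link from one page to another page on the same site -->
<a href="/blog/backlink-tools">Check out our blog on backlink tools</a>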
URL Structure
URLs should be kept simple: short and hyphen-free. That's what Google wants, as long URLs with excessive hyphens have been shown to perform worse than short, easy ones. This makes sense: Google continues to stress user-friendliness and the user experience, and a short, easy-to-remember URL fits those criteria.
www.you-dont-want-a-url-that-looks-like-this.com
Link to Content Ratio
Google likes content; we know this. They also hate link spam; we know that too. So it makes sense that if you have a ton of links on your site but not much content, Google will think you're trying to pull some kind of link scheme and de-rank you. That's why it's good to keep the link to content ratio low, to make sure you're not raising any red flags.
Code to Content Ratio
As with the link to content ratio, the code to content ratio is best kept low. Lots of code paired with little content will likewise raise spam flags with Google, as it makes it seem as though the site isn't really being used. Excess code can also greatly hinder your page speed, which negatively affects your rank.
Google Analytics Tracking Code
According to the study by Moz, websites with a Google tracking code installed performed better than those without. Perhaps this is a signal to Google that the website is run by a webmaster who is actively involved in monitoring it, and therefore likely to be more trustworthy.
For those of you who don't know, this is what the Google Analytics tracking code looks like in HTML:
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-1337H@X0R-1']); // replace with your own UA property ID
_gaq.push(['_trackPageview']);
(function() {
  // load ga.js asynchronously, matching the page's protocol
  var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
})();
Robots.txt
The robots.txt file is important, as it tells search engine spiders like Googlebot how they should interact with the pages and files of your website. If there are pages, files, or images that you do not want Google to crawl and index, you can block them with robots.txt. Without a robots.txt file, Google will indiscriminately index everything on your site.
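To illustrate, here is a minimal robots.txt sketch; the /private/ directory is a hypothetical example of a section you might not want crawled:

# allow all crawlers, but keep them out of /private/
User-agent: *
Disallow: /private/

# optionally point crawlers at your XML sitemap (see below)
Sitemap: https://www.example.com/sitemap.xml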
URL is HTTPS
Secure websites, or websites with SSL security certificates, have been shown to do slightly better in the SERPs. This is likely a signal to Google that your site is safe, secure, and real. Once again, the better the user experience, the better you will rank.
XML Sitemap
A sitemap is essentially a map of the pages on your site. This map contains metadata and information about the organization and content of your site. Googlebot and other search engine web crawlers use sitemaps as a guide to more intelligently crawl your site. Having a sitemap can help your pages get indexed, and allows you to highlight content that you want search engines to crawl.
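As a rough sketch, a bare-bones sitemap.xml with a single entry might look like this (the URL and dates are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want crawled -->
  <url>
    <loc>https://www.example.com/blog/backlink-tools</loc>
    <lastmod>2015-09-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>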
Schema.org Markup
Schema markup is a way to change the appearance of the meta information presented about your site on search engine results pages. By using schema.org markup, the meta description under your search engine listing can be modified to present information like reviews, employee profiles, and so on. Having proper schema markup for certain information can even land your content in the Google answer box, which is one of the surest ways to drive traffic to your site.
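As a quick sketch of what this can look like, here is a hypothetical JSON-LD review snippet (the product name and rating values are invented for illustration):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "27"
  }
}
</script>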
So, if you're trying to rank your site and aren't seeing much success, these elements may be holding you back, as they are essential to earning Google's trust. Keep your site content-oriented, user-friendly, and easy for Googlebot to crawl, while simultaneously ensuring your site is search engine friendly.
The Google Index: New Webmaster Tools Feature Reveals Which of Your Pages Are Indexed
For several years, one of the most widely asked questions from webmasters has revolved around the notorious Google index and their site's place within it. Is my site included? Has it been removed? Has that new page been indexed yet? What about that other one?
Fortunately for everyone, last month Google announced its attempt to answer some of these questions by adding a new feature to its Webmaster Tools.
Found under the Health section of your Webmaster Tools account, the new Index Status report can tell you exactly how many of your pages Google has included in its index.
Initially you'll be given a graph showing the total number of URLs from your site that have been added to Google's index over the last year. Most sites will see a steady increase in the number indexed over time.
Under the advanced tab, you are given access to far more useful information: not only the total number of pages indexed, but also the total pages crawled, the pages crawled but not indexed, and the attempted page crawls which were blocked.
It breaks down as follows:
Total Indexed – the total number of URLs from the site added to the Google index.
Ever Crawled – the cumulative total of URLs on your site which Google has ever accessed.
Not Selected – URLs which Google has chosen not to include in its index, often because they redirect to other pages or contain content significantly similar to other pages.
Blocked by Robots – URLs which Google attempted to crawl but was denied access to, because they are blocked in the site's robots.txt file.
It is important to note that the figures provided are all cumulative totals: the figure for a particular day means that, at that point in time, that many pages were indexed or had been crawled. It does not mean that that number of pages was indexed on that day. This matters for older sites with a large number of pages, which may see significantly large differences between the number of pages crawled and the number of pages indexed.
But what if your graph doesn't look like those above? What if it shows spikes and valleys? Whilst a spiking and dropping graph would be the first indicator of possible indexation problems, the important thing is to assess how and when the graph spikes.
Any variations in the charts could well be explained by changes you have made to your site.
Changing your URL structure, or setting up a high number of redirects or canonical URLs, could well produce a rise in the "Not Selected" count, as well as a spike and drop in your total indexed count. Adding lots of new content that is being indexed for the first time will also cause variation in the charts.
It is important to assess any variations and see if there are legitimate causes behind them. If you have no clear idea as to why these counts have changed, that is a fairly clear indication that there are technical issues with your site which need to be addressed.
The most useful function of the new feature is to allow webmasters to identify trends and discover whether Google is indexing their content. If Google is shown to be having difficulty indexing the site correctly, this can be the first indicator that the site has technical issues with canonicalization, duplicate content, or other elements of its structure.
Only once Google reveals exactly which pages are and are not indexed, however, will this tool be able to fully solve indexation problems.