Site speed, mobile responsiveness, SSL and more - they all impact your site’s search rankings.
One of the most discussed, yet most misunderstood, topics we come across in the tech industry is Search Engine Optimization (SEO), with its promise of high rankings on the most popular search engines.
There are hundreds of agencies, some of which will promise everything from making your pages rank in the top 3 on Google, to guaranteeing your site a certain number of visitors from search results. If anyone approaches you with that type of pitch, run away! No one can guarantee what your page’s ranking will be or how many people will see your site’s listings. What a great SEO team can do, however, is ensure your site is prepared to be seen by the search engines and has the best chance of being understood and recognized as valuable information.
To better understand how the most popular search engine works, it’s important to first understand its mission statement:

> “Organize the world’s information and make it universally accessible and useful.”
Google and the other popular search engines have never released their formulas for how they rank pages. Instead, they provide guidance to help us understand what their tools look for when indexing a site, and what might need to change to give a page a bump in overall relevancy. In Google’s world, they are trying to ensure the pages they show in their listings give people the best information they are looking for, as easily as possible. This means ensuring that the information on the sites they index is both useful and easy to read.
This single topic covers everything from how to structure content on your pages, to properly using metadata (descriptions, keywords, etc.), to ensuring your site utilizes robots.txt and sitemap files and delivers pages in a timely fashion. As this topic is so large, we will split the basics into a short two-part series. If you understand these basics, you should be well positioned to start improving your rankings and asking the right questions of your marketing/SEO/development teams.
As our usage of the internet has changed over the past 30 years, so have the tools used to access it. Rather than sitting down at the family desktop (if you had one) in the living room, dialing up your local Internet Service Provider and waiting a few minutes for your favorite website to load, we are now presented with a variety of ways to access the information that enhances our lives, including phones, tablets, cars and even appliances in our kitchens.
Responsive Design is a methodology in which, as your screen gets smaller, the content on the page adjusts in size and layout to ensure the critical information remains easy to read and interact with. Recently, search engines have begun giving ranking increases to sites which utilize this approach.
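As a minimal sketch, that “adjust as the screen gets smaller” behavior is often driven by CSS media queries. The class name and 600px breakpoint below are hypothetical examples, not a standard:

```css
/* Side-by-side columns on wide screens. */
.content {
  display: flex;
  gap: 1rem;
}

/* Stack into a single column once the viewport is 600px or
   narrower - a common phone-sized breakpoint (exact value varies). */
@media (max-width: 600px) {
  .content {
    flex-direction: column;
  }
}
```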
To ensure the largest base of users can see your website, it’s vital that it can be viewed on a variety of screen sizes and device types. Since Google’s users may be coming from a variety of devices, it makes sense that Google would want to put sites which are more universally usable at the top. We’ve discussed this topic previously, and I’d highly recommend reading more in these other blogs: What is it? and Mobile first indexing.
Robots.txt and Sitemaps
Every site should ideally have both files in use. The robots.txt file is used to tell a search engine some very basic rules about how to index your site. It can ensure certain files/folders are included or excluded. It should also include a link to your sitemap to make sure it isn’t missed by the search engine. While a sitemap is conventionally named sitemap.xml, it can technically be named anything, as long as it is referred to correctly in your robots.txt file.
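A minimal robots.txt might look like the following. The domain and excluded paths are hypothetical examples; the last line is the sitemap reference mentioned above:

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Point crawlers at the sitemap so it isn't missed
Sitemap: https://www.example.com/sitemap.xml
```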
A sitemap is a complete listing of all pages on your website which a search engine should index/rank. While search engines do crawl a website to find pages and understand how they are linked to one another, not all pages are linked throughout the site. For example, you may only show the most recent twenty news items on your site, but you want to ensure Google is aware of all news items (archives).
The sitemap is also a way to tell the search engines how frequently each page changes and how important you believe the page is relative to the rest of the pages on your site. While we typically don’t recommend changing these values on a regular basis, it can be helpful to lower the priority of older content which won’t be changing in the future, versus a landing page which may update daily. Google will not necessarily re-index pages based on this number, but it will use it to determine which pages to tackle first.
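Putting that together, here is a minimal sitemap fragment contrasting a daily-updating landing page with an archived news item. The URLs and exact priority values are hypothetical examples:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Landing page that updates daily: high priority -->
  <url>
    <loc>https://www.example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- Archived news item that will not change again: low priority -->
  <url>
    <loc>https://www.example.com/news/archive/old-announcement</loc>
    <changefreq>never</changefreq>
    <priority>0.2</priority>
  </url>
</urlset>
```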
Once you believe your robots.txt and sitemap are set up correctly, it's important to check them using publicly available tools. This will ensure the files are publicly visible and correctly formatted. Some of those tools include (but are not limited to):
- Google Search Console Coverage report
HTTP vs HTTPS
Another major push over the past 5 years by the major tech companies has been to make the internet more secure. The easiest way to do this is to install an SSL certificate, which encrypts (scrambles) the content sent between a web server and the end user requesting the webpage. At first, this was only seen as vital for anything sensitive (e.g., banking, e-commerce or medical data), but as users’ data becomes more valuable and identity theft more rampant, it became clear more of the internet needed to adopt this easy-to-utilize tool. End users are also becoming more web savvy and have begun looking for the trusted “lock” symbol or green highlight in the address bar to know the page they are on takes their security seriously. The most basic certificates are very inexpensive, and in some cases (depending on your requirements and hosting capabilities) can be free (LetsEncrypt). The search engines have also begun giving a ranking boost to sites which not only have an SSL certificate installed, but also redirect all of their non-SSL traffic to the SSL version of their site.
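That non-SSL-to-SSL redirect is typically a one-time server configuration. Here is a minimal sketch for nginx (the domain is a hypothetical example; Apache and other servers have equivalent directives):

```
# Catch all plain-HTTP requests on port 80...
server {
    listen 80;
    server_name example.com www.example.com;

    # ...and permanently redirect them to the HTTPS version,
    # preserving the originally requested path and query string.
    return 301 https://$host$request_uri;
}
```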
The speed of your website’s pages is also a vital metric that search engines take into account when ranking your content. The last thing they want is for a page in the first few listings to take 20 seconds to load, or to no longer be available. Internet users have come to expect pages to load nearly instantly; the common rule of thumb is that a given page should take no longer than 3-4 seconds to load.
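As a quick sanity check against that 3-4 second rule of thumb, you can time a full page fetch yourself. This is a minimal sketch using only the Python standard library; it measures raw download time, not full browser rendering, so treat the result as a lower bound:

```python
import time
import urllib.request

def measure_load_time(url: str, timeout: float = 10.0) -> float:
    """Fetch `url` and return the elapsed seconds to download the full response body."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read()  # pull the entire body, not just the headers
    return time.perf_counter() - start

# Usage (hypothetical URL): measure_load_time("https://www.example.com/")
```

For real-world numbers, tools like Google PageSpeed Insights also account for scripts, images and rendering, which this raw-download check does not.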
While this isn’t a complete list of site-level items to take into consideration, if you focus on these primary topics, your site will be in much better shape than it previously was, and it’s likely to be seen more favorably in the eyes of Google and the other popular search engines out there. If you have any questions about these topics or would like to discuss how we could help your site achieve its highest potential, please don’t hesitate to reach out to us.
Have questions or comments about this post? We'd love to hear from you.