At its heart, SEO (Search Engine Optimization) developed as an extension of web accessibility, following HTML4 guidelines to better identify the purpose and content of a document.
This meant ensuring that web pages had unique titles that properly reflected their content, that keyword-rich headings highlighted the content of individual pages, and that other tags were used accordingly.
This was necessary, not least because web developers were often focused only on whether their code worked, rather than on the user experience, let alone on following web publishing guidelines.
This slowly changed as it became increasingly well known that search engines used these "on-page" signals to build their "Search Engine Results Pages" (SERPs) - and that ranking higher on these offered an advantage: free, natural organic traffic.
Even in 2021, this is still the case, a state of affairs reinforced by the forthcoming Core Web Vitals, Google's way of signaling that it is, in effect, resetting the clock.
The internet has evolved a lot since those early days, and major search engines such as Google now process far more "off-page" information when determining their search results, not least by using semantic processing, collating user data, and applying neural networks to learn patterns, trends, and personal preferences.
Even so, the core ideals of SEO remain the same as they always have: ensuring pages carry the correct tags for targeting keywords, not just for natural search results but also for PPC (Pay Per Click) and other marketing campaigns, where call-to-action (CTA) and conversion rates are essential indicators of success.
But how does a business know which keywords to target on its sales pages? How does a website filter transactional traffic from general site visitors? And how can that business increase its ability to capture targeted traffic from across the internet? Here we list a number of tools that will help do exactly that.
SEMrush originally developed its SEO toolkit in 2008. In 2018, the company received $40 million in funding for expansion.
The keyword research tool is accessible from SEMrush's super elaborate dashboard. You can view detailed keyword analysis reports as well as a summary of any domains you manage.
More crucially, the SEO toolkit lets you compare the performance of your pages against the competition. For instance, you can analyze backlinks pointing from other websites to yours (acquiring such links is a process sometimes called 'link building').
Traffic analytics helps identify your competitors' principal sources of web traffic, such as the top referring sites. This enables you to drill down into the fine details of how both your and your competitors' sites measure up in terms of average session duration and bounce rate. Additionally, "Traffic Sources Comparison" gives you an overview of digital marketing channels for several rivals at once. For those new to SEO jargon, 'bounce rate' is the percentage of visitors who visit a website and then leave without accessing any other pages on the same site.
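For the arithmetic-minded, bounce rate is just a ratio. A minimal sketch in Python (the function name and the figures are illustrative, not taken from any tool's API):

```python
def bounce_rate(single_page_sessions: int, total_sessions: int) -> float:
    """Share of sessions that ended after viewing only one page."""
    if total_sessions == 0:
        return 0.0
    return single_page_sessions / total_sessions

# Example: 380 of 1,000 sessions viewed only one page
print(f"{bounce_rate(380, 1000):.0%}")  # prints "38%"
```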
The domain overview does much more than provide a summary of your competitors' SEO strategies. You can also detect the specific keywords they've targeted, as well as assess the relative performance of your domains on both desktop and mobile devices.
SEMrush has received many positive mentions online but has been critiqued for use of SEO jargon such as 'SERP' which may alienate inexperienced users. A 'Pro' subscription costs $99.95 per month (annual plan) which includes access to all SEO tools.
Over time, SEMrush added a few more tools to its offerings: a writer marketplace, a traffic-boosting tool, a tool set for agencies and even a white-glove service for PR agencies.
Google Search Console (GSC) is an excellent way for newbie webmasters to get started with SEO.
Even if you're not set on serious SEO, whatever the size of your site or blog, Google's laudable Search Console (formerly Webmaster Central) and the myriad user-friendly tools under its bonnet should be your first port of call.
The suite of tools gives you valuable information about your site at a glance: it can assess your site's performance, flag potential problems to troubleshoot (like spammy negative links), help you ensure your site is Google-friendly, and monitor Google's indexing of your site.
You can even report spam and request reconsideration if your site has incurred a penalty. Plus, if you don't refer to their Webmaster Guidelines now and again, well, you've only yourself to blame if you go wrong. Search Console is constantly updated, and new features are on the way, such as the new URL inspection tool and the new sitemaps report.
Help is available via the Webmasters Help Community, a place for webmasters to connect and share troubleshooting and performance tips. At the end of 2020, Google migrated its disavow links tool to Search Console and also updated its Outdated Content tool.
SEO Spider was originally created in 2010 by the evocatively named Screaming Frog. This rowdy amphibian's clients include major players like Disney, Shazam and Dell.
One of the most attractive features of SEO Spider is its ability to perform a quick search of URLs, as well as crawl your site to check for broken pages. This saves you the trouble of manually clicking each link to rule out '404 errors'.
The tool also allows you to check for pages with missing title tags, duplicated meta tags and tags of the wrong length, as well as check the number of links placed on each page.
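The checks above are straightforward to sketch. Here's a minimal, hypothetical example (not Screaming Frog's actual code) using Python's standard-library HTML parser to flag a missing or over-long title and a missing meta description, and to count the links on a page:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Toy on-page checker: title, meta description, link count."""
    def __init__(self):
        super().__init__()
        self.title = None
        self.meta_description = None
        self.link_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content")
        elif tag == "a" and "href" in attrs:
            self.link_count += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data

    def issues(self):
        problems = []
        if not self.title:
            problems.append("missing <title>")
        elif len(self.title) > 60:  # a common guideline, not a hard rule
            problems.append("title longer than 60 characters")
        if not self.meta_description:
            problems.append("missing meta description")
        return problems

audit = OnPageAudit()
audit.feed("<html><head></head><body>"
           "<a href='/a'>a</a><a href='/b'>b</a></body></html>")
print(audit.issues(), audit.link_count)
```

A real crawler fetches each page over HTTP and repeats checks like these across the whole site; the sample HTML here deliberately omits the title and meta description so both issues are flagged.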
There are both free and paid versions of SEO Spider. The free version contains most basic features, such as crawling redirects, but is limited to 500 URLs. This makes the 'Lite' version of SEO Spider suitable only for smaller domains. The paid version costs $180 per year and includes more advanced features as well as free tech support.
Majestic's SEO tools have consistently received praise from SEO veterans since their launch in 2011, which also makes Majestic one of the oldest SEO tools available today.
The tool's main focus is backlinks - links from one website to another. Backlinks have a significant influence on SEO performance, and as such, Majestic has amassed a huge amount of backlink data.
Users can search both a 'Fresh Index', which is crawled and updated throughout the day, and a 'Historic Index', which has been praised online for its lightning retrieval speed. One of the most popular features is the 'Majestic Million', which displays a ranking of the top 1 million websites.
The 'Lite' version of Majestic costs $50 per month and incorporates useful features such as a bulk backlink checker; a record of referring domains, IPs and subnets; and Majestic's integrated 'Site Explorer'. This feature, designed to give you an overview of your online store, has received some negative comments for looking a little dated. Majestic also has no Google Analytics integration.
Moz Pro is a platform of SEO tools that aim to help you increase traffic, rankings, and visibility across search engine results.
Key tools include the ability to audit your own site using the Moz Pro spider, which should highlight potential issues and recommend actionable insights. There's also the ability to track your site rankings over hundreds or even thousands of keywords per website.
There's also a keyword research tool to help determine which keywords and keyword combinations may be best to target, plus a backlink analysis tool that mixes a combination of metrics including anchor text in links and estimated domain authority.
Pricing for Moz Pro begins at $99 per month for the Standard plan which covers the basic tools. The Medium plan offers a wider range of features for $149 per month and a free trial is available. Note that plans come with a 20% discount if paid for annually. Additional plans are available for agency and enterprise needs, and there are additional paid-for tools for local listings and STAT data analysis.
Even if you don't sign up to Moz Pro, a number of free tools are available. There's also a huge supporting community ready to offer help, advice, and guidance across the breadth of search marketing issues.
Best free SEO tools
Although we've highlighted the best paid-for SEO tools out there, a number of websites offer more limited tools that are free to use. Here we'll look at the free options.
SEOquake is one of the most popular toolbar extensions. It allows you to view multiple search engine parameters on the fly and save and compare them with the results obtained for other projects. Although the icons and numbers that SEOquake yields might be unintelligible to the uninformed user, skilled optimizers will appreciate the wealth of detail this add-on provides.
Gauge details about the number of visitors and their country, get a site's traffic history trended on a graph, and more. The toolbar includes buttons for a site's Google index update, backlinks, SEMrush ranking, Facebook likes, Bing index, Alexa rank, web archive age and a link to the Whois page. There's also a useful cheat sheet and diagnostics page offering a bird's-eye view of potential issues (or opportunities) affecting a particular page or site.
Knowing the right keywords to target is all-important when priming your web copy. Google's free keyword tool, part of AdWords, couldn't be easier to use. Plug your website URL into the box, start reviewing the suggested keywords and off you go. Jill Whalen, CEO of HighRankings.com, is a fan and offers advice to those new to keyword optimization: "make sure you use those keywords within the content of your website."
However, while useful for keyword research purposes it's important to realize the numbers provided are approximations rather than exact figures, and intended to provide a guide to popularity rather than exact real-time search volume.
Yet another Google tool on this list (no surprise, is it?). Optimize is not for the faint-hearted and will make even seasoned SEO experts uncomfortable. SEO isn't all about rankings: without the right balance of content that engages your visitors and drives conversions, your earnest optimization could be wasted.
Google’s free service helps take the guesswork out of the game, allowing you to test your site's content: from simple A/B testing of two different pages to comparing a whole combination of elements on any given page. Personalization features are also available to spice things up a bit. Note that in order to run some of the more complicated multivariate testing, you will need adequate traffic and time to make the results actionable, just as you do with Analytics.
Understanding backlinks (sites linking to you) allows website owners and publishers to see what link opportunities they might be missing out on. Enter Ahrefs, arguably one of the most powerful players out there.
They maintain one of the largest live backlink indexes currently available, with over 17 trillion known links covering 170 million root domains. While Ahrefs isn't free, its backlink checker feature is; it provides a useful snapshot that includes your domain rating, the top 100 backlinks, top 5 anchors and top 5 pages - the bare minimum to give you a feel for what Ahrefs has to offer.
What is an SEO crawler?
An SEO crawler can help you discover and fix issues that are preventing search engines from accessing and crawling your site. It remains an essential yet elusive tool in the arsenal of any good SEO expert. We caught up with Julia Nesterets, the founder of SEO crawler Jetoctopus, to understand what exactly an SEO crawler is and why it is so important.
If you are a webmaster or SEO professional, few messages are more heartbreaking than a notice that your page has not been indexed. Sometimes Google's bots may ignore your content and SEO efforts and decline to index your page. But the good news is that you can fix this issue!
Search engines were designed to crawl, understand, and organize online content to deliver the best and most relevant results to users. Anything getting in the way of this process can negatively affect a website's online visibility. Therefore, making your website crawlable is one of the primary goals, and doing so can also highlight any issues you have with your web hosting provider.
By improving your site's crawlability, you can help search engine bots understand what your pages are about and thereby improve your Google ranking. So how can an SEO crawler help?
1. It offers real-time feedback. An SEO crawler can quickly crawl your website (some crawl as fast as 200 pages per second) to surface any issues it finds. The reports analyze the URL, site architecture, HTTP status codes, broken links, details of redirect chains and meta robots, rel-canonical URLs, and other SEO issues. These reports can easily be exported and passed on for further action by the technical SEO and development teams. Thus, using an SEO crawler is the best way to ensure your team stays up to date on your website's crawling status.
2. It identifies indexing errors early. Indexing errors - such as 404 errors, duplicate title tags, duplicate meta descriptions, and duplicate content - often go unnoticed because they aren't easy to locate. Using an SEO crawler can help you spot such issues during routine SEO audits, allowing you to avoid bigger problems in the future.
3. It tells you where to start! Deriving insights from all available reports may be intimidating for any SEO professional. Therefore, it’s wise to choose an SEO crawler which is problem-centric and helps you prioritize issues. A good crawler should make it possible for webmasters to concentrate on the main problems by estimating their scale. That way, webmasters can keep fixing critical issues in a timely manner.
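Duplicate title tags, one of the indexing errors mentioned above, are simple to detect once a crawl has collected each page's title. A small illustrative sketch (the function name and sample URLs are hypothetical):

```python
from collections import defaultdict

def find_duplicate_titles(pages: dict) -> dict:
    """pages maps URL -> title; returns titles shared by 2+ URLs."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Titles gathered during a (hypothetical) crawl
crawled = {
    "/shoes": "Buy Shoes Online",
    "/shoes?sort=price": "Buy Shoes Online",
    "/about": "About Us",
}
print(find_duplicate_titles(crawled))
# {'Buy Shoes Online': ['/shoes', '/shoes?sort=price']}
```

The same grouping approach works for duplicate meta descriptions; real crawlers apply it across thousands of URLs and rank the findings by scale, as described above.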
How do Google's SEO spiders work? And many more backlink questions
An SEO crawler can help you discover and fix issues that are preventing search engines from accessing and crawling your site. It remains an essential yet elusive tool in the arsenal of any good SEO expert. We caught up with Julia Nesterets, the founder of SEO crawler Jetoctopus, to understand what exactly an SEO crawler is, why it is so important, and to ask a bevy of questions about backlinks in general.
Google’s SEO spiders are programmed to collect information from webpages and send it to the algorithms responsible for indexing and evaluating content quality. The spiders crawl the URLs systematically. Simultaneously, they refer to the robots.txt file to check whether they are allowed to crawl any specific URL.
Once spiders finish crawling old pages and parsing their content, they check whether a website has any new pages and crawl them. In particular, if there are any new backlinks, or if the webmaster has updated a page in the XML sitemap, Googlebot will add it to its list of URLs to be crawled.
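The robots.txt check described above can be reproduced with Python's standard-library parser. A minimal sketch (a real spider would fetch the live robots.txt over HTTP first; the rules and URLs here are made up):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Parse a robots.txt body directly, line by line
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A well-behaved crawler asks before fetching each URL
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

Because the `Disallow` rule applies to all user agents (`*`), anything under `/private/` is off-limits while the rest of the site remains crawlable.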
So is it worth retrospectively adding backlinks? It’s worth adding backlinks to content that was posted a while ago, especially if a page is high-quality and on the same subject. This will also help preserve the equity of that page.
Is there a hierarchy of backlinks? Technically, there is no hierarchy of backlinks, as we can’t structure and scale them the way we want. However, we can increase the quality of backlinks based on several criteria like:
- Anchor text relevance
- Relevance and quality of a linking page
- Linking domain quality
- IP address
- Link clicks and the linking website's traffic
- A small number of other links on the linking webpage
The links of highest quality have relevant keywords in the anchor text and come from trustworthy websites. But again, there are no hard and fast rules on how Google evaluates backlinks. Some backlinks can still be of good quality even if they don't fulfill all of these parameters.
How often should a site audit its links?
Though there’s no right or wrong way of auditing links, there are a few pointers to bear in mind when determining the frequency.
- If your website has a long history of inorganic link building, it’s wise to do a monthly disavow.
- In most cases, a quarterly audit is recommended. It allows webmasters to keep a website's link profile clean and track new backlinks pointing to the site.
- Links on a website that has been growing ethically and isn't in a competitive domain can be checked half-yearly, as the risk of negative SEO is low.
Consider a website with hundreds of old, very low-traffic pages with no links (e.g. eCommerce/news). Is it worth either 301-redirecting these pages to relevant key hubs, or updating the pages with backlinks to the relevant key hubs without updating the dates?
In such a case, choose the pages with the best content and update them. Set up 301 redirects for the pages you do not want your audience to see and point them to the relevant key hubs. The key term here is ‘relevant.’ The 301 redirects should point to thematically relevant hubs. Otherwise, Google will treat them as soft 404s.
Are social media backlinks any good in 2021?
Most webmasters may feel that social media backlinks are pointless, primarily because they are nofollow links that do not impact SEO. However, social signals are an important ranking factor for Google. People are constantly clicking on links they see in their newsfeeds, and if you offer great content, this can be a great advantage for you. That's why you shouldn't ignore social backlinks.
Can adverts impact your SEO negatively?
When you think of SEO, you generally don’t think of ads, and with good reason. Eric Hochberger, Co-Founder and CEO of full service ad management company Mediavine, explains to us the love-hate relationship between these two entities.
By definition, advertising runs counter to the goals of SEO, a process which relies on publisher content and user experience. However, as an ad management company that originated as an SEO marketing firm, we work to find the perfect balance, ensuring the two can coexist. Yes, you can run high-performing ads and still rank well in search engines thanks to the right ad tech. It's not an either-or scenario, and here's why:
The first key feature is lazy loading. When a website employs lazy loading, ads only load on a webpage as a user scrolls to them - meaning that if a user doesn't scroll to a certain screen view, the ads never load on the page. This dramatically lightens the page load, and a lighter page means faster loading, which leads to better SEO.
Complying with the Coalition for Better Ads (CBA) standards is critical to SEO because the CBA is what Google uses to power its built-in Chrome Ad Filtering and its Ad Experience Report in Google Search Console (its main SEO tool). There’s a misconception that the number of ads affects SEO, but it’s actually the density. The CBA provides comprehensive insight regarding appropriate ad-to-content ratios for both mobile and desktop.
Lastly, reducing above-the-fold (ATF) ads - ads that appear in the first screen view - is huge for both page speed and SEO. If an ad isn't loading in the first screen view, the site will appear to load faster (which is how Google measures it), since users don't notice when an ad loads if it's below the fold.
Which leads me to this - you’ll often hear that SEO follows user experience. Google uses this line quite a bit, which makes sense. Ultimately, the goal of Google search results is to return the best user experience. If ads are bogging down a website, that doesn’t equal a high-quality user experience, which therefore will not generate good SEO. Do you see the pattern here?
While SEO and advertisements can coexist in a positive way, the existing resources helping publishers achieve this are scarce. The solution would be a framework that works on the most popular CMSes, follows Google's best practices from Core Web Vitals to page experience, employs lazy loading, and reduces ATF ads.