4 Best On-Site SEO Analysis Tools
Managing optimization projects for large e-commerce sites is one of the most common tasks we encounter at Zeo. Running a team responsible for on-site SEO operations and making improvements on sites of that size takes a lot of effort. Of course, as always, there are some great tools that reduce the role of the human factor in this process.
For any site with 100+ pages, you will need crawl analysis tools to see the overall picture more easily. With these tools, you can measure the SEO health of your pages consistently and in far less time. In this article, I will introduce some of the tools we use frequently in our clients' projects at Zeo.
Some of the tools I will cover are paid and some are free, but I believe they are complementary. Using them side by side during the evaluation process covers each tool's blind spots and helps you achieve much more effective results. In this way, you can keep tighter control over your on-site SEO work and build effective optimization strategies.
Moz Pro
With this tool from Moz, we can examine the on-site SEO condition of any project, small or large, through research reports written in understandable language. The reason I say "understandable" is that even someone without deep knowledge of the subject can easily grasp the current situation.
Another advantage of Moz Pro is that it offers many different services beyond on-site SEO analysis, such as competitor analysis, rank tracking, and link analysis. In this article, of course, I will stay mostly within the scope of on-site SEO.
First, we create a campaign by entering information about the website we want to analyze. After the campaign is created, the site needs a certain amount of time to be crawled; this can take a few days, especially for large e-commerce sites with many subpages. Once the analysis is complete, we come to the evaluation stage, which is clearly one of the most critical parts of the job. It is essential to carry out this evaluation as accurately as possible, in other words, to assess the site from Google's point of view, because the changes we decide to make will be based on it.
When you go to the "Crawl Diagnostics" tab, you can access all the analyzed pages and their conditions. The example shown comes from the analysis of a medium-sized e-commerce site. On this page you can see every problem in detail, covering the errors on each page and the points that need attention. On the right side, you can see which issue types occur and roughly how many pages are affected by each. Using the menus under the "Show" button, you can inspect the problems one by one, or view them grouped under the headings "errors", "warnings", and "notices".
Let's look at the results Moz reports to the user as "errors" that definitely need fixing: 4xx errors, 5xx errors, missing or unused title tags, duplicate page content, duplicate page titles, and pages blocked by robots.txt are among the important issues Moz flags for correction. Although the number of examples may seem high, you will see these problems gradually decrease over a short time once appropriate strategies are in place. If you are unsure about the exact definition of any of these errors, detailed information is available on the dedicated page for each error type.
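To make the error categories above concrete, here is a minimal sketch of how such crawl results can be bucketed. The page records and field names are illustrative inventions, not Moz's actual data model:

```python
def classify_page(page):
    """Return a list of Moz-style 'error' flags for one crawled page record."""
    issues = []
    status = page.get("status", 200)
    if 400 <= status < 500:
        issues.append("4xx error")
    elif status >= 500:
        issues.append("5xx error")
    if not page.get("title"):
        issues.append("missing title")
    if page.get("blocked_by_robots"):
        issues.append("blocked by robots.txt")
    return issues

def find_duplicates(pages, field):
    """Group URLs sharing the same value for a field (e.g. duplicate titles)."""
    seen = {}
    for page in pages:
        value = page.get(field)
        if value:
            seen.setdefault(value, []).append(page["url"])
    return {v: urls for v, urls in seen.items() if len(urls) > 1}

pages = [
    {"url": "/a", "status": 200, "title": "Shoes"},
    {"url": "/b", "status": 404, "title": "Shoes"},
    {"url": "/c", "status": 200, "title": "", "blocked_by_robots": True},
]
print(classify_page(pages[1]))          # ['4xx error']
print(find_duplicates(pages, "title"))  # {'Shoes': ['/a', '/b']}
```

A real tool feeds these checks from an actual crawl, but the grouping logic is essentially this simple.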
In Moz, crawl analyses continue periodically unless you end the campaign, keeping you informed of the site's status and progress. If you wish, you can export these progress reports in PDF or Excel-compatible form and continue your analyses and fixes accordingly.
Screaming Frog
Screaming Frog is a truly indispensable program for fully understanding your website's performance in organic search and, through the changes you make, ensuring Googlebot can navigate your site easily.
Screaming Frog is a user-friendly desktop program well suited to SEO workflows. It plays an important role especially in large projects, since it does not impose a 10,000 or 20,000 page analysis limit the way Moz Pro does. It reduces a manual review process that would take hours to minutes while producing many different kinds of analysis. Considering that Google does not share some of this data even in its own Search Console, we can say the SEO Spider does some serious data mining.
A few words on how to use it. Getting started right after downloading and installing the program is quite simple: enter the URL of the website you want to examine in the address bar and press the "Start" button. Once the crawl is complete, you are presented with a list of different result types and data indicators.
We can filter this data by page structure and use it to isolate and analyze the results we want. To give an example of the categories: under the "Internal" tab, we can easily see the links within the site and which pages link to one another.
In addition, under the "Status Code" column you can check the state of each linked page: whether it is problematic, and whether it redirects the way you intend.
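The status codes in that column fall into a handful of buckets; a quick sketch of the standard HTTP ranges behind them:

```python
def status_bucket(code):
    """Map an HTTP status code to the category a crawler column reflects."""
    if 200 <= code < 300:
        return "ok"            # page served normally
    if 300 <= code < 400:
        return "redirect"      # e.g. 301 permanent, 302 temporary
    if 400 <= code < 500:
        return "client error"  # e.g. 404 not found
    return "server error"      # 5xx responses

for code in (200, 301, 404, 500):
    print(code, status_bucket(code))
```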
The "External" tab is no different. In general, it covers off-site links and their condition, listing the full link, the content type (image, text, video, etc.) and the response pattern (status code).
Other important analysis data Screaming Frog provides are "Page Titles" and "Meta Descriptions". By examining these results, we can easily detect duplicate page titles and missing or duplicate meta descriptions. Beyond these, you can easily access plenty of data, from h1 and h2 heading usage to the analysis of image files.
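As a minimal sketch of the checks behind "Page Titles" and "Meta Descriptions": parse each page's head, collect its title and meta description, then flag duplicates and gaps. This uses only the standard library; the sample pages are invented:

```python
from html.parser import HTMLParser

class HeadParser(HTMLParser):
    """Collects the <title> text and meta description of one HTML page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit(pages):
    """pages: {url: html}. Returns titles shared by 2+ URLs and URLs
    missing a meta description."""
    titles, missing = {}, []
    for url, html in pages.items():
        p = HeadParser()
        p.feed(html)
        titles.setdefault(p.title.strip(), []).append(url)
        if not p.meta_description:
            missing.append(url)
    dupes = {t: urls for t, urls in titles.items() if len(urls) > 1}
    return dupes, missing

pages = {
    "/red-shoes":  "<html><head><title>Shoes</title></head></html>",
    "/blue-shoes": '<html><head><title>Shoes</title>'
                   '<meta name="description" content="Blue shoes"></head></html>',
}
dupes, missing = audit(pages)
print(dupes)    # {'Shoes': ['/red-shoes', '/blue-shoes']}
print(missing)  # ['/red-shoes']
```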
You can also analyze pages and implementations such as AMP and structured data with this tool, and you can extend your analysis by establishing API connections with tools like Google Search Console, Google Analytics, and PageSpeed Insights.
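To give a feel for what such an API integration looks like, here is a sketch that only builds a PageSpeed Insights v5 request URL; no network call is made, and the API key is a placeholder:

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="mobile", api_key="YOUR_API_KEY"):
    """Assemble the query string for one PageSpeed Insights request."""
    params = {"url": page_url, "strategy": strategy, "key": api_key}
    return PSI_ENDPOINT + "?" + urlencode(params)

print(psi_request_url("https://example.com/"))
```

A crawler like Screaming Frog fires one such request per crawled URL and merges the scores back into its report.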
As I mentioned at the beginning of the article, using the tools in parallel matters, because an issue one program misses can be caught by another.
BrowSEO
While Screaming Frog and Moz Pro let us analyze our websites for everything from server errors to duplicate page title issues, BrowSEO is indispensable for analyzing the site as a whole and better understanding how Googlebot sees our website "from the user's perspective".
BrowSEO is positioned differently from the other programs thanks to its very simple interface and analysis results. Although the information it provides is much more limited than that of the other tools, it reveals exactly what kind of structure your website has. Let me clarify by walking through how it is used.
When you type your site's address directly into the address bar, you get a plain HTML view of your homepage along with some analysis reports on the right. BrowSEO shows what kind of link structure you have, visually and numerically, by marking internal, external, and nofollow links in different colors (yellow for internal links, red for external ones), which is especially useful for large-scale websites. Under the "Head" heading, it lists basic information about the website such as the title, meta description, and underlying infrastructure.
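The logic behind that color coding can be sketched in a few lines: classify each link as internal, external, or nofollow relative to the site being viewed. The example URLs are placeholders:

```python
from urllib.parse import urlparse, urljoin

def classify_link(base_url, href, rel=""):
    """Label one link the way BrowSEO colors it: nofollow, internal, or external."""
    if "nofollow" in rel.split():
        return "nofollow"
    target = urljoin(base_url, href)  # resolve relative hrefs against the page
    same_host = urlparse(target).netloc == urlparse(base_url).netloc
    return "internal" if same_host else "external"  # yellow vs red in BrowSEO

base = "https://example.com/"
print(classify_link(base, "/about"))                          # internal
print(classify_link(base, "https://other.com/page"))          # external
print(classify_link(base, "https://other.com", "nofollow"))   # nofollow
```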
Thanks to this service, you can also work on preventing some major problems, since you can see your website almost literally through Google's eyes.
For example, you may discover that content you consider essential to your website, but built with technologies such as Flash or AJAX, is invisible to bots, and take measures accordingly.
Other notable features of BrowSEO include a list of all the headings on the page, a preview of how the page would appear in search results, and the "Cloaking-Attempt" check, which tests whether pages look the same to Google's bots as they do to organic visitors.
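A hypothetical sketch of such a cloaking check: fetch the same page as Googlebot and as a regular browser (simulated here with two pre-fetched HTML strings) and compare what each sees. Real checks are fuzzier, since pages legitimately vary between requests:

```python
import hashlib

def normalized_fingerprint(html):
    """Crude content fingerprint: hash of the markup with whitespace collapsed."""
    collapsed = " ".join(html.split())
    return hashlib.sha256(collapsed.encode()).hexdigest()

def looks_cloaked(html_for_googlebot, html_for_visitor):
    """Flag pages that serve different content to bots and to visitors."""
    return (normalized_fingerprint(html_for_googlebot)
            != normalized_fingerprint(html_for_visitor))

same = "<html><body>Welcome</body></html>"
stuffed = "<html><body>Welcome cheap shoes cheap shoes</body></html>"
print(looks_cloaked(same, same))     # False
print(looks_cloaked(same, stuffed))  # True
```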
This system, which presents its information mostly visually, is served completely free of charge as a web-based tool. You can obtain the output of all these analyses from the "Download entire session" section.
Google Search Console
We can safely say that the most direct way to understand how Google's on-site evaluation algorithms see you is to get the information from Google itself. Using Search Console actively and effectively is therefore a must for long-term, lasting on-site SEO success.
The best thing about this tool is that it is absolutely "rookie-friendly". Google simplifies things considerably, tailoring the interface to an ordinary user with no prior knowledge or experience. Another feature that distinguishes Search Console from other services is that it is free. On top of that, Google makes the service as usable as possible by fully supporting mobile users.
Undoubtedly, one of Search Console's biggest advantages is seeing your website through Google's eyes. BrowSEO offers something similar, but not at the level of detail Search Console provides.
It is also possible to see directly which keywords Googlebot picks up from your pages. If the keywords you are targeting are not being detected by Google, you have the opportunity to correct the situation by developing your content strategy accordingly.
One of the most useful aspects of Search Console is that it surfaces crawl errors along with their changes over time. Google notifies the site owner of results coming straight from its bots and offers a "Mark as fixed" option so it can be told when an issue has been resolved.
In addition, by using the "Fetch as Google" tool, you can see how the main and sub-pages of your site are indexed, through Google's eyes. From these outputs, organized as a tree structure, you can spot the parts likely to hurt SEO performance and take preventive measures.
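The tree structure mentioned above is simply crawled URLs grouped by path segment; a small sketch with invented URLs shows the idea:

```python
from urllib.parse import urlparse

def build_tree(urls):
    """Nest crawled URLs into a directory tree keyed by path segments."""
    tree = {}
    for url in urls:
        node = tree
        parts = [p for p in urlparse(url).path.split("/") if p]
        for part in parts:
            node = node.setdefault(part, {})
    return tree

urls = [
    "https://example.com/shop/shoes/red",
    "https://example.com/shop/shoes/blue",
    "https://example.com/blog/post-1",
]
print(build_tree(urls))
# {'shop': {'shoes': {'red': {}, 'blue': {}}}, 'blog': {'post-1': {}}}
```

Grouping this way makes it obvious when, say, an entire `/shop/` branch is failing to index.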
In conclusion, all four tools I covered above offer distinct and genuinely effective services. I recommend using them together to perform a comprehensive, high-quality on-site SEO analysis and reach the most accurate results. Although it may seem laborious and time-consuming at first, it pays off after a little practice; to get to know these tools better, try the trial versions of the paid ones and use the free ones directly. I believe they will get your work done properly.
Bonus: DeepCrawl
Like Screaming Frog, the DeepCrawl tool simulates search engine spiders, crawls websites, and turns the results into a list of issues: technical errors on our site that need fixing, such as duplicate meta tags and pages with thin content.
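The two checks just mentioned are straightforward to sketch: flag thin-content pages by word count (the threshold here is an arbitrary illustration) and find meta descriptions shared across URLs. The sample data is invented:

```python
def thin_pages(pages, min_words=250):
    """pages: {url: body_text}. Return URLs whose copy is below the threshold."""
    return [url for url, text in pages.items() if len(text.split()) < min_words]

def duplicate_metas(metas):
    """metas: {url: meta_description}. Return descriptions used by 2+ URLs."""
    by_value = {}
    for url, desc in metas.items():
        by_value.setdefault(desc, []).append(url)
    return {d: urls for d, urls in by_value.items() if len(urls) > 1}

pages = {"/a": "word " * 10, "/b": "word " * 300}
print(thin_pages(pages))  # ['/a']
print(duplicate_metas({"/a": "Buy now", "/b": "Buy now", "/c": "Unique"}))
# {'Buy now': ['/a', '/b']}
```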
If you prefer to use DeepCrawl, I strongly recommend taking a look at the DeepCrawl SEO Tool Guide prepared by Metehan Urhan from our team.