Having an SEO-friendly site with as few errors as possible is achievable thanks to Screaming Frog, a very powerful tool for optimizing your site's SEO or analyzing your competitors'. But before digging into the features of this software, you need to download it, install it, and then configure it on your computer. This article walks you through the main steps of its configuration.
Screaming Frog configuration
Once Screaming Frog is set up, it can crawl a site in depth and collect all the data you need. Here is how to configure it, step by step, to get the most out of it.
Start your first Screaming Frog crawl
When you open the tool, the first thing you will see is a bar where you can enter the URL of the site you want to analyze. Enter your URL and select Start to launch your first crawl. Once it is running, you can click Pause to stop it, then Clear to reset the crawl.
Configure the Screaming Frog SEO Spider
There are several ways to configure your crawl so that it only returns the information that really interests you, which is especially useful when crawling sites with many pages. Go to the top menu, select Configuration, and you will have access to a number of options such as the ones below.
Spider
Spider is the first option available when you enter the Configuration menu. This is where you decide which resources (images, CSS, JavaScript, and so on) you want to crawl.
Robots.txt
Congratulations on making it this far. The first part was not easy, but it is vital for a proper configuration of your crawler. This part is much simpler than the previous one. By selecting Robots.txt from the Configuration menu, you can define how Screaming Frog should interact with your robots.txt file, for example whether it should respect or ignore its rules.
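To see what "respecting robots.txt" means in practice, here is a minimal sketch in Python using the standard library's robots.txt parser. The robots.txt content and URLs are hypothetical, purely for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for an example site
robots_txt = """
User-agent: *
Disallow: /admin/
Disallow: /cart/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A crawler that respects robots.txt skips disallowed paths
print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
```

A crawler configured to respect robots.txt performs exactly this kind of check before requesting each URL; configured to ignore it, it simply skips the check.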
URL Rewriting
This feature is very useful when your URLs contain parameters. For an e-commerce site with listings that use filters, for example, you can ask Screaming Frog to rewrite the URLs on the fly, typically by removing those parameters.
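The effect of a "remove parameters" rewrite rule can be sketched in Python. The URL and parameter names below are assumptions for the sake of the example, not Screaming Frog's actual implementation:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_params(url, remove):
    """Drop the given query parameters, mimicking a 'remove parameters' rewrite rule."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in remove]
    return urlunparse(parts._replace(query=urlencode(kept)))

# Hypothetical e-commerce listing URL with filter parameters
url = "https://example.com/shoes?color=red&size=42&page=2"
print(strip_params(url, {"color", "size"}))
# -> https://example.com/shoes?page=2
```

Rewriting filter parameters away like this means the crawler treats all filtered variants of a listing as a single page instead of crawling each combination separately.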
Include and Exclude
These features are among the most used. They are particularly useful on sites with many URLs when you only want to crawl certain sections. Based on the patterns you enter in these windows, Screaming Frog will only show matching pages in its list of crawl results. If you manage or analyze large sites, mastering these features will save you many hours of crawling.
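Include and Exclude rules are regular expressions matched against each URL. As a rough sketch (the filtering logic and URLs here are illustrative assumptions, not Screaming Frog's code), the combination behaves like this in Python:

```python
import re

def filter_urls(urls, include=None, exclude=None):
    """Keep URLs matching any include pattern, then drop those matching any exclude pattern."""
    if include:
        urls = [u for u in urls if any(re.search(p, u) for p in include)]
    if exclude:
        urls = [u for u in urls if not any(re.search(p, u) for p in exclude)]
    return urls

urls = [
    "https://example.com/blog/seo-tips",
    "https://example.com/blog/page/2",
    "https://example.com/shop/shoes",
]
# Crawl only the blog section, but skip paginated listings
print(filter_urls(urls, include=[r"/blog/"], exclude=[r"/page/\d+"]))
# -> ['https://example.com/blog/seo-tips']
```

Note that exclude rules win over include rules: a URL must match an include pattern and no exclude pattern to be kept.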
Speed
In this section, you can choose how fast Screaming Frog crawls the site. Again, this is useful for large sites: with the default configuration, browsing the whole website could take a long time, while increasing Max Threads lets Screaming Frog crawl much faster. Use this setting with caution, as it increases the number of simultaneous HTTP requests made to the site, which can slow it down. In extreme cases it can even crash the server, or get you blocked by it.
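The trade-off behind Max Threads can be illustrated with a small Python sketch using a thread pool. The fetch function below only simulates a request with a short sleep (real crawling code and Screaming Frog's internals will differ), but the speed-up from running several requests in parallel is the same idea:

```python
import time
from concurrent.futures import ThreadPoolExecutor

MAX_THREADS = 5   # analogous to Screaming Frog's "Max Threads" setting
DELAY = 0.1       # simulated time per request, in seconds

def fetch(url):
    """Stand-in for an HTTP request; a real crawler would issue a GET here."""
    time.sleep(DELAY)  # simulate network latency
    return url, 200

urls = [f"https://example.com/page/{i}" for i in range(20)]

start = time.time()
with ThreadPoolExecutor(max_workers=MAX_THREADS) as pool:
    results = list(pool.map(fetch, urls))
elapsed = time.time() - start

# 20 URLs at 0.1 s each would take ~2 s sequentially; 5 threads cut that to roughly 0.4 s
print(f"Crawled {len(results)} URLs in {elapsed:.2f} s")
```

The same arithmetic explains the risk: five threads also means the target server receives five requests at once, which is why raising this value on a fragile site can overload it.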