Getting Started with Screaming Frog
Getting familiar with Screaming Frog
This is an overview of the Screaming Frog SEO Spider tool. It will demonstrate the benefits and features of the tool and how it can be used to analyse websites to find common SEO issues.
The Screaming Frog SEO Spider crawler can be downloaded and installed from the Screaming Frog website and crawls websites for common SEO issues. It is free for up to 500 URLs; if you wish to crawl more, a single annual payment of £149 unlocks advanced configuration, crawl saving and custom exports.
Start a crawl by entering the URL in the top bar and pressing Start.
Screaming Frog is a desktop tool, downloadable from the Screaming Frog website, that can be used at scale to analyse a website's structure and find issues to fix.
It is free for up to 500 URLs, with a one-off payment for an annual licence to crawl unlimited URLs; the additional features this unlocks, such as saving crawls and advanced customisation, are highly recommended.
Orientating yourself around the user interface
In this series, we look at it from a beginner’s perspective all the way to how professional SEO managers would use this auditing tool to find and fix website issues.
- Licensed product with version number
- The main menu for saving, exporting and customising
- Input your chosen URL here
- Start, pause, resume and clear your chosen URL here
- Progress bar to record how much of the crawl is left to complete
- Tabs to refine your selection of all the discovered URLs
- Filter to further refine your selection
- Export facility to Google Sheets, CSV or Excel files
- Discovered URLs from the chosen domain
- Quick overview of the discovered URLs in hierarchical order, with the quantity of each shown alongside
- Granular view of a chosen URL from the top window for further interrogation
- Tabs to drill down into each URL in order to examine them further
Where to find each element
The crawl has started and a record is kept in the right-hand window pane.
The main window records all the main data points for us to analyse at a granular level, informing our SEO decisions. This covers everything from status codes through to canonicals, meta descriptions, image alt attributes and much more.
The tabs organise the data specifically and allow you to filter your approach to drill down into the issues within your chosen URL. 
For example, by clicking on ‘Response Codes’ and filtering to 4xx we can see all the broken links on a website.
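The same filtering can be done on an exported Response Codes report. As a minimal sketch (the sample rows below are made up; a real export also carries ‘Address’ and ‘Status Code’ columns, which is all we need here):

```python
import csv
import io

# Illustrative sample of a Response Codes CSV export (rows are made up).
sample_export = """Address,Status Code,Status
https://example.com/,200,OK
https://example.com/old-page,404,Not Found
https://example.com/blog,200,OK
https://example.com/members,403,Forbidden
"""

def broken_links(export_csv: str) -> list[str]:
    """Return the addresses whose status code falls in the 4xx range."""
    reader = csv.DictReader(io.StringIO(export_csv))
    return [row["Address"] for row in reader
            if 400 <= int(row["Status Code"]) < 500]

print(broken_links(sample_export))
# → ['https://example.com/old-page', 'https://example.com/members']
```

Swapping the range check lets the same sketch pull out 3xx redirects or 5xx server errors instead.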
You can also perform some of these lookups via the right-hand window pane, which displays elements in real time and records the number of assets per element, making it easy to analyse the individual elements of a website, no matter how large or complex.
The lower window pane populates with more data to provide a granular view of specific data elements, giving you a better understanding of what is going on with each one.
The process explained
Clicking a single cell within the main window provides additional context for that data element in the lower window pane.
For example, by clicking a URL and then selecting the ‘Inlinks’ tab in the lower window pane, we can see all the pages linking to that page. This is great for understanding how internal linking has been built across a website and whether the linked-to pages are relevant.
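If you export inlink data, the same idea can be applied in bulk. A minimal sketch, assuming an export with ‘Source’ and ‘Destination’ columns (the rows below are invented for illustration), tallies how many internal pages link to each URL:

```python
import csv
import io
from collections import Counter

# Illustrative sample of an inlinks CSV export (rows are made up).
sample_inlinks = """Source,Destination
https://example.com/,https://example.com/blog
https://example.com/about,https://example.com/blog
https://example.com/,https://example.com/contact
"""

def inlink_counts(export_csv: str) -> Counter:
    """Count how many rows point at each destination URL."""
    reader = csv.DictReader(io.StringIO(export_csv))
    return Counter(row["Destination"] for row in reader)

for url, count in inlink_counts(sample_inlinks).most_common():
    print(url, count)
```

Pages with very few inlinks are candidates for better internal linking; pages with many should be your most important ones.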
All of these elements will be examined in further depth in future blogs to give you a better idea of their intended use and what benefit you can derive from them.
We can export all this data via the top ‘Export’ button, which pulls all the top-window data into a CSV file. Likewise, if you choose a tab, it will export just that granular level of information. This is very handy for methodically working through your website and beginning your journey of website improvements.
Within each window pane, we suggest going through each of these tabs to become familiar with where to find individual elements and to appreciate how many ways we can interrogate the website URLs to discover what is working and what is broken and needs fixing.
We can fully configure the SEO Spider with multiple options to crawl large and complex websites more quickly and easily and find relevant data elements. This is an advanced feature we will discuss in more depth later in this series.
Clicking on Configuration > Spider will open a pop-up window where we can customise even further.
Rolling over each element will give you an explanation of what it does and what purpose it serves.
Likewise, selecting any element under the Configuration menu will allow us to interrogate the chosen URL in a specific way to answer any queries we have.
You can also connect Screaming Frog to your Google services and SEO tools to provide deeper analysis, segment your data and track more information at a granular level.
We can also save crawl data to return to at a later date, reducing the time it would take to recrawl the website, especially on large sites. Likewise, we can store crawls to disk rather than in memory, which improves how efficiently Screaming Frog handles your assets and provides a better experience.
We can also set up scheduling of crawls under File > Scheduling to provide fresh crawls on a regular basis. For example, weekly crawling on a particular website.
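As an alternative to the built-in scheduler, Screaming Frog also ships a command-line interface, so crawls can be scheduled externally. A hedged sketch of a weekly cron entry: the flags follow the tool's documented CLI, but the executable name varies by platform, and the site URL and output folder below are hypothetical placeholders.

```shell
# Crontab entry: run a headless crawl every Monday at 03:00 and save the
# crawl file to a folder for later review. Paths/URL are placeholders.
0 3 * * 1 screamingfrogseospider --crawl https://example.com --headless --save-crawl --output-folder /home/seo/crawls
```

The saved crawl can then be opened in the desktop app, just like one saved manually.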
We can also switch Screaming Frog's mode by selecting Mode > List and upload a list of URLs to be interrogated rather than crawling a website. This added benefit lets us examine specific URLs and analyse just part of a website for a piecemeal approach to resolving issues.
Likewise, we can analyse sitemaps too, to ensure they are not throwing any errors that could impede the performance of the website by preventing spiders from crawling your web assets.
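A quick way to see what a sitemap actually contains, before feeding it into list mode, is to pull the URLs out of the XML. A minimal sketch, using a made-up sitemap that follows the sitemaps.org schema:

```python
import xml.etree.ElementTree as ET

# A made-up sitemap following the sitemaps.org schema.
sitemap_xml = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list[str]:
    """Return every <loc> URL listed in the sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

print(sitemap_urls(sitemap_xml))
# → ['https://example.com/', 'https://example.com/blog']
```

The resulting list can be pasted straight into Mode > List, or diffed against a crawl export to spot pages in the sitemap that the crawler never found.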
We can bulk export data too via Bulk Export > chosen asset to drill down into the data and export not just the information acquired in the top window pane, but its granular path in the lower window pane too.
We can also produce reports, generate sitemaps and use visualisations to ascertain the feel of a website and interrogate its structure to visually see where improvements can be made.
Continue to find out more and discover the real power of each element contained within Screaming Frog.
The guide above should illustrate the simple steps required to get started with the Screaming Frog SEO Spider.
For more information, check out the videos on YouTube.
Likewise, if you have any further questions, then please get in touch via our contact page.