
Automated Screaming Frog Reports With Command Line & Batch Files

Did you know that you can automate Screaming Frog crawl reports using Command Line and Windows batch files?

First, the credit goes to Tom Gregan and Open Source SEO for blazing the trail. 🙂

Starting with version 10, Screaming Frog SEO Spider has offered command-line functionality. How cool is that? What’s more, the command-line interface is quite customizable: it can pull in configuration files and export crawl data with plenty of flexibility.

Here’s some sample code for Windows Command Line you can run. Again, credit to Tom and Open Source SEO here. You can drop this into Notepad, Sublime Text, etc. and save it as a batch (.bat) file.

set crawlResults=C:\Users\you\Documents\client\Weekly-SFSEOS-Results
:: Creates the variable %crawlResults% for where the crawl results will be saved

set sf=C:\Program Files (x86)\Screaming Frog SEO Spider\
:: Creates another variable telling Command Line where it can find SF

set configFile=C:\Users\you\Documents\client\Weekly-SFSEOS-Results\sample-config.seospiderconfig
:: Creates another variable telling CLI and SF where to find crawl configuration instructions - may be needed for more complex crawls or API settings

set domain=
:: Sets a variable telling CLI and SF which domain or URL to crawl

chdir /d "%sf%"
:: Changes the working directory to the Screaming Frog install folder set in %sf%

ScreamingFrogSEOSpiderCli.exe --config "%configFile%" --crawl "%domain%" --save-crawl --headless --output-folder "%crawlResults%" --export-format "xlsx" --export-tabs "Internal:All" --timestamped-output
:: Runs the SF CLI executable, which performs a headless crawl using the variables set above, saves the crawl, and exports the Internal:All tab to a timestamped output folder

The fun part (as if this wasn’t cool enough!) is that you can make a ton of these batch files, call them all from a separate “leader” batch file, and schedule that leader to run on a recurring basis.

start cmd /k Call website-a.bat
start cmd /k Call website-b.bat

Important to note: the code sample above assumes that you place the leader and the “follower” Screaming Frog crawl batch files in the same directory on your computer or server.
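To put the leader batch file on a recurring schedule, Windows Task Scheduler’s built-in `schtasks` command is one option. A minimal sketch; the task name, file path, and weekly Monday 6 AM schedule below are placeholders, not values from the original post:

```
schtasks /create /tn "Weekly SF Crawls" /tr "C:\Users\you\Documents\client\run-all-crawls.bat" /sc weekly /d MON /st 06:00
```

You could also create the same task through the Task Scheduler GUI if you prefer clicking to typing.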

How nice is that? From here, you have plenty of options: pull the exports into a business intelligence tool like Tableau, Power BI, or Data Studio, or load them into data frames in R or Python.
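For the Python route, the exports are easiest to handle as CSV (swap `--export-format "csv"` into the command above). A minimal sketch using only the standard library; the filename `internal_all.csv` is an assumption about what the Internal:All export is called in your output folder:

```python
import csv
from pathlib import Path

def load_crawl_export(path):
    """Read a Screaming Frog CSV export into a list of row dicts."""
    # utf-8-sig tolerates a leading byte-order mark, which some exports include
    with open(path, newline="", encoding="utf-8-sig") as f:
        return list(csv.DictReader(f))

# Hypothetical usage: grab the newest timestamped export folder and load it
# results_dir = Path(r"C:\Users\you\Documents\client\Weekly-SFSEOS-Results")
# latest = max(results_dir.iterdir(), key=lambda p: p.name)
# rows = load_crawl_export(latest / "internal_all.csv")  # filename is an assumption
```

From a list of dicts it’s one step to a pandas DataFrame (`pd.DataFrame(rows)`) or any other analysis tooling.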

This could mean the end of blind spots for your SEO efforts! Think about monthly, weekly, or even daily crawls of your sites, competitors, publishers, etc. Happy coding!
