Another day, another blog post? This is a rarity for sure. At the moment, I’m starting to hit my stride on getting a few old items done in R. 🙂 If I keep writing more blog posts, I’ll need to start taking more photos again for the feature covers!
Quickly sharing how I was able to use R and Google Search Console to download & export all of my website’s search analytics data.
Why Use R to Download & Export all Search Console Data?
Even for the novice SEO, this is likely a well-trodden frustration. Valuable data for making critical business decisions is sitting in Google Search Console. But you can’t access it at scale!
It’s either hidden behind the abysmal UI export limit, or you have to do so much filtering that it’s impossible to scale the data extraction.
SearchConsoleR: Easy Google Search Console API for Unlimited Data Access
Similar to my previous post (How to Bulk Submit Sitemap URLs in Google Search Console), this article looks at the splendid searchConsoleR package in R. See the link above for prerequisites.
The code to get all your data out of Google Search Console is really quite brief. I’ve listed it below, as well as a link to the GitHub gist.
Note that if it’s your first time using RStudio, you’ll need to use install.packages() to install the necessary dependencies before you can load them.
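That one-time setup can be done in a single call; a minimal sketch, covering the packages loaded in the script below:

```r
# One-time setup: install the packages used in this post
install.packages(c("searchConsoleR", "dplyr", "ggplot2", "writexl"))
```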
# Load packages
library(searchConsoleR)
library(dplyr)
library(ggplot2)
library(writexl)

# Authorize & choose Google profile
scr_auth()

# Specify website --- CLIENT
website <- "https://www.sample.com"
dimensions <- c("page", "query")

ttl_queries <- search_analytics(website,
                                "2019-08-01", "2019-08-31",
                                dimensions,
                                searchType = "web",
                                rowLimit = 100000)

# Write the data frame to an XLSX file
write_xlsx(ttl_queries, "ttl_queries.xlsx")
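If your site generates more rows than a single request returns, searchConsoleR can page through the data for you. A hedged sketch, assuming the walk_data argument of search_analytics (check ?search_analytics in your installed version):

```r
# Fetch results day by day instead of in one request, which sidesteps
# per-request row limits; walk_data = "byDate" is an assumption based
# on searchConsoleR's documented batching options.
all_rows <- search_analytics(website,
                             "2019-08-01", "2019-08-31",
                             dimensions,
                             searchType = "web",
                             walk_data  = "byDate")
```

Expect this to be slower, since it issues one API request per day in the date range.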
If the code has run correctly, you should have a (potentially) massive Excel file at your disposal! The possibilities are endless – data crunching in Excel, piping it into a database, or feeding it to some other visualization software. Enjoy!
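Since the data is already in a data frame before the export, you can also crunch it right in R with dplyr. A small illustrative sketch, assuming the standard search_analytics columns (clicks, impressions) are present in ttl_queries:

```r
library(dplyr)

# Example: top 10 pages by total clicks, summed across queries
top_pages <- ttl_queries %>%
  group_by(page) %>%
  summarise(clicks = sum(clicks),
            impressions = sum(impressions)) %>%
  arrange(desc(clicks)) %>%
  slice_head(n = 10)

print(top_pages)
```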