In the last year or two, there's been renewed interest in website performance. I think the trend started with Google announcing that page speed is a ranking signal for both search and ads (Quality Score).

If you know anything about how the SEO industry works, it's pretty obvious what happened next: everyone jumped on the bandwagon of optimizing for Google PageSpeed Insights scores, trying to squeeze out every little bit of competitive edge.

Why Does Google Care About Website Performance?

Good user experience affects Google's bottom line. In practice, website performance is largely about mobile performance. Desktops generally have access to faster internet connections with no data caps. Mobile users, on the other hand, are often limited by their data plans and by slower connections, depending on location and carrier coverage.

With more and more searches and visits (and thus search ads, display ads, etc.) coming from mobile devices, Google wants to make sure users have a decent experience when they visit a website, even over a poor 3G connection.

If you click on a search ad from your phone and the website takes 30 seconds to load (you can check yours here), chances are you won't stick around that long. This also means the advertiser wouldn't pay for that click.

Moreover, inefficient websites require more resources to crawl. Google spends billions each year crawling the web, so slow websites hurt its bottom line there too.

This has created a sense of urgency in which Google's economic objectives are aligned with users' needs.

What Is the Google PageSpeed API?

The PageSpeed API lets you get PageSpeed scores at scale, in an automated way. Sometimes it's useful to check the PageSpeed score for every page of your website to uncover issues that hinder the performance of specific pages.

Maybe it's a plugin you can do without, some unnecessary JavaScript specific to a page, or something else altogether. Evaluating the PageSpeed score of each page is a great starting point for deciding what to do next in terms of performance optimization.
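To give you an idea of what a single call looks like, here's a minimal sketch that hits the same v5 endpoint and uses the same parameters as the full script below. The API key and the example URL are placeholders you'd swap for your own.

import requests

# Minimal single-page request to the PageSpeed Insights API (v5)
endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {
    "url": "https://www.example.com/",   # the page you want to measure
    "key": "YOUR_API_KEY",               # replace with your own API key
    "strategy": "mobile",                # or "desktop"
}
data = requests.get(endpoint, params=params).json()
print(data["lighthouseResult"]["categories"]["performance"]["score"])  # 0-1 scale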

How to Use the PageSpeed API with Python

There are a few different versions of the API, but the script below uses version 5, the latest as of the time of writing.

To use it, you will need to create your own credentials (API key) here.

The script itself is straightforward: after importing a few necessary libraries and setting our parameters (e.g. the API key), it reads the list of URLs we want to measure from a CSV file, gets the PageSpeed scores, and writes the results to a new CSV file that it creates.

You can easily create such a CSV file with a list of URLs by crawling your website with Screaming Frog and exporting the crawl data as CSV.
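For reference, the script below expects the page URL in the second column of url_list.csv, with a header cell that reads "URL". If you'd rather generate the file yourself instead of exporting it from Screaming Frog, a quick (hypothetical) helper could look like this:

import csv

# Write url_list.csv in the format the script expects:
# an index in the first column, the page URL in the second, and a "URL" header
urls = ["https://www.example.com/", "https://www.example.com/about/"]
with open("url_list.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["#", "URL"])
    for i, u in enumerate(urls, start=1):
        writer.writerow([i, u])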

Importing the Libraries and Parameters

import requests
import json
import csv
from time import sleep

# Replace with your own API key from the Google API Console
API_Key = "qwertyuiopasdfghjklzxcvbnm"
# PageSpeed Insights API v5 endpoint; the page URL, key and strategy are appended below
baseURL = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url="

Creating the Request Function

This is where the bulk of the work is done. The function first builds the request URL (response_url), then uses the requests library to fetch the response, which we then parse as JSON.

The API response contains a lot of detailed data, so we only extract the performance score (mobile or desktop, depending on the strategy parameter) as a number on a scale of 0 to 100, just like on the PageSpeed Insights web page.
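For reference, here's roughly the slice of the JSON response we care about, written out as a Python dictionary. The nesting matches the lookups in the function; the score value is just an example.

# Abridged shape of the API response (score value is illustrative)
json_data = {
    "lighthouseResult": {
        "categories": {
            "performance": {"score": 0.87}  # 0-1 scale, multiplied by 100 below
        }
    }
}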

def get_pagespeed(API_Key, page_URL, baseURL, strategy):
    # Build the request URL from the page URL, API key and strategy ("mobile" or "desktop")
    response_url = baseURL+page_URL+'&key='+API_Key+'&strategy='+strategy
    response = requests.get(response_url)
    json_data = response.json()
    # Drill down to the Lighthouse performance score (returned on a 0-1 scale)
    lighthouseResult = json_data["lighthouseResult"]
    categories = lighthouseResult["categories"]
    performance = categories["performance"]
    score = performance["score"]
    sleep(1)  # brief pause between requests to be gentle with the API
    return (score*100)
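Once the function is defined, a quick sanity check on a single page (assuming the API key above is valid and example.com is swapped for one of your own pages) would look like this:

# Quick sanity check: fetch the mobile score for a single page
print(get_pagespeed(API_Key, "https://www.example.com/", baseURL, "mobile"))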

Putting Together the Response Data

This is the part of the code where the execution happens. We read the list of URLs from the CSV file and, for each row, run the get_pagespeed function defined above, once for desktop and once for mobile. We then write the results to a new CSV file with three columns: URL, Desktop Pagespeed and Mobile Pagespeed.

myFile = open('url_list.csv','r')
outputFile = open('results/pagespeed_results.csv', 'w', newline='')  # the 'results' folder must already exist
outputWriter = csv.writer(outputFile)
reader = csv.reader(myFile)
outputWriter.writerow(["URL","Desktop Pagespeed","Mobile Pagespeed"])
for row in reader:
    url = row[1]  # the page URL is expected in the second column
    if url == "URL":
        pass  # skip the header row
    else:
        try:
            desktop_pagespeed = get_pagespeed(API_Key, url, baseURL, "desktop")
            mobile_pagespeed = get_pagespeed(API_Key, url, baseURL, "mobile")
            print(url, desktop_pagespeed, mobile_pagespeed)
            outputWriter.writerow([url, desktop_pagespeed, mobile_pagespeed])
        except:
            outputWriter.writerow([url, "Error", "Error"])
            print('ERROR WITH URL: ', url)
myFile.close()
outputFile.close()
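One small caveat: open() won't create the results folder for you, so make sure it exists before running the script. If you'd rather handle that in code, a short guard at the top does the trick:

import os

# Create the output folder if it doesn't already exist
os.makedirs("results", exist_ok=True)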

If you are looking for an alternative way to do this that doesn't involve running Python code on your computer, you can also use Screaming Frog. The crawler can connect to a few different services through their APIs and pull the associated metrics for every page it crawls. Keep in mind, though, that the API functionality is limited to the paid version, which costs £149 per year.

If you'd like to run the code, you can simply copy-paste it into your code editor, or download the files from GitHub (to avoid any formatting errors).