URL Status Code Checker with Python

If you need to check the status codes of your Google Ads destination pages within a few hours, you can use a simple Python script. It even works for large accounts.

If you search for a way to check the status codes of your Google Ads destination pages, you will find some Google Ads scripts for checking URL status codes. These work for smaller accounts, as long as you stay within the UrlFetchApp limit of 20,000 URLs per day. Otherwise you have to partition your URL data, and it takes days to complete the status check of your full URL list. In many real-world scenarios this is not an option.

If you have to check your full landing page set within a few hours, e.g. when you launch a new website or shop system, you can use a simple Python script that does the job. On my machine it took 30 seconds to check a list of 100 URLs. That works out to about 12,000 status codes per hour.

To check your URL status codes, follow these steps:

1) Copy all your URLs to urls.csv, one URL per line. Put the file in the same folder as your Python script.
2) Run the script and wait.
3) Look at the result in urls_withStatusCode.csv. For every URL, an additional column with the HTTP status code has been added.

# status code checker
import csv
import time

import requests

SLEEP = 0  # time in seconds the script should wait between requests

url_list = []
url_statuscodes = []
url_statuscodes.append(["url", "status_code"])  # set the file header for the output

def getStatuscode(url):
    try:
        r = requests.head(url, verify=False, timeout=5)  # it is faster to only request the header
        return r.status_code
    except requests.RequestException:
        return -1  # request failed (timeout, connection error, ...)

# URL input from file
# use one URL per line
with open("urls.csv", newline="") as f:
    reader = csv.reader(f)
    for row in reader:
        if row:  # skip empty lines
            url_list.append(row[0])

# Loop over the full list
for url in url_list:
    url_statuscodes.append([url, getStatuscode(url)])
    time.sleep(SLEEP)

# Save the result
with open("urls_withStatusCode.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerows(url_statuscodes)

If this approach is still not fast enough, you can also run the status code checks in parallel. This improves the run time considerably. Please ask your IT department how many requests per second are acceptable.
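A minimal sketch of such a parallel version, using Python's ThreadPoolExecutor from the standard library (the function names, the worker count, and the `checker` parameter are illustrative, not part of the original script):

```python
# parallel status code checker (sketch) - helper names and worker count are illustrative
from concurrent.futures import ThreadPoolExecutor

import requests

def get_status_code(url):
    """Request only the header; return -1 on any request error."""
    try:
        return requests.head(url, verify=False, timeout=5).status_code
    except requests.RequestException:
        return -1

def check_all(urls, checker=get_status_code, max_workers=10):
    """Check all URLs concurrently; executor.map keeps the input order."""
    with ThreadPoolExecutor(max_workers=max_workers) as executor:
        return list(zip(urls, executor.map(checker, urls)))

# usage: rows = check_all(url_list)  # then write rows with csv.writer as before
```

The results come back in the same order as the input list, so you can write them to urls_withStatusCode.csv exactly as in the sequential script. Start with a low `max_workers` value and raise it only after checking how much load your servers tolerate.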
