
How to Run Automated Lighthouse Audits on WordPress Changes


You are probably familiar with this situation.

You put a lot of work into drastically improving the page speed and PageSpeed Insights scores of your client’s site.

Mobile scores are particularly difficult to improve!

Your client installs a new app or plugin and all of your work goes down the drain.

Can we go lower than 1?

This is a frustrating, but sadly common scenario.

Fortunately, there’s a version of Google Lighthouse that can perform audits on demand.



It is called Lighthouse CI.

[Screenshot: a Lighthouse CI run failing the performance budget after a merged change]

CI stands for Continuous Integration, a common practice in software development where changes from different developers are merged into a central codebase (repository).

One interesting aspect of CI is that you can run automated checks when changes are merged. This is a perfect place to perform automated page speed and SEO QA (quality assurance).

In the screenshot above, I defined a Lighthouse Performance Budget, and when I merged a change to the codebase, Lighthouse CI ran automatically and detected that the change would lower the speed performance.



This QA failure could actually prevent the change from getting deployed to production.

Really cool, right?

I set up a website using a modern stack, the JAMstack, which supports CI out of the box.

However, taking advantage of this requires completely changing platforms if you’re using WordPress or similar.

In this article, you’ll learn how to accomplish the same, but with a traditional website.

Specifically, we’ll use plain old WordPress, the most popular CMS on the planet.

Here is our technical plan:

  • We are going to create a GitHub repository to track WordPress changes.
  • We are going to set up a Lighthouse CI action to test changed URLs.
  • We are going to create a Google Cloud Function that runs on WordPress updates and does the following:
    • Gets the most recently changed URLs from the XML sitemaps.
    • Updates the Lighthouse CI action configuration to test those URLs.
    • Commits our updated configuration to the GitHub repository.
  • We are going to create and add a Lighthouse Performance Budget to determine when changes hurt performance.
  • We will review resources to learn more.

Creating a GitHub Repository

When your site is built using JAMstack technologies, you need a source control repository for the website code and content.

In our case, WordPress content resides in a database, so we’ll use the repository only for configuring Lighthouse and tracking changes.

One of the most valuable features of source control repositories is that all your changes are versioned. If your code stops working after new changes, you can always go back to previous versions.

GitHub is the most popular option and the one we’ll use here.


Once you create a repository, you’ll want to update it remotely from your local computer or scripts.



You can do that using the git command line tool.

Install it on your computer if you don’t have it.

As we’ll update the repo, we need to get an authentication token.


When you create the access token, select the scopes repo and workflow.

GitHub Actions

GitHub Actions allows for automating workflows using very simple configuration files.

One of the actions available is the Lighthouse CI Action, which we’ll use here.


In order to activate the action, we simply need to:



  • Create a folder (.github/workflows) in the root of the repository.
  • Add a YAML configuration file.

Let’s review the technical steps to do this.

We could perform the steps manually using the desktop GitHub app or the command line.

We will perform them from Python instead so we can automate the process.

Cloning a GitHub Repository

Before we can make changes, we need to clone/copy the repository locally.

Let’s install the Python library we’ll need to issue Git commands.

pip install gitpython

Next, I define some variables to indicate the repo, the access token we created above, and the local folder to store the copy.

I created a private repo called wordpress-updates.
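The variable definitions were lost from this copy of the article, so here is a minimal sketch of what they could look like. All values are placeholders you must replace with your own username, token, and folder:

```python
# Placeholder values -- substitute your own GitHub username, token, and paths.
github_token = "YOUR_TOKEN"
github_user = "YOUR_USERNAME"
repo_name = "wordpress-updates"
local_folder = "/content"  # e.g., the working directory in Google Colab

# Embedding the token in the remote URL lets git authenticate on push.
remote = f"https://{github_token}@github.com/{github_user}/{repo_name}.git"
full_path = f"{local_folder}/{repo_name}"
```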





Cloning from Python is almost as simple as from the command line.

from git import Repo

repo = Repo.clone_from(remote, full_path)

Updating the Cloned Repository

Once we have a local copy, we can edit files, create new ones, remove them, etc.

Let’s create a directory for our Lighthouse CI workflows.



%cd wordpress-updates/
!mkdir -p .github/workflows

Then, we will create the configuration YAML.

In Google Colab or Jupyter, I can use %%writefile

%%writefile .github/workflows/main.yml
name: Lighthouse CI

on: push

jobs:
  lighthouse:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Audit URLs using Lighthouse
        uses: treosh/lighthouse-ci-action@v3
        with:
          # replace example.com with your own URLs, one per line
          urls: |
            https://example.com/
          uploadArtifacts: true # save results as action artifacts
          temporaryPublicStorage: true # upload lighthouse report to temporary storage

You can find a copy of this configuration in this gist. I highlighted in bold the main areas of interest.

The most important is that there’s a section to specify the URLs to test, separated by new lines.

After we create this configuration file and directory, we can send our changes back to the GitHub repository.

First, let’s add the files we changed to the version history.

# Add the changed files, then provide a commit message
repo.index.add(['.github/workflows/main.yml'])
repo.index.commit('add lighthouse CI action.')

Our commit message will indicate the purpose of the change and will show up in the repository history.

Pushing Our Changes to the GitHub Repository

Finally, we’re ready to push our changes to the repository.

origin = repo.remote(name="origin")
origin.push()

You can open the repo and review the changes that were committed, but the user is listed as root.

We can configure our user with these commands.

with repo.config_writer() as git_config:
    git_config.set_value('user', 'email', 'your@email.com')
    git_config.set_value('user', 'name', 'Your Name')

The next time you push another change, you should see your name.


After you’ve completed all of these steps, you should be able to click on the Actions tab of your repository and find the automated checks on the URLs you listed in the YAML file.



Under the uploading section, you can find the links to the reports for each URL.

Really good!

Updating the YAML Configuration File Automatically

Hard coding a small list of URLs to test isn’t particularly flexible.

Let’s learn how to update the configuration file from Python.

But, first, we need a bigger list of URLs to test to really put this to good use.

What better place than the XML sitemaps?

Fortunately, I covered a great library from Elias Dabbas that makes this a breeze.

pip install advertools

import advertools as adv

# sitemap_url is your site's XML sitemap address
df = adv.sitemap_to_df(sitemap_url)

It creates a pandas data frame that I can easily filter to list only the pages updated after a date I specify.

For example, here I want the pages updated in October.

df[df["lastmod"] > '2020-10-01']


You can create a list of URLs to test using this or any criteria that make sense for your use case.
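As a sketch (using a hypothetical data frame; advertools puts each URL in a `loc` column), the list of recently changed URLs that the workflow-update code further down expects could be built like this:

```python
import pandas as pd

# Hypothetical frame mirroring advertools' sitemap_to_df output.
df = pd.DataFrame({
    "loc": ["https://example.com/old-post", "https://example.com/new-post"],
    "lastmod": ["2020-09-20", "2020-10-12"],
})

# Keep only URLs updated after the cutoff date (ISO date strings compare correctly).
modified = df[df["lastmod"] > "2020-10-01"]["loc"].tolist()
```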



Let’s say your site has millions of pages; checking every URL would be far from practical.

Assuming you have categorized XML sitemaps, an effective approach is to simply sample one or more URLs from each sitemap.

Most pages of the same type generally use the same HTML template, so the page speed scores won’t change much from one URL of that type to another.
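One way to sketch this per-sitemap sampling (column names follow advertools’ output, where `sitemap` holds the child sitemap each URL came from; the data is made up):

```python
import pandas as pd

# Hypothetical data frame: URLs tagged with the child sitemap they came from.
df = pd.DataFrame({
    "sitemap": ["posts.xml", "posts.xml", "pages.xml"],
    "loc": [
        "https://example.com/post-1",
        "https://example.com/post-2",
        "https://example.com/about",
    ],
})

# One representative URL per sitemap (i.e., per page template).
sample = df.groupby("sitemap")["loc"].first().tolist()
```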

Reading a YAML file

We can use the PyYAML library to read the configuration file we copied from the repo into a Python data structure.

pip install PyYAML

import yaml

with open(".github/workflows/main.yml", "r") as f:
    main_workflow = yaml.safe_load(f)  # safe_load avoids the deprecated default loader

This is what main.yml looks like when loaded into Python.

[Screenshot: main.yml loaded as a nested Python dictionary]

Updating the list of URLs is relatively simple from here.



# this is the "path" to the URLs
old_urls = main_workflow["jobs"]["lighthouse"]["steps"][1]["with"]["urls"]

Here are the steps to update the URL list.

  • Copy the current URLs to a variable, in case we want to keep them.
  • Convert our new list of URLs to a string where each URL is separated by a new line.
  • Assign our new list, and optionally the old one, to the dictionary value.

We can perform step 2 with this code.

"n".be a part of(modified)

Here is the final sequence.


main_workflow["jobs"]["lighthouse"]["steps"][1]["with"]["urls"] = old_urls + "\n" + "\n".join(modified)

If we don’t want to keep the old URLs, we can simply remove the code in bold letters.

Writing Back to the YAML File

Now that we’ve made our changes, we can save them back to the configuration file.

with open(".github/workflows/main.yml", "w") as f:

    f.write(yaml.dump(main_workflow, default_flow_style=False))

I had to add an additional directive, default_flow_style=False, in order to preserve the formatting of the URLs as close to the original as possible.

If you run the commands in the GitHub section again to add the main.yml file, commit, and push the change to the repo, you should see another Lighthouse CI run with an updated set of URLs.

This time, the URLs are not hardcoded but generated dynamically.




Creating a Lighthouse Performance Budget

One of the most powerful features of Lighthouse CI is the ability to test reports against budgets and fail runs when the budgets are exceeded.

That is actually the easiest step of this whole setup.

You can find all of the configuration options here.

We can write an example budget to start with, based on the example in the documentation, then adjust the values according to the failure/success report.



"path": "/*",

"resourceSizes": [


"resourceType": "document",

"budget": 18



"resourceType": "total",

"budget": 200





You can save the file to the root of the repository and update the YAML configuration to include it.

main_workflow["jobs"]["lighthouse"]["steps"][1]["with"]["budgetPath"] = "./budget.json"

When you commit the changes to the repository, make sure to also add the budget.json file.

That’s it.

We can run automated Lighthouse reports on any URLs we want.

But, how do we trigger all these steps when we update WordPress pages or posts?

Ping Services

We are going to put all of the Python code covered so far inside a Google Cloud Function that we can trigger when there are WordPress updates.



In summary, the code will:

  • Download the XML sitemaps and find the latest pages updated.
  • Update/create a main.yml workflow file with a list of URLs to test.
  • Commit the modified main.yml to the GitHub repository.

However, we only want to call our code when there are new changes in WordPress.

So, how do we do this?

Fortunately, WordPress has a Ping mechanism we can use for this.


We only need to add our Cloud Function URL to the list of Ping services.

I tried reading the payload WordPress sends to see if the updated URLs are included and, sadly, it always listed the home page and the site feed in my tests.


If WordPress sent the list of updated URLs, we could skip the XML sitemap downloading step.



Deploying the Cloud Function

Here are the steps to create the Cloud Function. Make sure to enable the service first.

First, we need to authenticate with Google Compute Engine.

!gcloud auth login --no-launch-browser
!gcloud config set project project-name

I created a gist with a working Cloud Function with all the steps to get this idea to work. Please read the code to personalize it with your details and GitHub repository.

I also created a gist with the requirements.txt file you need to include when deploying the Cloud Function.

You need to download both files and have them in the same folder where you’ll execute the next command.

I moved the GitHub credentials to environment variables. It isn’t a good idea to have them in source control.
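Inside the function, those credentials can be read back with `os.environ`. This is a sketch; the variable names match the deploy command that follows, and the repo path is a placeholder:

```python
import os

# The --set-env-vars flag of the deploy command defines these.
username = os.environ.get("username", "YOUR_USERNAME")
password = os.environ.get("password", "YOUR_TOKEN")

# Build an authenticated remote URL without hardcoding credentials in the code.
remote = f"https://{username}:{password}@github.com/{username}/wordpress-updates.git"
```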

!gcloud functions deploy wordpress_ping_post --runtime python37 --trigger-http --allow-unauthenticated --set-env-vars username=<your GitHub username>,password=<copy your GitHub personal token>

You should see output similar to the one below.

[Screenshot: gcloud deploy output, including the Cloud Function trigger URL]

I highlighted the URL you need to add to your WordPress Ping services list.



Resources to Learn More & Community Projects

I didn’t include the performance budget part in my Cloud Function. Please consider doing that as homework.

Also, I encourage you to set up a Lighthouse CI server and update the main.yml file to send the reports there.

If you want a more familiar interface, consider this project from the amazing team at Local SEO Guide. It uses Data Studio as the reporting interface.

Latest Python SEO Projects

The momentum of the Python SEO community keeps growing strong! 🐍🔥

As usual, I asked my followers to share their latest projects.

Charly Wargnier knocked it out of the park, with not just one project, but three impressive ones and another in the works.



Greg Bernhardt just released scripts to automate Lighthouse reports. Make sure to check out his site as he’s been posting practical Python scripts consistently.

Nichola Stott is working on automating the Wave API. Not Google Wave, though!

David Sottimano pitched the idea of doing a hackathon in 2021 and there’s already a lot of interest in it!

