Hello, everybody!
I wanted to share a Python script I created to automate extracting Core Web Vitals metrics for multiple URLs. It gathers the essential metrics for both Desktop and Mobile from the Lighthouse API and saves them to an Excel file.
The cool thing is that the script fetches the reports asynchronously, so all of the URLs are processed almost simultaneously, which makes it very fast.
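The concurrent-fetch idea can be sketched like this. This is a minimal, stdlib-only illustration, not the script itself: it assumes the PageSpeed Insights `runPagespeed` endpoint (which is how Lighthouse reports are fetched programmatically), and the function and field names for the extracted metrics are my own.

```python
# Sketch: fetch Lighthouse reports for many URLs concurrently.
# Assumes the PageSpeed Insights v5 endpoint; helper names are illustrative.
import asyncio
import json
import urllib.parse
import urllib.request

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_report(url: str, strategy: str, api_key: str) -> dict:
    # Blocking HTTP call; wrapped in a thread below so requests overlap.
    query = urllib.parse.urlencode({"url": url, "strategy": strategy, "key": api_key})
    with urllib.request.urlopen(f"{API}?{query}") as resp:
        return json.load(resp)

def extract_metrics(report: dict) -> dict:
    # Pull the headline performance score plus two Core Web Vitals audits.
    lh = report["lighthouseResult"]
    audits = lh["audits"]
    return {
        "performance": lh["categories"]["performance"]["score"],
        "lcp_ms": audits["largest-contentful-paint"]["numericValue"],
        "cls": audits["cumulative-layout-shift"]["numericValue"],
    }

async def run_all(urls: list, api_key: str) -> list:
    # One task per (url, strategy) pair, so mobile and desktop for every
    # URL are in flight at the same time.
    tasks = [
        asyncio.to_thread(fetch_report, u, s, api_key)
        for u in urls
        for s in ("mobile", "desktop")
    ]
    reports = await asyncio.gather(*tasks)
    return [extract_metrics(r) for r in reports]
```

You would kick it off with `asyncio.run(run_all(urls, api_key))`; the speed-up comes from `asyncio.gather` overlapping the slow API calls instead of waiting for each one in turn.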
Here's how the script works:
Basic Usage:
- You'll need an API key for the Lighthouse (PageSpeed Insights) API.
- Edit the list of URLs that you want to analyze.
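The Excel export step might look something like this, assuming pandas is used for the write (the column names and file name are placeholders, not the script's actual ones):

```python
# Sketch: shape the collected metrics into a table and export it.
# Assumes pandas (with an Excel writer such as openpyxl installed).
import pandas as pd

COLUMNS = ["url", "strategy", "performance", "lcp_ms", "cls"]

def build_frame(rows: list) -> pd.DataFrame:
    # rows: one dict per (URL, strategy) pair with the extracted metrics.
    return pd.DataFrame(rows, columns=COLUMNS)

def save_excel(rows: list, path: str = "core_web_vitals.xlsx") -> None:
    # One sheet, one row per URL/strategy combination.
    build_frame(rows).to_excel(path, index=False)
```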
What I've implemented in my agency is as follows:
- Custom URL List: I customized the URL list to include the most important URLs from each category of our clients' websites.
- Data Storage in Google Sheets: I configured the script to save the data into a Google Sheets file. You can find a helpful tutorial for this process here.
- Scheduled Execution with Google Cloud Function: I set up a Google Cloud Function to run the script automatically every day. While I plan to create a tutorial on how to do this, there are existing tutorials on YouTube that can guide you through the process.
- Dashboard in Looker Studio: To visualize and analyze the metrics, I created a dashboard in Looker Studio.
- Alerts via Microsoft Power Automate: I set up a Power Automate workflow that emails me whenever a score drops significantly.
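For the Google Sheets variant, the write can be done with the gspread library and a service-account key; this is a hedged sketch, and the spreadsheet name, worksheet name, and helper names are placeholders:

```python
# Sketch: append the day's metrics to a Google Sheet.
# Assumes gspread (pip install gspread) and a service-account JSON key
# that has been shared on the target spreadsheet.
def rows_to_values(rows: list, columns: list) -> list:
    # gspread's append_rows expects a list of lists in column order.
    return [[row.get(c) for c in columns] for row in rows]

def append_to_sheet(rows: list, columns: list) -> None:
    import gspread  # imported lazily so the helper above stays stdlib-only

    client = gspread.service_account(filename="service-account.json")
    ws = client.open("Core Web Vitals").worksheet("daily")
    ws.append_rows(rows_to_values(rows, columns))
```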
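The alerting check itself is simple to sketch. This assumes the Power Automate flow uses the "When an HTTP request is received" trigger, so the script only has to POST a JSON body to the flow's URL; the webhook URL and the 10-point threshold below are placeholders:

```python
# Sketch: detect significant score drops and notify a Power Automate flow.
# The webhook URL and threshold are placeholders for illustration.
import json
import urllib.request

WEBHOOK_URL = "https://prod-00.westus.logic.azure.com/workflows/..."  # from the flow
THRESHOLD = 0.10  # alert on a drop of 10 points or more (scores are 0-1)

def find_drops(previous: dict, current: dict, threshold: float = THRESHOLD) -> dict:
    # Compare yesterday's scores to today's, keyed by URL.
    return {
        url: (previous[url], score)
        for url, score in current.items()
        if url in previous and previous[url] - score >= threshold
    }

def send_alert(drops: dict) -> None:
    # Power Automate parses this JSON body and sends the email.
    body = json.dumps({"drops": drops}).encode()
    req = urllib.request.Request(
        WEBHOOK_URL, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)
```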
I hope it can make your life easier by automating this type of analysis.
If you have any questions, feel free to ask! :)