Lighthouse audits running headless!

Lighthouse is an auditor created by Google to, as they say, "See how well your website performs. Then, get tips to improve your user experience."

It has always been an easy tool to use, but with plenty of caveats. You can run it straight from your Chrome browser to get results. This is great for getting an idea about some of the websites you visit, but it will not give you a standardized test or, therefore, standardized results. If you just run it on your computer, the results will vary depending on where you are, your Internet speed, and the amount of local Internet traffic while you are testing.

Developer interest

As a developer I always want to track the performance of a project on every change. On top of all the variables just mentioned, running this manually from a Chrome browser takes a lot of time.

And last, but not least, depending on the version of Lighthouse, the way the results are calculated can differ quite a bit. And your version of Lighthouse depends on your version of Chrome.

When I (pseudo) recently saw the release of Lighthouse version 6, I wondered how I could make it work without having to do anything by hand. It turns out (yeah, it took me a long time to figure this out) it is a public GitHub repository! This deserves a big title =>

Lighthouse 6 was just (ish) released!

Great, so you can just download it and run it on your computer directly without having to click everywhere in Chrome! Indeed you can, but as a bit of a server person I also wanted this to not need or touch a local Chrome.

This is why I started out trying to get Lighthouse to run inside a Docker container, without worrying about having a local Google Chrome to control.

Having a Docker container

In order to do this I created a Docker image that installs Lighthouse and a headless Chromium:

FROM node:13-alpine

# Installs the latest Chromium package.
RUN echo "" > /etc/apk/repositories \
    && echo "" >> /etc/apk/repositories \
    && echo "" >> /etc/apk/repositories \
    && echo "" >> /etc/apk/repositories \
    && apk upgrade -U -a \
    && apk add --no-cache \
    libstdc++ \
    chromium \
    harfbuzz \
    nss \
    freetype \
    ttf-freefont \
    wqy-zenhei \
    bash \
    && rm -rf /var/cache/* \
    && mkdir /var/cache/apk

RUN mkdir -p /usr/src/app

WORKDIR /usr/src/app

RUN wget -P /usr/src

ENV CHROME_PATH=/usr/lib/chromium/

RUN yarn global add lighthouse

COPY audit /usr/local/bin/audit
RUN chmod +x /usr/local/bin/audit
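With that Dockerfile in place, the image can be built and, optionally, pushed to a registry. A minimal sketch (the codebuds/lighthouse tag matches the commands used later in this article; any tag works):

```shell
# Build the image from the directory containing the Dockerfile and the audit script
docker build -t codebuds/lighthouse .

# Optionally push it to a registry so other machines (or CI runners) can pull it
docker push codebuds/lighthouse
```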

Creating a custom script to make sure every run does what needs to be done

That was a long subtitle. As you can see at the end of the Dockerfile:

COPY audit /usr/local/bin/audit
RUN chmod +x /usr/local/bin/audit

This makes sure that a custom bash script is available to run when you start the container.

The script is the following:


#!/bin/bash
usage() { echo "Usage: $0 [-u <string>] [-p <string>] [-o <string>] [-n <string>]" 1>&2; exit 1; }
defaultOptions='--chrome-flags="--headless --no-sandbox" --no-enable-error-reporting'
while getopts ":u:p:o:n:" o; do
  case "${o}" in
    u) url=${OPTARG} ;;
    p) IFS=',' read -r -a paths <<< "${OPTARG}" ;;
    o) output=${OPTARG} ;;
    n) name=${OPTARG} ;;
    *) usage ;;
  esac
done
shift $((OPTIND-1))

if [ -n "$url" ]; then
  if [ -n "$name" ]; then
    outputPath=" --output-path ./${name}.report.html"
  fi
  eval lighthouse "$url" "$defaultOptions" "$outputPath"
  if [ -n "${paths[*]}" ]; then
    for path in "${paths[@]}"; do
      if [ -n "$name" ]; then
        filename=$(echo "$name""$path" | tr / _)
        outputPath=" --output-path ./${filename}.report.html"
      fi
      eval lighthouse "$url""$path" "$defaultOptions" "$outputPath"
    done
  fi
else
  usage
fi

If you want to quickly scan a simple site:

docker run --rm -v ~/reports:/usr/src/app codebuds/lighthouse audit -u https://site.mine

This will run the audit and then copy the report from within the container to the /home/{your name}/reports directory on your computer.

This was a great start, as it takes about 20 seconds on my computer instead of the 2 minutes when running it straight from my Chrome browser.

But I also wanted to make it simple to test paths within the site by providing the base URL and the sub-paths to follow. So I added a -p option that lets you add one or multiple sub-paths separated by a ,. For example:

docker run --rm -v ~/reports:/usr/src/app codebuds/lighthouse audit -u https://site.mine -p /en/blog/traefik-intro,/en/contact

This will create all the results in your volume but, by default, Lighthouse only uses the base path and a date for the filenames. So when running the previous command, the three final files in the ~/reports directory will all be named along the lines of site.mine_<date>.report.html, differing only in their timestamps.

That does not help to figure out which file is related to which page or sub-page. This is why I have added another option: a default basename for the files. With the -n option you can set a default name that takes the sub-path names into account (and replaces / with _, as / is a problem in filenames).
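The renaming is just a tr call; here is a minimal sketch of that logic, using the same example values as in the commands below:

```shell
# Sketch of how the script builds a filename from the -n value plus a sub-path:
# every / in the path is replaced by _ so the result is a valid filename.
name="site.mine"
path="/en/blog/traefik-intro"
filename=$(echo "$name$path" | tr / _)
echo "$filename"   # site.mine_en_blog_traefik-intro
```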

docker run --rm -v ~/reports:/usr/src/app codebuds/lighthouse audit -u https://site.mine -p /en/blog/traefik-intro,/en/contact -n site.mine

Will give you:

site.mine.report.html
site.mine_en_blog_traefik-intro.report.html
site.mine_en_contact.report.html

Which makes it much easier to find the file that corresponds to the page tested.

If you also want the domain name and a date in the local filenames, let me know and I'll add some options; however, this is not my use case. I'll tell you why.


Here we are! I want most things in my life to be automatic. I have a server running GitLab, and you can tell it to run many things as soon as you push changes to any of your code. Doing this is simple: just add a job to the .gitlab-ci.yml file in your project, like the following:

stages:
  - main
  - audit

audit:
  stage: audit
  image: codebuds/lighthouse
  when: on_success
  script:
    - audit -u https://site.mine -p /en/blog/traefik-intro,/en/contact -n site.mine
  artifacts:
    paths:
      - ./*.report.html
  environment:
    name: develop
  only:
    - develop

Of course, this is the job you want to run last, when all the building and deploying has been done.

Then, if it has worked (yay!), you can find the files in your GitLab CI/CD jobs and download them (or, if you are fancy, serve them with GitLab Pages).
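If clicking through the UI gets old, the reports can also be fetched through GitLab's job artifacts API. A sketch, where the host, project ID, job ID, and token are all placeholders for your own values:

```shell
# Download the artifacts archive (containing the *.report.html files)
# of a finished job via the GitLab API.
curl --header "PRIVATE-TOKEN: <your-token>" \
  --output artifacts.zip \
  "https://gitlab.example.com/api/v4/projects/<project-id>/jobs/<job-id>/artifacts"
```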

A lot more to do

As I mentioned, I love automating stuff. Some of it has been done, but I can see a lot more to do. Instead of having to view or download any of the audit pages, I could set Lighthouse to JSON mode and upload the data to a server. Maybe then make it look at the difference between the new build and the last one and trigger an alert on Mattermost thanks to our bundle. But so far it is already saving a lot of time and helping us keep our sites in good condition.
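As a sketch of that idea: Lighthouse can emit the same report as JSON with --output json, and the category scores could then be extracted and compared between builds. The sed call below is a quick stand-in for a real JSON parser, run here on a hard-coded sample instead of an actual report:

```shell
# Generate a JSON report instead of HTML (same flags as the audit script):
#   lighthouse https://site.mine --output json --output-path ./site.report.json \
#     --chrome-flags="--headless --no-sandbox"

# Then pull out a score for comparison; a hard-coded sample stands in for the report:
report='{"categories":{"performance":{"score":0.97}}}'
score=$(echo "$report" | sed -n 's/.*"performance":{"score":\([0-9.]*\)}.*/\1/p')
echo "$score"   # 0.97
```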

Don't hesitate to ask any questions in the comments or to create issues on the GitLab repository.

Thanks for reading, see you soon.