Service Workers

JAMstack + Service Workers


Overview

This project was developed to implement service workers, and other PWA functionality, in a specific JAMstack architecture that I use for web development at work.

Most JS documentation on service workers (or any other feature, for that matter) only shows minimal examples of each feature or method—in a very piecemeal way—leaving you to put it all together. Instead, I wanted to provide a real-world example of implementing service workers, and the challenges I faced.

The JAMstack architecture is a setup using Jekyll, Nodejs, Webpack + sass-loader + Corejs + Babel + ES6, SASS, autoprefixer, etc. This architecture lets us write and maintain a modular SASS setup for styling, and ES6 that is polyfilled and transpiled into JS with greater browser support.

The project is set up to use the Babel presets defined in package.json—specifically, Corejs v3 with ES proposals. Browser support is defined in the "browserslist" field. Because I need to support IE10+ at work, I let "useBuiltIns": "usage" handle the polyfilling, as opposed to importing individual polyfills where I use them.
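For reference, here is a minimal sketch of what that configuration might look like in package.json. The exact browser targets and preset options below are illustrative—check the project's actual package.json for the real values:

{
  "browserslist": [
    "defaults",
    "ie >= 10"
  ],
  "babel": {
    "presets": [
      [
        "@babel/preset-env",
        {
          "useBuiltIns": "usage",
          "corejs": { "version": 3, "proposals": true }
        }
      ]
    ]
  }
}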

Our build process is handled using npm scripts (npm run <SCRIPT_NAME>) and the npm package npm-run-all, which gives us the ability to run npm scripts sequentially and in parallel.

Using npm-run-all, we can run our Jekyll and Webpack commands in parallel (a sketch of the scripts follows the list below):

  • Jekyll handles the local server with its --livereload option and, of course, compiles the static site files.
  • Webpack handles transforming our custom ES6 files into cross-browser JS, and compiles, prefixes, and minifies custom SCSS into stylesheets.
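As a rough sketch, the scripts section of package.json could be wired up like this using npm-run-all's run-p helper (the script names below are illustrative, not the project's actual names):

{
  "scripts": {
    "jekyll:serve": "bundle exec jekyll serve --livereload",
    "webpack:watch": "npx webpack --watch",
    "development": "run-p jekyll:serve webpack:watch"
  }
}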

We utilize our own custom Webpack plugin (built for Webpack 5) to generate a YAML file.
This YAML file contains the hash of the main bundle (the same hash—[fullhash]—that is accessed inside a webpack.config.js file using filename: '[name].[fullhash].js').

To be clear, when I use the word “hash” in this document, I’m referring to the hash (i.e. the random-looking string of characters) that Webpack calculates from the files it consumed (e.g. c1ffa09121e3327be06c).

The two most important APIs used are Google Sheets and Google Docs. I often use the Google Sheets API when I have content/information that updates frequently. Instead of running an entire site build to generate new HTML, the Sheets API allows me to “fetch” new data (held in the spreadsheet) on the fly as the user loads the page.
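As a hypothetical sketch (the spreadsheet ID, range, and API key below are placeholders, not values from this project), a client-side call against the Sheets API v4 looks roughly like this:

// Hypothetical example — fetch rows from a sheet with the Google Sheets API v4.
const SPREADSHEET_ID = 'YOUR_SPREADSHEET_ID';
const RANGE = 'Sheet1!A1:D50';
const API_KEY = 'YOUR_API_KEY';

fetch(`https://sheets.googleapis.com/v4/spreadsheets/${SPREADSHEET_ID}/values/${RANGE}?key=${API_KEY}`)
  .then((response) => {
    if (!response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`);
    }
    return response.json();
  })
  .then((data) => {
    // `data.values` is a 2D array of rows and cells returned by the API
    console.log(data.values);
  })
  .catch((err) => console.error('Error fetching sheet data', err));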

Project Dependencies

Nodejs + Ruby

To run this project locally you will need a Linux-like environment (I use a Mac running Big Sur) with Nodejs v14 and Ruby-2.6.3. I use Ruby Version Manager (RVM) and Node Version Manager (NVM) to install and manage Ruby and Nodejs versions.

The project has an .nvmrc file. If you want to use it, and have cowsay installed (via Homebrew), add the following to your .functions dotfile:

## Use a local .nvmrc file if present
## From: https://stackoverflow.com/a/48322289
enter_directory() {
  if [[ $PWD == $PREV_PWD ]]; then
    return
  fi

  PREV_PWD=$PWD
  [[ -f ".nvmrc" ]] && nvm use > /dev/null 2>&1 && nvm use | cowsay $n
}

export PROMPT_COMMAND=enter_directory

Jekyll

In addition to Ruby-2.6.3, you will need to install Jekyll. See https://jekyllrb.com/docs/installation/ for the latest method of installation.

Install Project Dependencies

To run the local build commands, you will need to install the project's npm and gem dependencies. Run the following commands to install them:

npm i && bundle install

The package.json file will install the Webpack, Babel, Corejs, Bootstrap & Popperjs, colors (for colorful console output), and npm-run-all Nodejs dependencies. The Gemfile installs jekyll v4 and rouge v3.26 (for syntax highlighting).

Running the builds

The project has a development build and a production build. Both use the jekyll serve --livereload command for Jekyll, and the npx webpack command for Webpack. These two commands run in parallel when you run either the production or development script. Use npm run production or npm run development to start the commands in parallel.

When you are finished with previewing the build (navigate to http://localhost:3000 in your browser to preview), use control + C to stop the running commands. Both production and development builds watch for file changes, so the processes need to be stopped manually.

After changes are made in a development build, they should be tested and previewed in a production build. Use npm run production and navigate to http://localhost:3000 in your browser. Only an npm run production build should be used to commit changes to GitHub—DON’T PUSH A DEVELOPMENT BUILD.

Development Build

Running the dev build (npm run development) sets a development environment variable for Nodejs. The webpack.config.js file checks for the variable and creates a development version of the bundled JS, with inline CSS that Webpack injects into the document <head> (via <style> tags). The development bundles are easier to debug and read, and the inline styling results in a faster development environment.

Production Build

The production build (npm run production) creates a minified production version of the bundled JS and a separate CSS file with the bundle’s hash in its filename (main.[fullhash].css.)
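Here is a minimal sketch of how that environment check could look in webpack.config.js—assuming the npm scripts set NODE_ENV, and assuming mini-css-extract-plugin and style-loader handle the CSS output (the project's actual config may differ):

// Hypothetical sketch — not the project's actual webpack.config.js
const MiniCssExtractPlugin = require('mini-css-extract-plugin');

const isProduction = process.env.NODE_ENV === 'production';

module.exports = {
  mode: isProduction ? 'production' : 'development',
  output: {
    filename: '[name].[fullhash].js',
  },
  module: {
    rules: [
      {
        test: /\.scss$/,
        use: [
          // Production: emit a separate, hashed CSS file. Development: inject inline <style> tags.
          isProduction ? MiniCssExtractPlugin.loader : 'style-loader',
          'css-loader',
          'sass-loader',
        ],
      },
    ],
  },
  plugins: isProduction
    ? [new MiniCssExtractPlugin({ filename: '[name].[fullhash].css' })]
    : [],
};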

Project Features

To follow along with this documentation, it will help to know about some of the project's features, which are discussed in this section.

It is especially important to understand my custom Webpack 5 plugin WebpackHashFilePlugin.js, as this documentation addresses some of the real-world challenges of implementing service workers.

ES6 Modules

Exports

This project setup uses Webpack 5 to transpile and polyfill any custom ES6—the same setup I use at work. We use a specific ES6 module pattern to keep code organized, reusable, and easy to maintain:

  • A module should only export one function (its export default <myModulesMainFunction>;).
  • All other functions are to be internal—they are defined and referenced only from within the module’s scope.
  • A module should contain code related to a single task, functionality, or solution. No unrelated code.
// If this module depends on other JS modules, add any static imports here:
import someFunction from './someModule.js';

// Define variables available to this module's scope at the top:
const SOME_IMPORTANT_CONSTANT = 'messing with me may break things.';

function someInternalTask(param) {
  // Code related to this small task
  someFunction(param);
}

function myDefaultExportFunction() {
  // More function calls and code for this module ...
  someInternalTask(SOME_IMPORTANT_CONSTANT);
  // ...
}

export default myDefaultExportFunction;

If you have a very small/simple module, you can also define the default export inline with the function's declaration, like this:

export default function myDefaultExportFunction() {
  // More JS ...
}

Imports

The project is also capable of dynamic imports and lazy-loading of modules.
Static imports (import moduleName from './moduleName';) must be at the top of any JS file. The ES6 import() method, however, can be used anywhere—even inside an if-statement:

const MY_CONST = 'some important string.';
const myRandomTest = (MY_CONST.search(/important\sstring\./g) !== -1);
let param = undefined;

if ( myRandomTest ) {
  import('./myModule').then((module) => {
    const defaultFunction = module.default; // Reference the module's default function

    defaultFunction(param);
  });
}

The module's default function needs to be defined after the import. The shortest way to do this is inline, in the .then() callback of the import:

window.addEventListener('load', () => {
  import('./myModule').then(({default: defaultFn}) => { // Much more concise
    // More code ...
    defaultFn();
  });
});


Important Information

NOTE: You cannot mix different module syntaxes (i.e. CommonJS + ES6 modules).

If you want to use CommonJS, or some other module syntax, you will need to change all the export/import statements for the bundle entrypoint file and its module dependencies.
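For reference only, here is roughly what the CommonJS equivalents of the ES6 pattern above would look like (again, don't mix the two syntaxes in one bundle):

// someModule.js — CommonJS version of `export default someFunction;`
function someFunction(param) {
  console.log(param);
}
module.exports = someFunction;

// main.js — CommonJS version of `import someFunction from './someModule.js';`
const someFunction = require('./someModule.js');
someFunction('hello');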

Do not deviate from the ES6 module syntax, or try to export multiple functions from a single module, or Webpack may throw an error and exit the build command.


Cache Busting with Webpack + Jekyll

// Custom Webpack plugin to output the hash into a file.
// Filename/location: `./builtools/WebpackHashFilePlugin.js`
 
// 1.) Require any dependencies here:
const fs = require('fs'); // Built into Nodejs - no installation needed
const path = require('path'); // Built into Nodejs - no installation needed

// 2.) Create the plugin's class constructor and default options:
// Webpack 5 plugins require a class with a constructor and an options object for setting defaults
class WebpackHashFilePlugin { // Class constructor
  constructor (options){
    // 3.) Assign some default options which can be overridden when the plugin is initialized.
    this.options = Object.assign({}, {
      fileName: 'hash.yml',
      path: '../_data/'
    }, options)
  }

  // 4.) The Webpack plugin's class needs an `apply(compiler)` method to hook into the `compiler`
  apply(compiler) {
    const options = this.options;
    // 5.) Here we add a 'hook' to 'tap' into the webpack-compiler's `stats` object. Pretty standard Webpack plugin code.
    compiler.hooks.done.tap("WebpackHashFilePlugin", stats => {
      // 6.) Set up some variables for accessing the plugin's options.
      const content = stats.hash; // The string held within this `const` will become the contents of the written file.
      const outputPath = options.path; // String representing the directory to create the file in (relative to the plugin file's location, e.g. '../_data/').
      const fileName = options.fileName; // String for the filename + extension (e.g. 'hash.yml').
      // 7.) Use the Nodejs built-in `path.resolve()` to resolve the filename and location.
      const output = path.resolve(__dirname, outputPath, fileName);
      // 8.) Use the Nodejs built-in `fs.writeFile()` to write the hash file containing the bundle's hash.
      fs.writeFile(output, content, (err) => { // `fs.writeFile()` takes 3 params: the path (including filename/extension/location), the contents, and an error callback-function
        if (err) {
          console.error(err)
          return
        }
        // Log some info if the file is written successfully
        console.log(`Plugin: {WebpackHashFilePlugin} created: ${output}`);
        console.log(`Hash: ${content}`);
      });
   });
  }
};
// 9.) Export the plugin's class
module.exports = WebpackHashFilePlugin;

Because the hash is calculated from the bundle’s files, it is guaranteed to be a new value whenever the source files change. That makes Webpack’s hash perfect for cache-busting purposes.

In the ./webpack.config.js file we can dynamically name files. In the config.output.filename setting, you can use [fullhash] in the filename string (example below):

const config = {
  output: {
    filename: '[name].[fullhash].js',
  },
  // More configuration below...
}

module.exports = config;

We use a hash.yml file, with the hash as its content, to accomplish the following:

  • The YAML file is written to _data/hash.yml—a Jekyll data folder—triggering the Jekyll build to update:
    • This causes Jekyll to copy any changes in the ./assets/* folder and subfolders into the site build.
    • This also causes the local server to update via Jekyll’s livereload.
  • The hash is also built into any JS script or CSS link elements (via Jekyll’s Liquid abilities and code similar to this: <link href="/dist/main.{{ site.data.hash }}.css">):
    • {{ site.data.hash }} allows us to access the contents of the hash file, since ./_data/ is a Jekyll folder with a special purpose.
    • Any styling or JS changes will output a new ./_data/hash.yml which is built into new files/filenames.
    • If the above link element used the hash c1ffa09121e3327be06c, it would become:
      <link href="/dist/main.c1ffa09121e3327be06c.css"> in the HTML markup.

To use the above original plugin file, you need to require the plugin and add some configuration options from within our webpack.config.js file:

// Filename/location: `./webpack.config.js` file for Webpack v5
// 1.) Require the plugin:
const WebpackHashFilePlugin = require('./buildtools/WebpackHashFilePlugin'); // Our custom plugin found in `/buildtools`

// 2.) Add the plugin to your plugins array:
const plugins = [
  new WebpackHashFilePlugin({ // Initialize our custom plugin
    path: '../_data/',    // Directory to write the file into (relative to the plugin file's location)
    fileName: 'hash.yml'  // Filename including the '.yml' extension
  }),
];

const config = {
  plugins, // 3.) Reference the plugins array in the config
  entry: {
    'main': './assets/js/src/main.js',
  },
  // More Webpack configuration code here ...
}

module.exports = config;


Important

IMPORTANT: On your first build using the WebpackHashFilePlugin.js plugin, you may get an error saying that the hash file does not exist.

If this happens, simply create two blank files in the locations where the hash files will be built:

  • _data/hash.yml
  • hash.json

Then, run another build.

Why do it this way?
It avoids giving the plugin unnecessary permissions. Giving file-write permission where it can be avoided is not a safe practice. Why open up potential vulnerabilities when you can avoid it?

Service Workers Challenges

 serviceworkers/
      |__ assets/
      |     |__ js/
      |         |__ dist/
      |         |     |-- 29.c1ffa09121e3327be06c.js
      |         |     |-- 314.c1ffa09121e3327be06c.js
      |         |     |-- 671.c1ffa09121e3327be06c.js
      |         |     |-- 909.c1ffa09121e3327be06c.js
      |         |     |-- main.c1ffa09121e3327be06c.css
      |         |     |__ main.c1ffa09121e3327be06c.js
      |         |
      |         |__ src/
      |              |-- main.js
      |              |__ registerServiceWorker.js
      |
      |__ serviceworker.js

As you can see above, our cache-busting mechanism (using a random hash in the JS and CSS filenames) makes it a little challenging to cache those assets in a service worker. To cache the files in the service worker, we need each file's name and path—and our filenames are not predictable. So I needed to find a way to pass the hash created by Webpack to the service worker.

My initial idea was to use the navigator.serviceWorker.controller.postMessage() method to send the hash to the service worker. I would first need to get the bundle’s hash into my custom JS file (and then hand it off to the service worker via postMessage()). Instead of creating another solution, I decided to utilize my existing custom Webpack plugin.

I would need to modify the Webpack plugin to be able to output two different files—one for the existing _data/hash.yml file, and a second, new hash.json file.

I could then access this hash.json file from within my custom JS file that registers/installs the service worker. This, I thought, would allow me to send the hash to the service worker using the postMessage() method.
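For reference, a rough sketch of that hand-off (the shape of the message is illustrative). One catch is that navigator.serviceWorker.controller is null until a worker is actually controlling the page, which makes the timing of this approach tricky:

// In the page's JS (e.g. after registration), read hash.json and post it to the worker:
fetch('/hash.json')
  .then((response) => response.json())
  .then(({ hash }) => {
    if (navigator.serviceWorker.controller) {
      navigator.serviceWorker.controller.postMessage({ hash });
    }
  });

// In `serviceworker.js`, listen for the message:
self.addEventListener('message', (event) => {
  const hash = event.data.hash;
  // ... use the hash to build the list of hashed filenames to cache
  console.log('Received hash: ' + hash);
});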

After much difficulty trying to get the navigator.serviceWorker.controller.postMessage() method to hand off the hash, I decided I would go about the problem a little differently.

For making a request and fetching a resource, use the fetch() method. It is implemented in multiple interfaces, specifically Window and WorkerGlobalScope. This makes it available in pretty much any context you might want to fetch resources in.

Source: https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API#concepts_and_usage

I remembered from reading a lot of service worker documentation on MDN that the Fetch API is among the APIs you can use from within a service worker—and that the synchronous XMLHttpRequest method would not work.

So, instead of trying to hand off the hash from the registering JS to its service worker, I thought it would be simpler to access a ./hash.json file from the service worker’s own context via WorkerGlobalScope.fetch().

Adding a Fetch API call to the service worker’s installation event is straightforward:

fetch('/hash.json').then(response => {
  if (!response.ok) {
    throw new Error(`HTTP error! status: ${response.status}`);
  }
  return response.json();
}).then((json) => {
  const hash = json.hash; // The hash is stored under the JSON object's hash key

  // Use the Webpack hash here
  console.log(hash);
  // Expected logged result: 'c1ffa09121e3327be06c' (or similar random string of the same length)
})

The Solution

To implement this idea, I need to modify the WebpackHashFilePlugin.js plugin so that it can create both the existing ./_data/hash.yml file and a new ./hash.json file in valid JSON format.

The hash.json file, along with the Fetch API, will allow us to access the bundle’s hash from within the service worker’s scope. We can then use this hash to reconstruct the filenames for caching in our app.

For this, I need to make the following modifications to the plugin's JS file:

  • Create another plugin option (with a default setting) to specify a file-extension separate from the file-name:
    • This will allow us to create either a JSON or YAML file.
  • Modify the output file contents to create a valid JSON file (if the extension is json).

The original plugin file is shown above. It’s pretty easy to figure out what is going on in the plugin if you're familiar with JS. Take a look at the file and my inline comments.

WebpackHashFilePlugin.js modifications

The modifications needed to add the fileExtension option and JSON support are shown below. Lines of code that stay the same are commented out via JS inline comments (// Inline comment):

// class WebpackHashFilePlugin{
//    constructor (options){
        this.options = Object.assign({}, {
          fileName: 'hash',     // Remove the '.yml' extension from the default fileName
          path: '../_data',     // The path stays a directory, relative to the plugin file's location
          fileExtension: 'yml'  // Add a `fileExtension` key with a value set to the file's extension (e.g. 'yml')
       }, options)
//   }

//  apply(compiler) {
//    const options = this.options;
//    compiler.hooks.done.tap("WebpackHashFilePlugin", stats => {

      let content; // Define content as an undefined `let` instead of a `const`.

      if (options.fileExtension == 'json') { // If the `fileExtension` option is set to 'json' ...
        content = `{\n  "hash": "${stats.hash}"\n}`; // ... Wrap the hash in an object with a `"hash"` key set to the hash.
      }

      if (options.fileExtension == 'yml') { // If the `fileExtension` option is 'yml' ...
        content = stats.hash; // ... Set the file contents to the hash itself.
      }

//    const outputPath = options.path;
//    const fileName = options.fileName;
      const extension = options.fileExtension;
      const output = path.resolve(__dirname, outputPath, `${fileName}.${extension}`); // Add the fileExtension - `${fileName}.${extension}`
//    fs.writeFile(output, content, (err) => {
//      if (err) {
//        console.error(err)
//        return
//      }
        // File written successfully
//      console.log(`Plugin: {WebpackHashFilePlugin} created ${output}`);
//      console.log(`Hash: ${content}`);
//      });
//   });
//  }
// };

// module.exports = WebpackHashFilePlugin;

The entire plugin file should look like this:

// File: `./buildtools/WebpackHashFilePlugin.js`
// Save this file in the project's `./buildtools/` directory as `WebpackHashFilePlugin.js`

const fs = require('fs');
const path = require('path');

class WebpackHashFilePlugin {
  constructor (options){
    this.options = Object.assign({}, {
      fileName: 'hash',
      path: '../_data',
      fileExtension: 'yml'
   }, options)
  }

  apply(compiler) {
    const options = this.options;
    
    compiler.hooks.done.tap("WebpackHashFilePlugin", stats => {
      let content;

      if (options.fileExtension == 'json') {
        content = `{\n  "hash": "${stats.hash}"\n}`;
      }
      if (options.fileExtension == 'yml') {
        content = stats.hash;
      }

      const outputPath = options.path;
      const fileName = options.fileName;
      const extension = options.fileExtension;
      const output = path.resolve(__dirname, outputPath, `${fileName}.${extension}`);
      
      fs.writeFile(output, content, (err) => {
        if (err) {
          console.error(err)
          return
        }
        console.log(`Plugin: {WebpackHashFilePlugin} created ${output}`);
        console.log(`Hash: ${content}`);
      });
    });
  }
};

module.exports = WebpackHashFilePlugin;

Next, our webpack.config.js file needs to be updated with the new plugin options. We also need to create a second instance of the plugin in order to generate another hash file (i.e. hash.json):

// Filename/location: `./webpack.config.js` file for Webpack v5
// 1.) Require the plugin:
const WebpackHashFilePlugin = require('./buildtools/WebpackHashFilePlugin'); // Our custom plugin found in `/buildtools`

// 2.) Add the plugin to your plugins array:
const plugins = [
  new WebpackHashFilePlugin({ // One instance of the plugin for the YAML file
    path: '../_data/',
    fileName: 'hash',    // Remove the '.yml' extension from fileName.
    fileExtension: 'yml' // Add a fileExtension option
  }),
  new WebpackHashFilePlugin({ // Another instance of the plugin for the JSON file
    path: '../',          // Write this file into the project's root
    fileName: 'hash',     // fileName without an extension
    fileExtension: 'json' // Use the 'json' extension for the service worker's hash file
  }),
];

const config = {
  plugins, // 3.) Reference the plugins array in the config
  entry: {
    'main': './assets/js/src/main.js',
  },
  // More Webpack configuration code here ...
}

module.exports = config;

Now we can create a service worker JS file, and an ES6 module, to cache network calls.

Creating the Service Worker


Now let’s dig into the details of creating our service worker, registering and installing it, and caching assets.


registerServiceWorker.js Module

To create the code that will register our service worker, I created a new JS file named registerServiceWorker.js. I created a default export function with the same name as the module filename (i.e. function registerServiceWorker()):

function registerServiceWorker () {

}

export default registerServiceWorker;

Let’s use this module to register our service worker, which will be named ./serviceworker.js and located in the project’s root. When registering a service worker you also define its scope. Its scope is simply where, within the website’s directory structure, the worker is allowed to operate.

The service worker has to be in the project’s root so that its scope can include all the files hosted from the site’s origin. If the service worker’s JS file were within a subfolder, the worker’s scope would be limited to that directory (and its subdirectories).

To do the registration we need to use the
navigator.serviceWorker.register('./<SERVICE_WORKER_FILE>.js', { scope: './' }) method, where the first parameter is the service worker’s filename/location. The second (optional) parameter is an object used to configure the worker’s scope.

Be sure to use any serviceWorker methods inside a check for service worker/browser compatibility. That way we won’t break the website for users who don’t have service worker support in their browser. This also helps our website adhere to the progressive web app (PWA) philosophy of progressive enhancement: users whose browsers support modern features (like service workers) get an enhanced experience, while the underlying functionality remains available to as many people as possible.

// File: `./assets/js/src/registerServiceWorker.js`

function registerServiceWorker () {
  // 1.) Check if the browser supports service worker:
  if ('serviceWorker' in navigator) {
    // 2. Register the worker:
    // We can omit the optional scope parameter, it will default to a scope matching the worker's parent folder (and any sub-folders.)
    navigator.serviceWorker.register('./serviceworker.js')
    .then((reg) => {
      // registration worked
      console.log('Registration succeeded. Scope is ' + reg.scope);
    }).catch((error) => {
      // registration failed
      console.log('Registration failed with ' + error);
    });
  }
}

export default registerServiceWorker;

The register() method returns a Promise. This means we can tack on .then() and .catch() callbacks to run code after registration succeeds and/or after an error.

The last thing to do is import this module into the bundle’s entrypoint (./assets/js/src/main.js). Remember that service workers are not executed on the main JS thread; they run in the background on their own thread:

// File/location: `assets/js/src/main.js`
import '../../scss/main.scss';

document.addEventListener('DOMContentLoaded', () => {
  // Do not run service workers (or our module) inside of a 'DOMContentLoaded' listener!!
  // Essential JS ...
});

window.addEventListener('load', () => {
  // Load the service worker's module after page load with the ES6 `import()` method.
  import('./registerServiceWorker')
    .then( ({default: registerServiceWorker}) => registerServiceWorker() ) // Define modules' default and execute right away
    .catch( (err) => console.error('Error loading module', err) )
});

If we wanted to be a little more clever, we could check for service worker compatibility before we ever import the module. To do this, just remove the service worker check from ./registerServiceWorker.js, and add it into ./main.js, before we use the ES6 import:


Important Information

NOTE: The code block below shows edits on two files:

  • registerServiceWorker.js
  • main.js


// Module that registers the service worker:
// File/location: `assets/js/src/registerServiceWorker.js`

function registerServiceWorker () {
  // Removed the: `if ( 'serviceWorker' in navigator )` check.
  navigator.serviceWorker.register('./serviceworker.js')
  .then((reg) => {
    // registration worked
    console.log('Registration succeeded. Scope is ' + reg.scope);
  }).catch((error) => {
    // registration failed
    console.log('Registration failed with ' + error);
  });
}

export default registerServiceWorker;

// Main entrypoint for JS bundle:
// File/location: `assets/js/src/main.js`

import '../../scss/main.scss';

window.addEventListener('load', () => {
  // Check for service worker compatibility so that users who don't need it...
  // won't unnecessarily import the module.
  if ('serviceWorker' in navigator) {
    import('./registerServiceWorker')
      .then(({ default: registerServiceWorker}) => registerServiceWorker())
      .catch((err) => console.error('Error loading module', err))
  }
});


./serviceworker.js File

Finally, we can create our service worker file in the project’s root. Let’s create a basic service worker setup with an oninstall handler that caches files, and an onfetch handler that returns responses from the cache when they exist and otherwise makes a regular network request:

// 1.) Create an event listener for the install event:
self.addEventListener('install', (event) => {
  // 2.) Use the `ExtendableEvent.waitUntil()` method to do some caching during installation:
  event.waitUntil(
    // 3.) Open a cache (or create if non-existing) and give it a version name:
    caches.open('v1').then((cache) => {
      // 4.) Use `cache.addAll([...])` method and pass it an array of 
      //     origin-relative URLs of the resources to cache:
      return cache.addAll([
        './index.html',
        './404.html',
        './sitemap.xml'
        // More items to cache ...
      ]);
    })
  );
});

// 6.) Create a fetch event-listener to intercept any network requests:
self.addEventListener('fetch', (event) => {
  // 7.) Use `event.respondWith()` to "hijack" the network requests
  event.respondWith(
    // 8.) Check the cache for a key matching the request
    caches.match(event.request).then((resp) => {
      // 9.) Return the cached item if it exists, otherwise use a normal fetch() call to get the request from the network
      return resp || fetch(event.request).then((response) => {
        // 10.) After a fetch() call, for any items not in the cache, open the cache and add the missing item
        return caches.open('v1').then((cache) => {
          cache.put(event.request, response.clone());
          // 11.) Finally, hand off the response to be used by the document
          return response;
        });
      });
    })
  );
});
Putting it all together, we can combine the basic setup above with the hash.json fetch from earlier, so that the install handler caches the hashed bundle filenames:

// 1.) Create an event listener for the install event:
self.addEventListener('install', (event) => {
  // 2.) Use the `ExtendableEvent.waitUntil()` method to fetch the hash and cache assets during installation:
  event.waitUntil(
    fetch('/hash.json').then(response => {
      if (!response.ok) {
        throw new Error(`HTTP error! status: ${response.status}`);
      }
      return response.json();
    })
    .then((obj) => {
      const hash = obj.hash;
      console.log('Hash log 1: ' + hash);
      return caches.open('v1').then((cache) => { // Return this promise so `waitUntil()` waits for caching to finish
        return cache.addAll([
          './index.html',
          './404.html',
          `./assets/js/dist/main.${hash}.js`,
          `./assets/js/dist/main.${hash}.css`,
          `./assets/js/dist/29.${hash}.js`,
          `./assets/js/dist/314.${hash}.js`,
          `./assets/js/dist/671.${hash}.js`,
          `./assets/js/dist/909.${hash}.js`,
        ]);
      });
    })
    .catch((err) => {
      console.error(`Error in service worker's pre-install tasks: \n${err}`, err);
    })
  );
});

self.addEventListener('fetch', (event) => {
  event.respondWith(
    caches.match(event.request).then((resp) => {
      return resp || fetch(event.request).then((response) => {
        return caches.open('v1').then((cache) => {
          cache.put(event.request, response.clone());
          return response;
        });
      });
    })
  );
});

Updating the App (and the Service Worker)


In the Progressive Web App (PWA) architecture, service workers are used to serve the app cache-first. This gives the website or app near instantaneous load times on subsequent loads.

If any cached resource in the app/website has changed, the cache needs to be updated to reflect this change. Otherwise, you will always see the old version served from the cache.

To update the cache you will need to delete the old one. However, you will want the new version of the service worker and cache to be ready for installation before we do any deletion. Once the old cache is deleted, the new one can begin populating immediately.

By design, a new service worker will not activate until all tabs/windows using the old version of the worker are closed. The new version installs (and pre-caches) in the background as soon as it is registered, then waits; once the old clients are all closed, it activates and takes over.

Use the activate event and event.waitUntil() to manipulate the caches during service worker activation.

The cacheKeeplist constant below holds an array of cache versions to keep:

self.addEventListener('activate', (event) => {
  const cacheKeeplist = ['v2']; // Caches that should NOT be deleted

  event.waitUntil(
    caches.keys().then((keyList) => {
      return Promise.all(keyList.map((key) => {
        if (cacheKeeplist.indexOf(key) === -1) {
          return caches.delete(key);
        }
      }));
    })
  );
});