Victor is a full stack software engineer who loves travelling and building things. He most recently created Ewolo, a cross-platform workout logger.
Building a simple static site generator using Node.js

A colleague of mine was recently looking into setting up a blog and asked me for recommendations. After doing a bit of research into static site generators and blog engines, I decided that Hugo would be a great choice. However, my colleague also had a couple of requirements, such as custom urls for blog posts and a custom css theme. While all of this is possible with Hugo, I decided to skip the learning curve and see if I could put together a really simple static site generator, given that she already had the html ready to go and had no problem writing her blog posts in html.

The static site generator script turned out to be about 100 lines with hardly any magic. The code and a sample blog can be found here. Note that GitLab provides free hosting for static pages and also comes with a CI/CD feature which allows you to compile your pages before deployment.

The following is a tutorial on setting up your own static site generator using Node.js >= 8.11.x. Let us first set up the project:


npm init
npm i --save-exact bluebird chokidar fs-extra mustache
mkdir src
mkdir public

The first order of business is to ask the question: why exactly does one need a static site generator? The answer is that you don't, really. If all you are doing is running a low-traffic blog, you could simply create your html pages by hand and publish them. In fact, this is how most web publishing was done for the longest time, before the rise of server-side programming. However, once you have a few pages and some content, it gets tedious to change sections that are common across all pages, such as the footer. It would therefore be ideal to have a simple templating engine that lets us split out common content and insert it where required.
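
To get a feel for what that looks like, here is a tiny sketch using Mustache (one of the packages installed above). The page and footer strings are just placeholders; later on the script will load pages and partials from files instead of hard-coding them:


const Mustache = require("mustache");

// a shared section, defined once...
const partials = {
  footer: "<footer><p>About | Contact</p></footer>"
};

// ...and a page that pulls it in wherever {{> footer}} appears
const page = "<h1>My first post</h1>\n{{> footer}}";

// prints the page with the footer content inserted in place of the partial tag
console.log(Mustache.render(page, {}, partials));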

Before we get to templating engines, let us first set up our website. For the moment, we will create 2 folders under the project root: src (where our current website lives) and public (which will contain the generated website). For our initial attempt, we will simply copy the contents of src over to public. Create the following index.js under your project root:


const Promise = require("bluebird");
const fse = require("fs-extra");

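// main() is called via a resolved promise below; main is defined further down,
// which is fine because this callback only runs after the whole file has been evaluated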
Promise.resolve().then(async () => {
  await main();
});

const main = async () => {
  await generateSite();
};

const generateSite = async () => {
  await copyAssets();
};

const copyAssets = async () => {
  await fse.emptyDir("public");
  await fse.copy("src", "public");
};

Run this script via node index.js and bask in the glory that is programming.

Congratulations! You are a backend developer now.

As a second step, we will add a file watcher so that any change within our src folder regenerates the website. Since this will be a blog with 500-1000 files in total (assuming 100 blog entries), we can afford to regenerate the entire website on any change:


const chokidar = require("chokidar");

const main = async () => {
  await generateSite();
  watchFiles();
};

const watchFiles = () => {
  const watcher = chokidar.watch(
    [
      "src"
    ],
    {
      ignored: /(^|[\/\\])\../, // ignore dotfiles; chokidar watches the src folder recursively by default
      ignoreInitial: false,
      persistent: true
    }
  );

  watcher.on("change", async path => {
    console.log("changed " + path + ", recompiling");
    await generateSite();
  });

  // catch ctrl+c and close the watcher so that the process can exit cleanly
  process.on("SIGINT", function() {
    watcher.close();
  });
};

It should now be clear why the initial version wrapped everything in a function called generateSite. We can start our static site generator via node index.js, and if we edit any file under src, the changes should be reflected under public. At this point, we will also add an environment variable to differentiate between development and production modes: in development mode we watch for changes and regenerate the website, while in production mode we simply generate it once:


const env = process.env.NODE_ENV || "dev";

const main = async () => {
  console.log("Running app in " + env);
  await generateSite();

  if (env === "dev") {
    watchFiles();
  }
};

We can run the above in production mode via export NODE_ENV=prod && node index.js (on Windows, set NODE_ENV=prod && node index.js). Note that watching the source directory for changes and recompiling is not strictly necessary; you could always skip this step and simply run the script every time you make changes, but programming is all about avoiding repetitive tasks.

Interestingly, we are almost done! All that really remains is to come back to the original reason for building a static site generator in the first place: templating. We will use Mustache.js for templating, mostly because it is the simplest option and our needs are not complicated. Let us create a folder src/partials which will hold our common sections. We then modify our website structure slightly so that all our pages now live under src/pages. All that is left to do is to load the partials, load the pages and render them using Mustache:


const fs = require("fs");
const path = require("path");
const Mustache = require("mustache");

// bluebird's promisifyAll adds the readdirAsync/readFileAsync/writeFileAsync
// variants of the callback-based fs functions used below
Promise.promisifyAll(fs);

const generateSite = async () => {
  await copyAssets();
  await buildContent();
};

const buildContent = async () => {
  const pages = await compilePages();
  await writePages(pages);
};

const compilePages = async () => {
  const partials = await loadPartials();

  const result = {};
  const pagesDir = path.join("src", "pages");
  const fileNames = await fs.readdirAsync(pagesDir);
  for (const fileName of fileNames) {
    const name = path.parse(fileName).name;
    const fileContent = await fs.readFileAsync(path.join(pagesDir, fileName));
    result[name] = Mustache.render(fileContent.toString(), {}, partials);
  }
  return result;
};

const loadPartials = async () => {
  const result = {};
  const partialsDir = path.join("src", "partials");
  const fileNames = await fs.readdirAsync(partialsDir);
  for (const fileName of fileNames) {
    const name = path.parse(fileName).name;
    const content = await fs.readFileAsync(path.join(partialsDir, fileName));
    result[name] = content.toString();
  }
  return result;
};

const writePages = async pages => {
  for (const page of Object.keys(pages)) {
    await fs.writeFileAsync(path.join("public", page + ".html"), pages[page]);
  }
};
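
For reference, a page and a partial could look something like the following. The file contents here are purely illustrative; the sample blog in the repository has its own pages and partials. Note that a partial is referenced by its file name without the extension, so src/partials/footer.html becomes {{> footer}}:


<!-- src/partials/footer.html (illustrative content only) -->
<footer>
  <p>Powered by a tiny static site generator.</p>
</footer>

<!-- src/pages/index.html (illustrative content only) -->
<html>
  <body>
    <h1>Hello, world!</h1>
    {{> footer}}
  </body>
</html>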

To see the final version, check out the Software Dawg project. There are a few minor differences from this tutorial:

  • The script itself is under the src folder.
  • It is slightly over 100 lines (130 at the time of writing), mostly due to clean code practices, e.g. constants instead of strings for folder paths.
  • Instead of copying over the entire src folder, the script only copies necessary assets such as css, images, etc.
  • The project also uses node-sass to compile the template css. This dependency is however not required as the compiled css is also checked into git.

As a bonus, you could also install the browser-sync package globally and run it via the provided npm run live-reload command so that your browser automatically refreshes on any change. Note that this does not work out that well on Windows, unfortunately, because we regenerate the entire site on any change.

GitLab provides static website hosting for free and all that is needed is a .gitlab-ci.yml configuration file. What is really incredible is that you can define the build process, which means that in our case we can generate the website before deployment! See here for more details on this feature.
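
For the curious, a minimal .gitlab-ci.yml for this kind of setup might look roughly like the sketch below; the actual file in the sample repository may differ. The important parts are that the job is named pages and that the public folder is published as an artifact:


# a sketch only, not necessarily the sample repository's exact configuration
image: node:8

pages:
  script:
    - npm install
    - NODE_ENV=prod node index.js
  artifacts:
    paths:
      - public
  only:
    - master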

This concludes the tutorial. My colleague is extremely satisfied with her current solution, as it is ultra flexible and allows her to customise it to her liking. She is currently looking into creating custom paths for blog posts :)