How to test gatsby-node


This article will be enough to get you started with unit testing gatsby-node.js; we are not going to deep-dive into every possible test you might need to make.

First of all, you need to ask yourself "should you test it?". What other tests do you have in place that cover this code? Do you have Visual Regression or End to End tests?

Even if you do have those in place, you may still decide that unit tests are worth writing because they increase your speed and confidence when making changes.

Setup

There's a complete gatsby testing setup guide in the gatsby documentation; for our purposes, all we need is:

  1. Jest
yarn add -D jest
  2. And node_modules ignored
// jest.config.js
module.exports = {
  testPathIgnorePatterns: [`node_modules`, `\\.cache`, `<rootDir>.*/public`],
}

gatsby-node tests

This is from a gatsby app that creates an e-book website; each chapter needs a page, so we use the createPages API:

const path = require(`path`)
const { createFilePath } = require(`gatsby-source-filesystem`)

exports.createPages = async ({ graphql, actions }) => {
  const { createPage } = actions

  const ChapterTemplate = path.resolve(`./src/templates/chapter.js`)
  const result = await graphql(
    `
      {
        allMdx(sort: { fields: frontmatter___section, order: ASC }) {
          chapters: group(field: frontmatter___chapter) {
            number: fieldValue
            sections: nodes {
              frontmatter {
                chapter
                section
              }
            }
          }
        }
      }
    `
  )

  if (result.errors) {
    throw result.errors
  }

  const chapters = result.data.allMdx.chapters

  chapters.forEach(chapter => {
    createPage({
      path: `chapter-${chapter.number}`,
      component: ChapterTemplate,
      context: {
        chapter: parseInt(chapter.number, 10),
      },
    })
  })
}
// ...

We are testing the "happy" path here and ignoring the error cases, which might be enough for a gatsby site because your build will fail if errors occur.
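If you did want to cover the error branch too, the test would look something like the following. This is a minimal sketch that runs in plain Node: `createPages` here is a simplified, inlined stand-in for the gatsby-node export above, and the mocks are hand-rolled (a real Jest suite would use `jest.fn()` and `expect(...).rejects` instead).

```javascript
// Simplified stand-in for the createPages export above, inlined so this
// example is self-contained. It throws when the graphql result has errors.
const createPages = async ({ graphql, actions }) => {
  const result = await graphql(
    `{ allMdx { chapters: group(field: frontmatter___chapter) { number: fieldValue } } }`
  )
  if (result.errors) {
    throw result.errors
  }
  result.data.allMdx.chapters.forEach(chapter => {
    actions.createPage({ path: `chapter-${chapter.number}` })
  })
}

// A graphql mock that simulates a failed query.
const failingGraphql = () => Promise.resolve({ errors: [new Error('bad query')] })

// A hand-rolled recording mock for createPage.
const createPage = (...args) => {
  createPage.calls.push(args)
}
createPage.calls = []

// createPages should reject, and createPage should never be called.
const errorPathTest = createPages({
  graphql: failingGraphql,
  actions: { createPage },
}).then(
  () => 'resolved',
  () => 'rejected'
)
```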

Read over the tests, then we will talk about anything unique:

const { createPages } = require('../gatsby-node')
const path = require('path')

jest.mock('path', () => ({
  resolve: jest.fn(),
}))

describe('testing gatsby-node', () => {
  describe('given data', () => {
    const data = {
      allMdx: {
        chapters: [
          {
            number: 1,
            sections: [{ section: 1 }, { section: 2 }, { section: 3 }],
          },
          {
            number: 2,
            sections: [{ section: 1 }, { section: 2 }, { section: 3 }],
          },
        ],
      },
    }

    describe('when querying for chapters', () => {
      const graphql = jest.fn(() =>
        Promise.resolve({
          data,
        })
      )
      const createPage = jest.fn()
      const ChapterComponent = () => {}

      path.resolve.mockImplementation(() => ChapterComponent)

      // Return the promise so Jest waits for the async createPages call to settle
      beforeEach(() => createPages({ graphql, actions: { createPage } }))

      it('then runs a graphql query', () => {
        expect(graphql.mock.calls[0][0]).toMatchSnapshot()
      })

      it('then creates a page for each chapter', () => {
        const chapters = [1, 2]
        chapters.forEach(chapter => {
          expect(createPage).toHaveBeenCalledWith({
            component: ChapterComponent,
            context: {
              chapter,
            },
            path: `chapter-${chapter}`,
          })
        })
      })
    })
  })
})

We are opting for a unit test instead of an integration test by mocking the path.resolve() dependency, and the graphql() and actions.createPage() parameters.

To test the graphql query we used toMatchSnapshot(), which is borderline abuse of the tool, but I think it's OK for small use-cases like this.

We mocked the actions.createPage() function, and then asserted that it was called with the expected parameters.
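Under the hood, jest.fn() is just a function that records the arguments of every call; matchers like toHaveBeenCalledWith() then search those recorded calls for a match. A hand-rolled sketch of that mechanism (the names here are illustrative, not Jest internals):

```javascript
// A minimal recording mock: every call's arguments are pushed onto `calls`,
// which is what assertions like toHaveBeenCalledWith inspect in Jest.
const makeMock = () => {
  const fn = (...args) => {
    fn.calls.push(args)
  }
  fn.calls = []
  return fn
}

const createPage = makeMock()
createPage({ path: 'chapter-1', context: { chapter: 1 } })
createPage({ path: 'chapter-2', context: { chapter: 2 } })

// The equivalent of expect(createPage).toHaveBeenCalledWith(pageArgs)
// is searching the recorded calls for a matching first argument:
const calledWithChapter1 = createPage.calls.some(
  args => args[0].path === 'chapter-1'
)
```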

Beyond unit testing

Focusing exclusively on unit testing leaves us vulnerable to false-positive tests when upgrading the gatsby package or other dependencies. To assist with upgrades you could make use of visual regression tools like Percy, which are a better indicator of the whole website rendering as expected.
