"Correctly" Unit Testing Express With Rescript And Zora
Introduction
This is part of an ongoing series about the Rescript programming language, and the second in a miniseries about unit testing using the Zora test framework. I’ve been taking notes as I build a toy “Recipe Book” app to explore the Rescript ecosystem.
Most recently, I added some unit tests, but I haven’t tested any express endpoints yet. Now is a good time to rant about “unit” testing express endpoints!
Patreon
This series takes a lot of time and dedication to write and maintain. The main thing that has kept me invested in writing this series is support from my Patrons. If you appreciate the work, a little monetary feedback would be most welcome!
Other ways to show support include sharing the articles on social media, commenting on them here, or a quick thank you on the Rescript forum.
Other articles in series
With over a dozen articles and counting, I’ve created a table of contents listing all the articles in this series in suggested reading order.
A short rant on “unit” testing
In researching what libraries were best used for testing an express application, the first Google result is How to correctly unit test Express server.
I don’t generally feel comfortable picking on specific posts on the Internet, but as the top hit, I think this one deserves a bit of a callout.
Unit testing has several defining aspects, but one of the main ones is to only test one thing at a time. The problem with the above article and every other express testing article I could find is that it spins up an active HTTP server for every test.
This is ridiculous. Your unit tests should not fail if the port is in use. They should not fail if some other script hits that endpoint while the tests are running. They should not fail if you are depending on a buggy version of express.
Yes, these things should be tested, but those tests happen at a different level; integration or end-to-end testing, for example. If the linked article were titled “How to correctly integration test…” or even “How to correctly test…”, I wouldn’t feel so agitated about it. But if you are writing true unit tests, you are not spinning up an entire express server and listening on a port to do so.
For me, the most important thing they should not do is waste time spinning up an entire server and binding to a port. If you've been following this series, you know how much I love Rescript's (and Zora's) speed. I don't want to lose that instant feedback loop by restarting express on every test change.
The worst part is that I could find only one project on the entire internet that mocks Express correctly. It only has 11 stars on GitHub and hasn't been updated in over a year.
Now, Express is kind of dated and the cool kids are apparently talking about fastifyjs these days. It appears that they get it right, so maybe a port of this entire project with bindings to fastify is in order.
Getting our bearings
This article will add tests to the express portion of rescript-express-recipes. You can git checkout the zora branch if you want to start from the same place I am in this article.
If you want to follow along with the changes in this article, I try to make a separate commit for each section. You can find them in the zora-express branch.
Another short rant (this time on nodejs)
So this is kind of hilarious and/or embarrassing. After getting the test suite up and running in the previous article, I never ran the actual app itself.
Turns out, when I switched the package type to es6 modules (in package.json and bsconfig.json), it broke the import on the Express package. The whole commonJS and ES Modules thing is an absolute nightmare in the node ecosystem.
The summary is that you have to dance backwards in a circle around a pentagram an unknown number of times in order to make commonJS modules and es6 modules work together, and even then you're pushing your luck.
Rescript makes this even trickier, because it typically needs to rebuild all of your dependencies in order to type check correctly (rescript build --with-deps). In my case, when I rebuilt it in es6 module mode, it also rebuilt the bs-express dependency in es6 module mode. The recompiled bs-express is now importing the wrong thing from nodejs, and the result is the following error from node:
TypeError: Express is not a function
at Module.express (file:///Users/dustyphillips/Desktop/Code/rescript-express-recipes/node_modules/bs-express/src/Express.mjs:746:10)
at file:///Users/dustyphillips/Desktop/Code/rescript-express-recipes/src/Index.mjs:12:19
at ModuleJob.run (node:internal/modules/esm/module_job:183:25)
at async Loader.import (node:internal/modules/esm/loader:178:24)
at async Object.loadESM (node:internal/process/esm_loader:68:5)
at async handleMainPromise (node:internal/modules/run_main:63:12)
I'm not certain what the solution to this problem is, and, frankly, I don't care. Maybe in the future I'll sort it out and write another article, but today, I'm going to stop ranting and get down to the business of testing express apps (that don't work).
Bait and Switch
I’m not going to use the mock express library, or indeed, any express testing library. Instead, I’ll assume that express is well tested and does the right thing. Most of this article is really about refactoring my express server so that the business logic (mostly decoding and encoding json and submitting actions to the reducer) is decoupled from the Express hookups.
Right now, all of our express endpoints look something like this:
App.get(
app,
~path="/",
Middleware.from((_, _, res) => {
let result = Js.Dict.empty()
result->Js.Dict.set("Hello", "World"->Js.Json.string)
let json = result->Js.Json.object_
res->Response.status(Response.StatusCode.Ok)->Response.sendJson(json)
}),
)
If we look closely at the endpoint, there are several things that are not part of our code, and therefore don't require unit testing, at least not by us (presumably the good folks that maintain the express package are already doing so):
- App.get
- Middleware.from
- Response.sendJson
However, the way the endpoint is written, it’s not obvious how to test just the business logic. The easiest solution is to actually refactor the code so the logic under test is in its own function:
let helloWorld = () => {
let result = Js.Dict.empty()
result->Js.Dict.set("Hello", "World"->Js.Json.string)
result->Js.Json.object_
}
App.get(
app,
~path="/",
Middleware.from((_, _, res) => {
let json = helloWorld()
res->Response.status(Response.StatusCode.Ok)->Response.sendJson(json)
}),
)
Now, you might be asking yourself, “But what if there is a typo in the glue code that connects express to the business logic?”
That can happen, of course, and if your tests do not run the endpoint, then it won't get caught until you push it to production. However, this isn't nearly so sinister in Rescript as it is in Javascript. The compiler will check your glue code, and will catch most of the obvious errors or typos. It won't catch things like "I used App.get when I should have used App.post", but it will catch the bulk of issues.
And, as I said before, yes, you should have integration tests that run against a real server. They just shouldn't be part of your unit testing setup.
Refactoring to better support testing
Now that I've started thinking about splitting out functions like this, I think it makes more sense for the business logic under test to go in its own file.
Aside from the obvious organizational improvements, there is also a technical reason to make this change. In my current version, Index.res is the main entry point for node to start the express server. This means that if we wanted to import the endpoints into a unit test, the server would get started as an undesirable side-effect.
So instead, I'm going to split Index.res into two files: one for setting up the server and endpoints, and one for modelling the JSON and calling the actions in the Store.
This is starting to look like the model view controller pattern if you're into that kind of thing. The model is in Store.res, the view is the collection of express endpoints, and the controller is the business logic. This isn't a wholly accurate representation since much of the business logic happens in the reducer in Store, but it's close enough.
Create a new Controller.res file that we can move the serializing logic into.
The key point is keeping this loosely coupled. Controller.res should never have to use any Express types, and the Index.res file shouldn't need to use anything from the Js.Json package or our own Store module.
I'm not going to break down all the changes, but here is the new addRecipe controller:
let addRecipe = body => {
let jsonFields =
body
->Belt.Option.flatMap(Js.Json.decodeObject)
->Belt.Option.map(jsonBody => (
jsonBody->Js.Dict.get("title")->Belt.Option.flatMap(Js.Json.decodeString),
jsonBody->Js.Dict.get("ingredients")->Belt.Option.flatMap(Js.Json.decodeString),
jsonBody->Js.Dict.get("instructions")->Belt.Option.flatMap(Js.Json.decodeString),
))
let jsonResponse = Js.Dict.empty()
switch jsonFields {
| Some(Some(title), Some(ingredients), Some(instructions)) => {
open Store.Reducer
let id = Store.uuid()
dispatch(
AddRecipe({id: id, title: title, ingredients: ingredients, instructions: instructions}),
)
jsonResponse->Js.Dict.set("id", id->Js.Json.string)
}
| _ => jsonResponse->Js.Dict.set("error", "missing attribute"->Js.Json.string)
}
jsonResponse->Js.Json.object_
}
In addition to the obvious cut-pasting, the one subtle change was to add the jsonResponse->Js.Json.object_ call at the end of the addRecipe function, which returns the response.
The /addRecipe endpoint in Index.res has now been reduced to:
App.post(
app,
~path="/addRecipe",
Middleware.from((_next, req, res) => {
let jsonResponse = req->Request.bodyJSON->Controller.addRecipe
res->Response.sendJson(jsonResponse)
}),
)
The rest of the endpoints in Index.res look virtually identical to this, so I won't show all of them. The controllers themselves are a little more complicated. Here's the addTagToRecipe controller to be called from the /addTagToRecipe endpoint:
let addTagToRecipe = body => {
open Belt
open Store.Reducer
let jsonResponse = Js.Dict.empty()
let jsonFields =
body
->Option.flatMap(Js.Json.decodeObject)
->Option.map(jsonBody => (
jsonBody
->Js.Dict.get("recipeId")
->Option.flatMap(Js.Json.decodeString)
->Option.flatMap(id => getState().recipes->Map.String.get(id)),
jsonBody->Js.Dict.get("tag")->Option.flatMap(Js.Json.decodeString),
))
switch jsonFields {
| Some(Some(recipe), Some(tag)) => {
jsonResponse->Js.Dict.set("success", true->Js.Json.boolean)
dispatch(AddTag({recipeId: recipe.id, tag: tag}))
}
| _ => jsonResponse->Js.Dict.set("error", "invalid request"->Js.Json.string)
}
jsonResponse->Js.Json.object_
}
getRecipe is a little different because it needs to access the Request route parameters (submitted as a Js.Dict) instead of the body:
let getRecipe = params => {
open Belt
let jsonResponse = Js.Dict.empty()
let state = Store.Reducer.getState()
let recipeOption =
params
->Js.Dict.get("id")
->Option.flatMap(Js.Json.decodeString)
->Option.flatMap(id => state.recipes->Map.String.get(id))
switch recipeOption {
| None => jsonResponse->Js.Dict.set("error", "unable to find that recipe"->Js.Json.string)
| Some(recipe) => {
jsonResponse->Js.Dict.set("id", recipe.id->Js.Json.string)
jsonResponse->Js.Dict.set("title", recipe.title->Js.Json.string)
jsonResponse->Js.Dict.set("ingredients", recipe.ingredients->Js.Json.string)
jsonResponse->Js.Dict.set("instructions", recipe.instructions->Js.Json.string)
jsonResponse->Js.Dict.set("tags", recipe.tags->Js.Json.stringArray)
}
}
jsonResponse->Js.Json.object_
}
The endpoint for this one is a tiny bit different from the others, as it passes the request through Request.params instead of through Request.bodyJSON:
App.get(
app,
~path="/recipes/:id",
Middleware.from((_next, req, res) => {
let jsonResponse = Controller.getRecipe(req->Request.params)
res->Response.sendJson(jsonResponse)
}),
)
The /allTags endpoint is so simple that I decided not to add the overhead of moving it to the controller. I am comfortable trusting that, between Rescript's type safety and my visual scan of the function, it doesn't require additional testing. Of course, I may be wrong, but if so, I can add the test when it becomes obvious that it is necessary.
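For reference, such an endpoint might look something like the sketch below. This is reconstructed from the Store shape used elsewhere in the article rather than copied from the repository, and it assumes the keys of the tags map are the tag names:
// A guess at the untouched /allTags endpoint living in Index.res; the real code
// may shape its response differently.
App.get(
  app,
  ~path="/allTags",
  Middleware.from((_next, _req, res) => {
    let jsonResponse = Js.Dict.empty()
    let tags = Store.Reducer.getState().tags->Belt.Map.String.keysToArray
    jsonResponse->Js.Dict.set("tags", tags->Js.Json.stringArray)
    res->Response.sendJson(jsonResponse->Js.Json.object_)
  }),
)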
Finally, the getTag endpoint uses Request.params, so it's similar to getRecipe above. This is the most complicated endpoint in the file, but the changes to migrate the business logic to a function in Controller.res are the same:
let getTag = params => {
open Belt
let jsonResponse = Js.Dict.empty()
let state = Store.Reducer.getState()
let taggedRecipesOption =
params
->Js.Dict.get("tag")
->Option.flatMap(Js.Json.decodeString)
->Option.flatMap(tag => state.tags->Map.String.get(tag))
switch taggedRecipesOption {
| None => jsonResponse->Js.Dict.set("error", "tag not found"->Js.Json.string)
| Some(taggedRecipes) => {
let recipes =
taggedRecipes.recipes
->Array.map(id => {
state.recipes
->Map.String.get(id)
->Option.map(recipe => {
let dict = Js.Dict.empty()
dict->Js.Dict.set("id", id->Js.Json.string)
dict->Js.Dict.set("title", recipe.title->Js.Json.string)
dict
})
})
->Array.keep(value => value->Option.isSome)
->Array.map(opt => opt->Option.getUnsafe->Js.Json.object_)
->Js.Json.array
jsonResponse->Js.Dict.set("recipes", recipes)
}
}
jsonResponse->Js.Json.object_
}
And that’s it for the refactoring. Overall, I think this separation of concerns makes a lot of sense. However, I will point out that all this file is doing is converting JSON to object types in the store and back again. It would be much simpler to convert it to use something like rescript-jzon. Spoiler alert: This is the subject of my next article!
Dealing with global state
There's nothing terribly magical about these controllers; you pass in some json and get out some other json. However, the controllers are also invoking actions in and reading state from the Store, and this is problematic.
If we write a test that changes the store, that state will still be in the store when the next test runs. This can actually be considered a good thing as you can write a series of assertions about the state as it changes over time. However, it also means that tests cannot be executed in parallel because they will end up conflicting with each other.
I puzzled over this for quite a while and decided there are four ways we could address the issue, each with various advantages and disadvantages:
- Don’t test this code at all. In some cases, this is a reasonable solution. Pragmatism is important, and sometimes it’s just not worth the number of hours it takes to test a certain thing.
- Use a mocking library such as sinon or testdouble so we can focus on the implementation of Controller.res and mock out Store.res. In my experience, this kind of mocking quickly becomes very painful as you struggle to maintain two parallel implementations. Worse, I'm not sure I could find an elegant way to bind Rescript to mocks. Rescript tends to enforce types on things, and mocking's sole purpose is to subvert the underlying types.
- Localize the state. One way to do this would be to store the state with the app and pass it into the various controller methods alongside the body (see the sketch after this list). All things being equal, I would say this is the superior solution. However, it involves yet another refactor that I don't have space for in this article.
- Run the tests for the controller synchronously so we don't have to worry about independent tests making conflicting changes while everything is in flight. This solution makes me sad because I love the idea of parallel tests being lightning fast. However, it does have a couple of advantages. For one thing, I don't have to further modify the code under inspection in order to test it. More importantly, it allows me to write tests that are, perhaps, a little more future-friendly. The Store.Reducer would never make it into a production system, after all; it would need to be backed by a database of some sort. Because I don't know how that database will look yet, it doesn't make a lot of sense trying to mock it.
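For the curious, here is a rough sketch of what the "localize the state" option could look like; nothing like this exists in the repository, and it assumes Store.state is the record type returned by Store.Reducer.getState:
// Sketch only (not implemented in this article): the controller receives the state
// as an argument instead of reading the global Store, so each test could build its
// own fresh state and run in parallel safely.
let getRecipeWithState = (state: Store.state, params) => {
  open Belt
  let jsonResponse = Js.Dict.empty()
  let recipeOption =
    params
    ->Js.Dict.get("id")
    ->Option.flatMap(Js.Json.decodeString)
    ->Option.flatMap(id => state.recipes->Map.String.get(id))
  switch recipeOption {
  | None => jsonResponse->Js.Dict.set("error", "unable to find that recipe"->Js.Json.string)
  | Some(recipe) => jsonResponse->Js.Dict.set("id", recipe.id->Js.Json.string)
  }
  jsonResponse->Js.Json.object_
}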
I’m choosing that last option because time is my most valuable resource at the moment (I have a lot more articles to write!).
The summary is that we'll use t->block instead of t->test for our tests and test a sequence of actions instead of one unit at a time.
Clearing the state
I want a way to reset the state in the Store so that my tests can all start from a known situation, even if they aren’t able to run in parallel.
To do this, I added a dangerousResetState method to my UseReducer module type and Reducer module in Store.res:
module type UseReducer = {
let getState: unit => state
let dispatch: action => unit
let dangerousResetState: unit => unit
}
module Reducer: UseReducer = {
let currentState = ref(initialState)
let getState = () => currentState.contents
let dispatch = (action: action) => {
currentState.contents = reducer(currentState.contents, action)
}
let dangerousResetState = () => {
currentState.contents = initialState
}
}
Writing some actual tests
Let's start with a new file named TestController.test.res. The .test.res suffix is necessary so that our onchange and pta invocations in package.json can pick the tests up.
I always write a warmup unit test in any fresh test file before I try to figure out the code I actually want to test:
open Zora
zoraBlock("Test endpoints", t => {
t->block("first test", t => {
t->ok(true, "It should make a test")
})
})
Now, the work in our unit test is mostly to construct and decode json objects that are in the format we expect. This feels pretty clumsy using Js.Json (json always feels bulky in typed languages). I toyed with trying to use %raw tags to make it neater, but then I realized I can just use strings and parse and encode them as needed. Here's my first test, which adds a recipe and then gets it back by ID to make sure the values are as expected:
t->block("The Happy Path", t => {
let body = Some(
Js.Json.parseExn(`
{
"title": "Bread",
"ingredients": "Flour and Water",
"instructions": "Mix and Bake"
}
`),
)
let result = body->Controller.addRecipe
let id =
result
->Js.Json.decodeObject
->Belt.Option.getUnsafe
->Js.Dict.get("id")
->Belt.Option.getUnsafe
->Js.Json.decodeString
->Belt.Option.getUnsafe
t->equal(id->Js.String2.length, 36, "The id should be the length of a uuid")
let params = Js.Dict.empty()
params->Js.Dict.set("id", id->Js.Json.string)
let result = params->Controller.getRecipe
let json = result->Js.Json.stringifyAny->Belt.Option.getUnsafe
let expected = `{"id":"${id}","title":"Bread","ingredients":"Flour and Water","instructions":"Mix and Bake","tags":[]}`
t->equal(json, expected, "get recipe should match input")
})
The massive pipeline to extract a string id from the addRecipe result is the bulk of the method! If it comes up again, I'll extract this into its own function, but for now it can stay where it is.
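If it does come up again, a helper along these lines would do the trick; getStringField is a hypothetical name and not something that exists in the repository:
// Hypothetical helper: pull a string field out of a Js.Json.t object, returning
// None if the json isn't an object, the key is missing, or the value isn't a string.
let getStringField = (json: Js.Json.t, key: string): option<string> =>
  json
  ->Js.Json.decodeObject
  ->Belt.Option.flatMap(dict => dict->Js.Dict.get(key))
  ->Belt.Option.flatMap(Js.Json.decodeString)
With it, the extraction above would collapse to result->getStringField("id")->Belt.Option.getUnsafe.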
We can add some extra setup and assertions to the happy path test to test the add and get tag features. Once again, the bulk of the work will be in building and dismantling json:
let body = Some(
Js.Json.parseExn(
`
{
"recipeId": "${id}",
"tag": "Carbs"
}
`,
),
)
let result = body->Controller.addTagToRecipe
let json =
result
->Js.Json.decodeObject
->Belt.Option.getUnsafe
->Js.Json.stringifyAny
->Belt.Option.getUnsafe
let expected = `{"success":true}`
t->equal(json, expected, "addTagToRecipe should return success")
let result = params->Controller.getRecipe
let json = result->Js.Json.stringifyAny->Belt.Option.getUnsafe
let expected = `{"id":"${id}","title":"Bread","ingredients":"Flour and Water","instructions":"Mix and Bake","tags":["Carbs"]}`
t->equal(json, expected, "get recipe should match input")
let params = Js.Dict.empty()
params->Js.Dict.set("tag", "Carbs"->Js.Json.string)
let result = params->Controller.getTag
let json =
result
->Js.Json.decodeObject
->Belt.Option.getUnsafe
->Js.Json.stringifyAny
->Belt.Option.getUnsafe
let expected = `{"recipes":[{"id":"${id}","title":"Bread"}]}`
t->equal(json, expected, "tag should now have recipes")
I put this inside the original t->block. That block is getting pretty long now, and I would rather split it up into several more descriptive t->block calls. However, there's a bit of a hiccup: I would need access to the id inside the next blocking test, but Rescript doesn't expose it outside the block scope. I thought I could rescue the situation by nesting t->block calls inside the test, but I found it wasn't any more readable that way, so I left it as is.
Testing the sad paths
I first heard the term "happy path" in an article by Kent C. Dodds. The basic idea is that you first test the situation where everything goes as it should. But you still need to check the edge cases. I ran npm run test:coverage and saw that nearly 82% of the output Controller.bs.js was covered (one of the drawbacks of Zora and Rescript is that Rescript doesn't have source maps, so you can't see the coverage of the original file).
82% isn't bad for a first pass, but if you run npm run test:htmlcoverage instead and navigate the html report, you'll see that none of the negative cases have been covered.
We can test these in a new test block. Before we initialize the block, though, we should reset the state:
Store.Reducer.dangerousResetState()
t->block("addRecipe missing attribute", t => {
let body = Some(Js.Json.parseExn(`{}`))
let result = body->Controller.addRecipe
let json = result->Js.Json.stringifyAny->Belt.Option.getUnsafe
let expected = `{"error":"missing attribute"}`
t->equal(json, expected, "There should be missing attributes")
Js.log(json)
})
This test checks what happens when the body is valid json, but doesn’t have the correct attributes. I’ll leave adding similar tests for the other endpoints as an exercise.
We can also test what happens, for example, when we try to add a tag to a recipe that doesn’t exist:
t->block("can't add tag to nonexistent recipe", t => {
let body = Some(
Js.Json.parseExn(`
{
"recipeId": "Not a Recipe",
"tag": "Carbs"
}
`),
)
let result = body->Controller.addTagToRecipe
let json =
result
->Js.Json.decodeObject
->Belt.Option.getUnsafe
->Js.Json.stringifyAny
->Belt.Option.getUnsafe
let expected = `{"error":"invalid request"}`
t->equal(json, expected, "addTagToRecipe should return success")
})
Or if we request a recipe that doesn’t exist:
t->block("Can't get recipe that doesn't exist", t => {
let params = Js.Dict.empty()
params->Js.Dict.set("id", "Not a Recipe"->Js.Json.string)
let result = params->Controller.getRecipe
let json = result->Js.Json.stringifyAny->Belt.Option.getUnsafe
let expected = `{"error":"unable to find that recipe"}`
t->equal(json, expected, "get recipe should match input")
})
Again, I’ll leave the similar case for testing getTag as an exercise.
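If you want a starting point, a sketch of that getTag test might look like the following; it mirrors the getRecipe test and assumes the "tag not found" error string from the controller shown earlier:
// Sketch of the getTag sad path; the expected error string comes from the getTag
// controller in Controller.res.
t->block("Can't get tag that doesn't exist", t => {
  let params = Js.Dict.empty()
  params->Js.Dict.set("tag", "Not a Tag"->Js.Json.string)
  let result = params->Controller.getTag
  let json = result->Js.Json.stringifyAny->Belt.Option.getUnsafe
  let expected = `{"error":"tag not found"}`
  t->equal(json, expected, "getTag should return an error")
})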
Don't let your tests cross abstraction layer boundaries
I made a conscious decision not to mock Store.res. Notice, however, that I never call Store.Reducer.getState anywhere in my Controller tests. There is no technical reason not to check the contents of the state after, say, adding a recipe. However, it isn't a good separation of concerns. We know that the reducer is doing its job because we tested it already in the last article. The tests at the controller "layer" should only be interacting with that layer. If you start coupling your tests to multiple layers, you'll end up with fragile tests that are annoying to maintain without much benefit.
The title of this article is an intentionally misleading “Correctly unit testing express…”. The correct way to unit test an express application is to not test express. Split as much of your logic out to testable units as possible, and leave testing the express plumbing to the express developers and your integration tests.
This is a philosophical discussion, so I won't go any deeper. All discussions of unit testing devolve into philosophical discussions at some point…
But for the love of whatever you hold holy, do not run an HTTP server on a TCP socket just to unit test your code!
Conclusion
I’m getting bored with testing. I intended to do another article on testing the graphql resolvers, but I don’t think doing so will teach us anything new about Zora. The resolver functions are already independent from the GraphQLExpress plumbing, so you can test them in a very similar way to what we did in this article.
In the next article, I’m going to port the controllers over to use a less ungainly json encoding/decoding mechanism. See you then!