Getting Fastify Reply Hooks to Work With Next.js
Next.js exposes Node.js request handler functions that you can use to wrap Next.js with your own server. This is useful because you get to leverage the development patterns of established frameworks like Fastify while using Next.js as the rendering layer. However, it comes with some challenges, which I will address below.
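Concretely, the handler I'm referring to comes from Next.js's custom server API. The setup looks roughly like this:

import next from "next";

const app = next({ dev: process.env.NODE_ENV !== "production" });
const nextRequestHandler = app.getRequestHandler();

// Once prepare() resolves, nextRequestHandler(req, res) can serve any
// Next.js page or API route against raw Node.js request/response objects.
await app.prepare();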
One of the main reasons why I wanted to wrap Next.js with Fastify is to use the middleware pattern for things like authentication, error handling and logging. Next.js doesn't have Node.js-compatible middleware APIs. As of writing this, its middleware feature is only compatible with its edge runtime. So if you are stuck with Node.js, like many of us are at work, the only way to add common functionality to all routes is by wrapping each page and API route with a higher-order function, or by writing a webpack transform to apply the function at build time.
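To make that concrete: without middleware, every route has to opt in by hand, something like this (withLogging is a hypothetical helper, not a Next.js API):

import type { NextApiHandler } from "next";

// Hypothetical higher-order function that bolts logging onto a single route.
const withLogging =
  (handler: NextApiHandler): NextApiHandler =>
  async (req, res) => {
    console.log(`${req.method} ${req.url}`);
    return handler(req, res);
  };

// pages/api/hello.ts — every API route has to remember to wrap itself.
export default withLogging(async (req, res) => {
  res.status(200).json({ ok: true });
});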
Unfortunately, Fastify reply hooks like onResponse do not get called if you implement the wrapping naively:
fastify.all("*", async (request, reply) => {
await nextRequestHandler(request.req, reply.res);
reply.sent = true;
});
This is because Next.js manipulates the Node.js response stream directly, bypassing the Fastify reply pipeline and, with it, the reply hook logic.
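Concretely, a hook like the one below never fires for requests served by that naive wrapper, because the response never flows through reply:

fastify.addHook("onResponse", async (request, reply) => {
  // Never reached when Next.js writes to reply.res directly.
  request.log.info({ statusCode: reply.statusCode }, "request completed");
});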
The way I solved this is to pass Next.js an intermediary Duplex stream and then pass the readable part of the stream back into Fastify's reply APIs. However, Fastify needs to know the response status code and headers upfront, before you pass it a readable body stream, so I had to write some code to separate the response headers into promises and the response body into a readable stream. In the end I have something like this:
fastify.all("*", async (request, reply) => {
const response = OfflineResponse.from(request.req);
nextRequestHandler(request.req, response.raw);
reply
.status(await response.statusCode)
.headers(await response.headers)
.send(response.body);
});
OfflineResponse is the container object that holds the intermediary ServerResponse object from the Node.js http module. It is passed to Next.js via response.raw.
Normally you would instantiate ServerResponse with a socket, but since sockets can be duck-typed as writable streams, I instantiated ServerResponse with a custom Transform stream called HttpResponseStream. This is probably the hackiest part of the implementation, as we don't know for sure whether a socket and a writable stream are completely interchangeable (TypeScript doesn't seem to think so, at least). I made this inference by reading parts of the ServerResponse implementation.
import http from "node:http";

export class OfflineResponse {
  private responseStream: HttpResponseStream;

  constructor(public raw: http.ServerResponse) {
    this.responseStream = new HttpResponseStream();
    raw.assignSocket(
      // Writable stream can be duck-typed as a socket.
      this.responseStream as any
    );
    // ...
  }

  static from(req: http.IncomingMessage) {
    return new this(new http.ServerResponse(req));
  }

  // ...
}
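The elided parts wire the parsed status, headers and body up to the public API. A rough sketch of that wiring, using a small deferred-promise helper (the names here are illustrative, not the exact implementation):

// Illustrative deferred-promise helper: hands out a promise that the
// HTTP parser callbacks can resolve later.
class Deferred<T> {
  promise: Promise<T>;
  resolve!: (value: T) => void;
  constructor() {
    this.promise = new Promise((resolve) => {
      this.resolve = resolve;
    });
  }
}

export class OfflineResponse {
  // ...
  get statusCode(): Promise<number> {
    return this.responseStream.statusCode; // resolved once headers are parsed
  }
  get headers(): Promise<Record<string, string>> {
    return this.responseStream.headers;
  }
  get body(): NodeJS.ReadableStream {
    return this.responseStream; // the readable side of the Transform stream
  }
}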
In the Transform stream that has disguised itself as a socket, I parse the incoming HTTP response to get the HTTP status and headers as promises (response.statusCode and response.headers), and I stream the body into the readable side of the Transform stream (response.body). I use these 3 parts to then construct the Fastify reply.
reply
  .status(await response.statusCode)
  .headers(await response.headers)
  .send(response.body);
I use the http-parser-js library to incrementally parse the HTTP response inside the Transform stream to ensure I'm not buffering huge responses.
this.httpParser = new HTTPParser(HTTPParser.RESPONSE);

// Resolve the header and statusCode promises once they are parsed.
this.httpParser[HTTPParser.kOnHeadersComplete] = (result) => {
  this._headers.resolve(headerListToObject(result.headers));
  this._statusCode.resolve(result.statusCode);
};

// Push the body into the readable side of the Transform stream.
this.httpParser[HTTPParser.kOnBody] = (data, start, len) => {
  this.push(data.slice(start, start + len));
};
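Putting it together, the stream looks roughly like this (simplified, with error handling omitted; headerListToObject is a small helper, shown here with an illustrative implementation):

import { Transform, TransformCallback } from "node:stream";
import { HTTPParser } from "http-parser-js";

// http-parser-js yields headers as a flat [name, value, name, value, ...] list.
function headerListToObject(list: string[]): Record<string, string> {
  const headers: Record<string, string> = {};
  for (let i = 0; i < list.length; i += 2) {
    headers[list[i]] = list[i + 1];
  }
  return headers;
}

export class HttpResponseStream extends Transform {
  private httpParser = new HTTPParser(HTTPParser.RESPONSE);
  // Deferred is the helper sketched earlier.
  private _statusCode = new Deferred<number>();
  private _headers = new Deferred<Record<string, string>>();

  statusCode = this._statusCode.promise;
  headers = this._headers.promise;

  constructor() {
    super();
    this.httpParser[HTTPParser.kOnHeadersComplete] = (result) => {
      this._headers.resolve(headerListToObject(result.headers));
      this._statusCode.resolve(result.statusCode);
    };
    this.httpParser[HTTPParser.kOnBody] = (data, start, len) => {
      this.push(data.slice(start, start + len));
    };
  }

  // ServerResponse thinks it's writing to a socket; every chunk it writes
  // lands here and is fed to the incremental parser.
  _transform(chunk: Buffer, _encoding: string, callback: TransformCallback) {
    this.httpParser.execute(chunk);
    callback();
  }
}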
There are other approaches I found on the internet, such as writing a Proxy for ServerResponse to intercept Next.js's calls to the response object and forward them to Fastify. I haven't explored this approach, but one trade-off seems to be that it would be easier to miss edge cases, since you need to make sure you exhaustively proxy every API that could potentially affect the final response.
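For illustration, that approach would look something like the following (untested, and not what I ship):

import http from "node:http";

// Illustrative Proxy-based interception. The hard part is coverage: writeHead,
// setHeader, write, end, flushHeaders and friends all have to be handled, and
// any API you miss is a potential edge case.
function proxyResponse(
  res: http.ServerResponse,
  onStatus: (statusCode: number) => void
) {
  return new Proxy(res, {
    get(target, prop) {
      if (prop === "writeHead") {
        return (...args: any[]) => {
          onStatus(args[0]); // report the status code back to Fastify
          return (target.writeHead as any)(...args);
        };
      }
      const value = (target as any)[prop];
      return typeof value === "function" ? value.bind(target) : value;
    },
  });
}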
For what it's worth, my current approach seems to work well. After a couple of years in use, we haven't had many issues across the 80+ applications we currently manage at work.
One other caveat to note is that this approach breaks WebSockets, since the Transform stream doesn't properly handle WebSocket responses. This is problematic for the webpack HMR WebSocket, so I've had to make an exception for HMR requests:
fastify.all("/_next/webpack-hmr", async (request, reply) => {
await nextRequestHandler(request.req, reply.res);
reply.sent = true;
});
It's pretty annoying that I've had to implement these workarounds just to get Node.js middleware capabilities into Next.js. Hopefully the folks over at Vercel address this issue one day.