Around four months ago, I shared a story of how we were trying to solve the problem of sharing code for Anzu. As we are building both our frontend and backend with TypeScript, we thought it should be easy enough to share parts between both sides, from types to functions. This road has been far bumpier than we expected, and today I’ll go into more detail about how we solved the issue in the end.
The Status Quo
While all our code lives in a monorepo managed with pnpm workspaces, sharing code as libraries has its downsides. To access any shared code, it has to be built first, which means compiling the entire backend. That only takes a couple of seconds, but it’s still painful.
Once you actually start importing code, you’ll quickly notice that a single import statement can pull in backend modules with native dependencies that cannot run on the web, or modules that lean heavily on Node.js core modules like crypto. Ugh.
We tried finding a middle way using type-only imports, but those don’t work for enums, so we had to move all enums into one dependency-free file. This was terrible for ergonomics and made the codebase harder to understand, because types that belonged together were now awkwardly split in half, but at least the frontend could use our backend types. Rebuilding the backend still took way too long, and we had to reload the TypeScript language server for new types to become available in our IDEs, but we tolerated these downsides when the alternative was no sharing at all.
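To see why enums were the sticking point, here’s a minimal sketch (the import path is hypothetical): import type is erased at compile time, which is exactly what we want for interfaces, but an enum also exists as a value at run time, so referencing one of its members requires a regular import, dragging the backend module graph right back in.

import type { IEvent, EventKind } from '../../backend/src/events';

function isWebhookCreated(event: IEvent): boolean {
  // Error TS1361: 'EventKind' cannot be used as a value because it
  // was imported using 'import type'. Enums exist at run time, so a
  // type-only import is not enough here.
  return event.kind === EventKind.WebhookCreated;
}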
Fast-forward four months and imports started to break again, as some subtree contained Node.js core modules. I stepped back and looked at our situation, trying to figure out a path to improving our developer experience.
What do we need for Anzu?
At Anzu, we strive to move ridiculously fast. We’re still very early in our journey and may have to pivot lots of times down the road until we find a course that works for us (i.e. reaching product/market fit). This means that we’re still iterating on our core data model and infrastructure primitives.
One fact about building software is that the more indirections we add, the harder it gets to change things. In later stages this may be desirable, especially when trying to decouple systems and have teams work independently. This doesn’t work for us right now, though, so we decided against rolling a full GraphQL setup with code generation or picking RPC-like frameworks with similar features. It’s not off the table that we’ll adopt GraphQL eventually, but at the moment, we have other priorities and little time on our hands.
So from this perspective, we wanted a tool to help us share backend types with the frontend and, if possible, with other TypeScript services and libraries like the Node.js SDK and the service powering our hosted auth page.
Crafting the perfect tool
Not so long ago, we worked on lots of code generation for a very early version of Anzu. I remembered how easy it was to parse source files and work at the AST level to perform static code analysis, so we used the same idea for our new type-sharing system.
Taking this idea one step further, if we had a way to analyze our code, we could collect all types and enums that should be shared and write them to a file in each respective service. And that’s exactly what our new system does!
Given an interface or enum definition, how exactly do we know to make it available to other services? We decided to require shared types to be exported and to carry a specific JSDoc annotation, @api-public. This way, everyone immediately knows whether a type is meant for public distribution or for internal use, and we don’t have to deal with arcane type prefixes.
// @api-public
export interface IWebhookDeliveryContent extends IEvent {
  // event scope
  scope: {
    workspaceId: string;
    environmentId: string;
  };

  // delivery details
  delivery: {
    webhookId: string;
    deliveryId: string;
  };
}

// @api-public
export interface IEvent {
  id: string;
  kind: EventKind;
  payload: unknown;
  createdAt: string;
  version: string;
}

// @api-public
export enum EventKind {
  WebhookCreated = 'webhook.created',
  WebhookEnabled = 'webhook.enabled',
  UserIdentityCreated = 'user_management.user_identity_created',
  UserIdentitySuspended = 'user_management.user_identity_suspended',
  UserIdentityUnsuspended = 'user_management.user_identity_unsuspended',
  UserIdentityDeleted = 'user_management.user_identity_deleted'
}
We used ts-morph, a TypeScript compiler API wrapper that adds some convenient helper methods and operates fully in-memory.
import { Project, Node } from 'ts-morph';
import { resolve } from 'path';
import { writeFileSync } from 'fs';

// Load the backend project; we only care about the source files
// themselves, so we can skip resolving file dependencies.
const project = new Project({
  tsConfigFilePath: resolve('../../services/backend/tsconfig.json'),
  skipFileDependencyResolution: true
});

let output = '// THIS FILE WAS AUTO-GENERATED, DO NOT EDIT IT MANUALLY\n';

// A declaration is shared if it is exported and carries the
// @api-public annotation in a leading comment.
function shouldInclude(n: Node): boolean {
  if (!Node.isExportGetable(n)) {
    return false;
  }
  if (!n.hasExportKeyword()) {
    return false;
  }
  return n
    .getLeadingCommentRanges()
    .some(c => c.getText().includes('@api-public'));
}

for (const source of project.getSourceFiles()) {
  // Only scan our own backend sources, not node_modules or build output.
  const filePath = source.getFilePath().toString();
  if (!filePath.includes('services/backend/src')) {
    continue;
  }

  const interfaces = source.getInterfaces();
  for (const i of interfaces) {
    if (!shouldInclude(i)) {
      continue;
    }
    const printed = i.getText();
    output += `\n${printed}\n`;
  }

  const enums = source.getEnums();
  for (const e of enums) {
    if (!shouldInclude(e)) {
      continue;
    }
    const printed = e.getText();
    output += `\n${printed}\n`;
  }

  const typeAliases = source.getTypeAliases();
  for (const t of typeAliases) {
    if (!shouldInclude(t)) {
      continue;
    }
    const printed = t.getText();
    output += `\n${printed}\n`;
  }
}

// Write the collected declarations into every consuming service and library.
const outputPaths = [
  resolve('../../services/app/src/generated/backend/index.ts'),
  resolve('../../services/auth-hosted-page/src/generated/backend.ts'),
  resolve('../../libs/anzu-sdk-node/src/generated/api.ts')
];

for (const outputPath of outputPaths) {
  writeFileSync(outputPath, output);
  console.log('Generated types at ' + outputPath);
}
That’s it, that’s all the code we need to collect relevant interfaces, enums, and type aliases from our backend source code and write them to target files.
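On the consuming side there’s nothing special to do: the frontend simply imports from the generated file as if the types had always lived there. A quick sketch (the import path and function are just for illustration):

import { EventKind, IWebhookDeliveryContent } from './generated/backend';

function describeDelivery(event: IWebhookDeliveryContent): string {
  if (event.kind === EventKind.WebhookCreated) {
    return `Webhook ${event.delivery.webhookId} was just created`;
  }
  return `Received ${event.kind} for delivery ${event.delivery.deliveryId}`;
}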
With our new type-sharing system, it takes less than a second to synchronize all services and libraries interested in our backend types. We’re also no longer sharing any internal types, and definitely no modules that depend on the Node.js core. No more broken builds or huge bundle sizes due to imports that were meant for types only. And if we forget to mark some inner type as public, the generated file won’t compile, so we notice immediately!
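For instance, if IEvent were missing its annotation, the generated file would reference a name that was never emitted, and every consumer would fail to type-check:

// generated output if IEvent lacked the @api-public annotation
export interface IWebhookDeliveryContent extends IEvent {
  // ...
}
// error TS2304: Cannot find name 'IEvent'.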
There’s no such thing as a free lunch
Obviously, this straightforward solution has a few limitations. Since we’re using ts-morph, we can easily read and print TypeScript node definitions (interfaces, enums, type aliases, and more), but that’s all we can do. If we had to provide types for services in different languages, we would need to create a translation layer. Not impossible at all, but definitely additional work.
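To give a rough idea of what such a translation layer could look like (a sketch, not something we’ve built), ts-morph exposes enough structure to emit, say, Go constants from a string enum:

import { EnumDeclaration } from 'ts-morph';

// Sketch: translate a TypeScript string enum into a Go const block.
// Assumes all members have string initializers.
function enumToGo(e: EnumDeclaration): string {
  const lines = e.getMembers().map(
    m => `\t${e.getName()}${m.getName()} = "${m.getValue()}"`
  );
  return `const (\n${lines.join('\n')}\n)`;
}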
The second downside is related to our requirement of full flexibility: we don’t get full type safety, since nothing forces the backend to return the exact same type our frontend expects. This is an important convention we have to uphold ourselves; it’s just not codified.
Lastly, our tool operates at the type-system level, so we cannot access and share dynamic values generated at run time. Instead, we have to serve these values at run time, which is probably better than sharing static code at build time, as it elegantly avoids the question of timing releases on that end. Of course, we still need to make sure our backend and frontend types are aligned, but using a monorepo removes most of that complexity in this case.
What’s next?
As with most tools, there are still some potential opportunities we can look at. Translating types to other languages would be interesting, as we could then share types with our CLI written in Go and other upcoming SDKs. For safety purposes, we could also introduce a CI step that blocks a PR from being merged whenever the source types changed but the generated files weren’t regenerated, which should take care of drift between our services.
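A minimal sketch of such a check, assuming the generator were refactored to expose its output and target paths (generateOutput and outputPaths are hypothetical names):

import { readFileSync } from 'fs';
import { generateOutput, outputPaths } from './generate-types';

// Regenerate in memory and compare against the committed files; any
// mismatch means a shared type changed without re-running the generator.
const expected = generateOutput();
for (const path of outputPaths) {
  if (readFileSync(path, 'utf8') !== expected) {
    console.error(`${path} is out of date, re-run the generator and commit the result.`);
    process.exit(1);
  }
}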
If you found this interesting, make sure to check out Anzu and stay tuned for more updates. To build the best product possible, we’re venturing into adjacent areas of computing, building the tools and infrastructure we need. It’s almost funny to think of the high impact theoretical computer science and other fields have on our day-to-day software engineering experience.