I’ve been using TypeScript and Node.js as a key part of my stack for more than half of the last decade. In that time, many tools have come and gone, but one problem has stuck around: sharing code.
In many cases, you’ll be working on multiple services, across frontend and backend, which should (in the best case) share types and logic to make your life easier. There are multiple strategies to make this possible, but I have yet to find a solution that really works smoothly.
As with everything in software engineering, we have to weigh the benefits and drawbacks. For my projects, I value velocity and simplicity over most other things, but for other teams, a structured workflow, strict versioning, and other criteria may be more important. Because of this, my analysis is biased toward my personal needs and preferences, so keep that in mind.
Numerous guides and blog posts have expressed the benefits of using monorepos, meaning storing all code in a single repository in a version control system. This doesn’t have to include all of a company’s code; it could be used for the complete application code, or just for backend and frontend code separately.
The closer the code of one system resides to the code of other systems, the easier it should be to share parts, at least in theory. Other workflows like code reviews, multi-service changes, documentation, CI, and deployment also get radically simpler when moving to a monorepo approach.
Now, one idea could be to share our code by publishing a package to npm or GitHub Packages. This would give us versioning out of the box and could be used to distribute code to other teams or make it available publicly, but there’s a downside: every time you want to perform a change, you have to build, package, and push a new version of the package. This just isn’t feasible in the early days of a product.
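As a sketch of what this flow involves, every shared module becomes a real package with its own manifest. All names here (the scope, the file layout) are hypothetical:

```json
{
  "name": "@acme/shared",
  "version": "1.2.0",
  "main": "dist/index.js",
  "types": "dist/index.d.ts",
  "files": ["dist"],
  "scripts": {
    "build": "tsc -p tsconfig.json"
  }
}
```

Each change then means something like `npm version patch`, `npm run build`, and `npm publish`, followed by reinstalling the new version in every consumer — exactly the friction described above.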
Two important use cases for public packages are SDKs and developer tools, for which versioning becomes important again. As a developer, you wouldn’t want to rely on an ever-changing codebase that might just break at any point in time, so consistency and quality matter much more when interacting with external parties than with internal teams.
Another way to share code using the package system is to use package manager features like pnpm workspaces or Lerna. These tools can help you link packages locally, without involving a registry. This saves the push step, but you’ll still have to build, and some bundlers don’t handle hot-reloading well when a dependency changes without an install step.
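To make the workspace flow concrete, here is a minimal sketch (package names and directory layout are assumptions, not a recommendation). A pnpm workspace is declared once at the repository root:

```yaml
# pnpm-workspace.yaml — every directory matching these globs becomes a linked package
packages:
  - "packages/*"
  - "services/*"
```

A consuming service then declares the dependency with the `workspace:` protocol, e.g. `"@acme/shared": "workspace:*"` in its package.json, and pnpm symlinks the local package instead of fetching it from a registry.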
Another downside is that, for every library, you have to do the initial work of configuring your build setup, linting, formatting, and other steps. This can probably be simplified with templates, but maintaining multiple internal libraries still adds overhead.
I would go for locally-shared libraries once I have a sense of which modules are needed and want to create stricter boundaries. This could work well when multiple teams are involved, each having ownership over their part of the system.
The easiest way to share code across services is when you don’t have to cross project boundaries at all. This flow does not work when you need to share code across frontend and backend, but you can already make your backend setup much simpler, especially if all services use the same underlying language and framework.
This is a flow I’ve refined over time: currently, I simply create a single directory/package that contains different code entrypoints for different services. Everything else, including database handling, interfaces to external systems, monitoring, logging, and so on, can be reused across all services.
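A minimal sketch of this single-package flow, with one shared helper and two service entrypoints; all names (the services "api" and "worker", the `createLogger` helper) are illustrative, not from a real project:

```typescript
// One package, several entrypoints — everything below could live in a
// single directory and be compiled once, with each service started via
// its own entry file.

type Config = { serviceName: string; logLevel: "info" | "debug" };

// shared/logger.ts — identical code reused by every service
function createLogger(cfg: Config) {
  return (msg: string) => `[${cfg.serviceName}] ${msg}`;
}

// entrypoints/api.ts — started with e.g. `node dist/entrypoints/api.js`
function startApi(): string {
  const log = createLogger({ serviceName: "api", logLevel: "info" });
  return log("listening on :3000");
}

// entrypoints/worker.ts — same shared code, different entrypoint
function startWorker(): string {
  const log = createLogger({ serviceName: "worker", logLevel: "info" });
  return log("queue consumer started");
}

console.log(startApi());    // [api] listening on :3000
console.log(startWorker()); // [worker] queue consumer started
```

Because everything sits in one package, there is nothing to version, build, or install between services; a change to the shared helper is immediately visible to every entrypoint.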
Refactoring is easy as well, as you can immediately change all occurrences across your project and don’t have to worry about updating code in places you didn’t think about. On top of that, you’re dealing with consistent package versions, so you won’t run into nasty bugs arising from a dependency version mismatch.
I’ve been lucky to work with Go a lot in the past, so I was able to explore a different style of sharing code and handling modules. With Go modules, you can easily group code under a module, which can be used remotely when pushed as a public repository, or used internally by adding the replace directive in your go.mod file. This is similar to using pnpm workspaces, and Go even has its own notion of workspaces now, which makes sharing code even easier.
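As a sketch, the replace directive points a dependency at a sibling directory instead of a remote repository (the module paths here are made up):

```
// go.mod of the consuming service
module example.com/acme/api

go 1.22

require example.com/acme/shared v0.0.0

// use the local checkout instead of resolving the module remotely
replace example.com/acme/shared => ../shared
```

With Go workspaces, a go.work file created via `go work init ./api ./shared` achieves the same linking without editing each go.mod.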
A big part of why sharing code in Go feels so much easier is that there are fewer variables: as long as you use the same language version across modules, you won’t run into weird issues. You don’t have to worry about compiling modules first, and there are no competing package managers, bundlers, build tools, or other distractions that keep you away from meaningful work.
Go modules have their own share of downsides, but all things considered, nothing gets in your way when you want to share code, and I value that. Refactoring is easy even though your code may live in different places, and the need to avoid cyclic dependencies pushes you toward a sound architecture.
In the end, it all boils down to your needs. Do you value velocity and ease of changing code across your entire codebase? Then you may prefer fewer build steps, version upgrades, and other barriers, making a monolithic setup attractive. Do you need to publish a package for your customers? They probably would not tolerate breaking changes and opaque versioning, so using a package registry seems like a fit.
I think Node.js is still not at a point where reasoning about code sharing and dependency management is where it should be (the breaking transition from CommonJS to ESM is a great example).
I really don’t want to think about different module systems, build steps, and other headache-inducing inconsistencies. As a developer, I would like to build, ship, and run my work as fast and with as little friction as possible.