Go’s Major Versioning Sucks – From a Fanboy

I’m normally a fan of the rigidity within the Go toolchain. In fact, we use Go on the frontend and backend at Qvault. It’s wonderful to have standardized formatting, vetting, and testing across the entire language. The first real criticism I’ve had is with the way Go modules handle major versions: it’s over-the-top opinionated and slows down development in a significant number of scenarios.

Refresher on “Go Mod”

Go modules, and the associated commands go mod and go get, can be thought of as Go’s equivalents to NPM and Yarn. The Go toolchain provides a way to manage dependencies and lock the versions that a collection of code depends on.

One of the most common operations is updating the dependencies of an existing module. For example:

# update all dependencies
go get -u ./...

# add missing and remove unused dependencies
go mod tidy

# save all dependency code in the project's "vendor" folder
go mod vendor
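
Those commands read and write a go.mod file at the root of the module, which pins each dependency to a specific version. A minimal go.mod (the module path and version numbers here are just placeholders) looks something like this:

module github.com/example/myservice

go 1.15

require (
    github.com/google/uuid v1.1.2
    github.com/lib/pq v1.8.0
)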

Semantic Versioning

Go modules use git tags and semantic versioning to keep track of the versions of dependencies that are compatible with the module in question. Semantic versioning is a way to format version numbers and it looks like this: v{MAJOR}.{MINOR}.{PATCH}. For example, v1.2.3.

Each number is to be incremented according to the following standards:

MAJOR version when you make incompatible API changes,
MINOR version when you add functionality in a backwards compatible manner, and
PATCH version when you make backwards compatible bug fixes.
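
In practice, releasing a new version of a Go module is just a matter of pushing a git tag, and consumers can pin to it directly (the module path and tag here are made up):

# cut a backwards-compatible feature release
git tag v1.3.0
git push origin v1.3.0

# consumers can then request that exact version
go get github.com/example/mylib@v1.3.0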

What’s the Problem?

When new versions of dependencies are released we have a simple command to get the newest stuff: go get -u. The problem is that this command has no way to automatically update to a new major version. It will only download new minor changes and patches. There isn’t even a console message to inform you that a new major version exists!
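
To sketch the behavior (the versions here are hypothetical): say your go.mod requires a dependency at v1.4.0, and the maintainers have since tagged both v1.5.2 and v2.0.0. Running go get -u ./... leaves you here:

// before
require github.com/example/mylib v1.4.0

// after go get -u ./... (the v2.0.0 tag exists, but is never fetched or even mentioned)
require github.com/example/mylib v1.5.2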

That said, the reason for not auto-updating is clear, and to be fair, well-founded:

If an old package and a new package have the same import path, the new package must be backwards compatible with the old package.

Import compatibility rule

In other words, we should only increment major versions when making breaking changes, and if breaking changes are made we shouldn’t be downloading them without knowing exactly what we are doing. Seems logical.

I Want Semantic Versioning, but Go Slows Me Down

It is often the case that I want to build a package that has domain-specific logic and will only be used in services at the small company I work for. For example, we have a repo that holds the struct definitions for common entities used across our system.

Occasionally we need to make backward-incompatible changes to those struct definitions. If it were an open-source library we wouldn’t make changes so often, but because it’s internal and we are aware of all the dependencies, we change the names of fields and keep things updated with business logic. This means major version changes are a fairly regular occurrence.
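
As a contrived sketch (not our real entities), a field rename like this breaks every service that references the old name, so strictly speaking it calls for a major version bump:

// v1 of the shared entities package
type Order struct {
    CustomerID string
    Total      int
}

// v2: this compiles fine here, but breaks every importer still using CustomerID
type Order struct {
    CustomerUUID string
    Total        int
}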

The problem is that Go makes updating major versions so cumbersome that in the majority of cases, we have opted to just increment minor versions when we should increment major versions. We want to follow the proper versioning scheme, we just don’t want to add unnecessary steps to our dev process.

How Is It Cumbersome?

On the Go Blog, the answer is laid out via an example:

To start development on v2 of github.com/googleapis/gax-go, we’ll create a new v2/ directory and copy our package into it.

In other words, for every major version, we are encouraged to maintain a new copy of the entire codebase. On the client side of the API we have even more inconveniences:

Users who wanted to use v2 had to change their package imports and module requirements to github.com/googleapis/gax-go/v2.

In other words, instead of a few simple CLI commands to get the latest dependencies, we also have to grep through the codebase and update all of our import paths.
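
Concretely, moving a consumer to a new major version looks something like this today (the module path is a placeholder, and sed’s in-place flag differs between GNU and BSD/macOS):

# rewrite every import path in the codebase
grep -rl --include='*.go' 'github.com/example/entities' . \
    | xargs sed -i 's|github.com/example/entities|github.com/example/entities/v2|g'

# then pull the new major version and clean up go.mod
go get github.com/example/entities/v2
go mod tidy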

Hey – I Get It

I understand why these decisions were made, and I even think in a lot of cases they were great decisions. For any open-source or public-facing module this makes great sense. The Go toolchain is enforcing strict rules that encourage good API design.

In their effort to make public APIs great, they made it unnecessarily hard to have good “local” package design.

What I Wish Was the Case

If I could have it my way, I would only change a couple things:

  • Major versions on the package side should just use git tags like minor and patch versions. None of this copying the entire codebase nonsense.
  • Major versions should not be specified in the import path, just in the go.mod and go.sum files.
  • When running an update command like go get -u, there should be a prompt to ask “do you want to fetch major version updates?”

Barring this, can we at the very least add a warning to go get that informs us when a new major version exists? That would be nice.

Go still has the best toolchain and ecosystem. NPM and PIP can suck it.

If you disagree, @ me on Twitter.

Thanks For Reading!

Follow us on Twitter @q_vault if you have any questions or comments

Take some coding courses on our new platform

Subscribe to our Newsletter for more programming articles



