Interested in hearing opinions about
https://github.com/treeverse/lakeFS/issues/1481 -- how to deliver protobuf changes to the Spark client and other customer code. So far I see 3 possibilities:
1. Do nothing: keep copying proto files by hand until we actually get tooling for it. That likely means the Spark client never supports any new features (because we forget to copy the protos over) until a customer runs into a missing field and complains.
2. Move the proto files to a third git repo and pull them in with git submodules. Known to work, but I've never seen it used happily. Probably the lowest friction for Treeverse developers.
3. Build proto packages inside the lakefs repo (or a new proto-only repo) for each language and publish them to many different package repositories (Maven Central, Go modules via GitHub, probably PyPI and npmjs at some stage). Adds maximal friction to core development, since every proto change needs a publish step per language; a rough sketch of the consumer side follows below.
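To make option 3 a bit more concrete, here's a minimal sketch of what the consumer side could look like in the Spark client's build, assuming we published a jar of pre-generated protobuf classes to Maven Central. The coordinates ("io.lakefs" % "lakefs-proto" % "0.1.0") are placeholders I made up, not anything that exists today:

```scala
// build.sbt (Spark client) -- sketch only, the artifact below is hypothetical.
// Assumes a "lakefs-proto" jar of pre-generated protobuf classes, published
// to Maven Central by CI in the lakefs (or proto-only) repo.
libraryDependencies += "io.lakefs" % "lakefs-proto" % "0.1.0"
```

The upside is that picking up new proto fields becomes a one-line version bump in each consumer; the downside is exactly the publish-per-language overhead it puts on core development.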
Happy to read your comments on the issue or here.
Thanks!