# help
d
Is it possible to run lakeFS locally in Docker, but have the blockstore backed by GCS? I tried
```
docker run --name lakefs -p 8000:8000 -e LAKEFS_BLOCKSTORE_TYPE="gs" -e LAKEFS_DATABASE_TYPE=local -e LAKEFS_AUTH_ENCRYPT_SECRET_KEY="..." -e LAKEFS_BLOCKSTORE_CREDENTIALS_JSON=my-svc-acct.json treeverse/lakefs:latest run
```
and it failed with
```
time="2024-03-14T22:13:22Z" level=warning msg="Failed to get Google numeric project ID from instance metadata" func="pkg/cloud/gcp.(*MetadataProvider).GetMetadata" file="build/pkg/cloud/gcp/metadata.go:23" error="Get \"http://169.254.169.254/computeMetadata/v1/project/numeric-project-id\": dial tcp 169.254.169.254:80: connect: connection refused"
time="2024-03-14T22:13:22Z" level=info msg="initialize blockstore adapter" func=pkg/block/factory.BuildBlockAdapter file="build/pkg/block/factory/build.go:32" type=gs
time="2024-03-14T22:13:22Z" level=fatal msg="Failed to create block adapter" func=cmd/lakefs/cmd.glob..func8 file="cmd/run.go:159" error="dialing: google: could not find default credentials. See https://cloud.google.com/docs/authentication/external/set-up-adc for more information"
```
Judging by the fact that it's attempting to connect to 169.254.169.254, I'm guessing it expects to be running inside GCP?
(And if the answer is "no", is there a way for me to test importing a GCS bucket locally? Using the local blockstore I get an error that import is not enabled for local, and the same for the mem blockstore.)
a
I was able to run it some time back using this command:
```
docker run -d --restart always --pull always -p 8000:8000 -v $PWD:/myfiles -e LAKEFS_BLOCKSTORE_TYPE='gs' -e LAKEFS_BLOCKSTORE_GS_CREDENTIALS_FILE='/myfiles/my-svc-acct.json' --name lakefs-gcp2 treeverse/lakefs run --local-settings
```
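For reference, the same settings can also live in a lakeFS configuration file instead of environment variables. A minimal sketch (the secret and paths are placeholders; mount the file into the container and point `lakefs run` at it with `--config`):

```yaml
# config.yaml -- minimal sketch of a local lakeFS setup backed by GCS
database:
  type: local
auth:
  encrypt:
    secret_key: "some-random-string"   # placeholder; any random secret for local testing
blockstore:
  type: gs
  gs:
    credentials_file: /myfiles/my-svc-acct.json   # path as seen inside the container
```

Note the two differences from the failing command above: the credentials variable is `LAKEFS_BLOCKSTORE_GS_CREDENTIALS_FILE` (which maps to `blockstore.gs.credentials_file`), not `LAKEFS_BLOCKSTORE_CREDENTIALS_JSON`, and the `-v $PWD:/myfiles` mount makes the service-account file visible at a path that actually exists inside the container.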