# help
d
Hello, I've recently deployed lakeFS with an AWS RDS Postgres instance for the database and an S3 bucket backing the files. I ran into an issue trying to log back into the instance after the pod went down. I'm routed by default to `/auth/login`, and I'm trying to use the Access Key ID and Secret Access Key that lakeFS gave me the first time we configured it. When I enter the access and secret keys I get a runtime error: `invalid memory address or nil pointer dereference`. I looked through the documentation and found this, but nothing regarding needing any type of Persistent Volume for the configuration to recover on next startup. We're also using these charts for deployment. I assumed the user information for login would be stored in the Postgres instance; is it actually stored in local files? Has anyone run into a similar situation?
n
Hi @David! Sorry to hear that. Can you provide some more information? Specifically, can you share the modified values file you used and the lakeFS logs?
d
Hi, I can provide more details from the lakeFS logs, but the error is happening in a disconnected environment, so pasting them is not easy. What would help the most? Here are the env values we are currently overriding:
```yaml
- name: LAKEFS_DATABASE_POSTGRES_CONNECTION_STRING
  valueFrom:
    secretKeyRef:
      key: LAKEFS_SQL_CONNECTION_STRING
      name: secrets
- name: LAKEFS_AUTH_ENCRYPT_SECRET_KEY
  valueFrom:
    secretKeyRef:
      key: ENV_LAKEFS_AUTH_ENCRYPT_SECRET_KEY
      name: secrets
- name: ENV_LAKEFS_POSTGRES_USER
  valueFrom:
    secretKeyRef:
      key: ENV_LAKEFS_POSTGRES_USER
      name: secrets
- name: ENV_LAKEFS_POSTGRES_PASSWORD
  valueFrom:
    secretKeyRef:
      key: ENV_LAKEFS_POSTGRES_PASSWORD
      name: secrets
```
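(For context, the `secrets` object those `secretKeyRef`s point to is just a standard Kubernetes Secret, roughly shaped like the sketch below — key names copied from the env overrides above, values redacted/illustrative.)
```yaml
# Rough shape of the Secret the env overrides reference.
# Key names match the secretKeyRef entries above; all values are placeholders.
apiVersion: v1
kind: Secret
metadata:
  name: secrets
type: Opaque
stringData:
  LAKEFS_SQL_CONNECTION_STRING: "postgres://<user>:<password>@<rds-endpoint>:5432/<db>"
  ENV_LAKEFS_AUTH_ENCRYPT_SECRET_KEY: "<auth-encrypt-secret>"
  ENV_LAKEFS_POSTGRES_USER: "<user>"
  ENV_LAKEFS_POSTGRES_PASSWORD: "<password>"
```
As long as this Secret keeps the same values across pod restarts, the keys lakeFS reads at startup shouldn't change.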
We also use this config.yaml:
```yaml
data:
  config.yaml: |-
    blockstore:
      type: s3
      s3:
        region: <>
    database:
      type: postgres
    committed:
      block_storage_prefix: <>
```
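(If it helps, my understanding is that the two secret-backed env overrides above are just the environment form of these config keys — `LAKEFS_AUTH_ENCRYPT_SECRET_KEY` maps to `auth.encrypt.secret_key`, and `LAKEFS_DATABASE_POSTGRES_CONNECTION_STRING` maps to `database.postgres.connection_string` — so the equivalent config.yaml fragment would look roughly like this sketch, with placeholder values.)
```yaml
# Sketch of the same settings expressed directly in config.yaml instead of env vars.
# Values are placeholders; our real deployment keeps them in the Secret above.
auth:
  encrypt:
    secret_key: "<auth-encrypt-secret>"
database:
  type: postgres
  postgres:
    connection_string: "postgres://<user>:<password>@<rds-endpoint>:5432/<db>"
```
As far as I understand, `auth.encrypt.secret_key` has to stay the same value across restarts for previously created credentials to keep working.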
What is common across my local instance and my isolated env is that when I port-forward to the login page and use the creds I created the instance with, I get:
```
time="2024-08-29T21:49:24Z" level=info msg="Failed to authenticate user" func=pkg/auth.ChainAuthenticator.AuthenticateUser file="-v80.1/lakeFS/pkg/auth/authenticator.go:54" error="2 errors occurred:\n\t* built in authenticator: invalid secret access key\n\t* email authenticator: not found: no rows in result set\n\n" host="localhost:8080" log_audit=API method=POST operation_id=Login path=/api/v1/auth/login request_id=8cb2096c-7d59-4cd0-b023-f1e6f14ac529 service_name=rest_api username=<>
time="2024-08-29T21:49:24Z" level=error msg=authenticate func=pkg/api.userByAuth file="-v80.1/lakeFS/pkg/api/auth_middleware.go:236" error="2 errors occurred:\n\t* built in authenticator: invalid secret access key\n\t* email authenticator: not found: no rows in result set\n\n" service=api_gateway user=<>
```
n
Hi David - in order to assist you we will need the full lakeFS logs. What I am most worried about is the `invalid memory address or nil pointer dereference` error you mentioned. If you can provide logs for this scenario, that would be great.
d
Hi, this is the issue I'm seeing:
```
2024/08/30 15:20:12 http: panic serving 10.42.26.147:50620: runtime error: invalid memory address or nil pointer dereference
goroutine 840 [running]:
net/http.(*conn).serve.func1()
	/usr/local/go/src/net/http/server.go:1854 +0xbf
panic({0x196a640, 0x3935940})
	/usr/local/go/src/runtime/panic.go:890 +0x263
github.com/treeverse/lakefs/pkg/auth.(*BuiltinAuthenticator).AuthenticateUser(0xc009bcb840?, {0x2ab6b30?, 0xc001061830?}, {0xc009bcb840?, 0xc0001a3b60?}, {0xc009bcb855, 0x28})
	/home/ec2-user/lakefs-v80.1/lakeFS/pkg/auth/authenticator.go:98 +0x94
github.com/treeverse/lakefs/pkg/auth.ChainAuthenticator.AuthenticateUser({0xc000618140, 0x1, 0x7f153b3a75b8?}, {0x2ab6b30, 0xc001061830}, {0xc009bcb840, 0x14}, {0xc009bcb855, 0x28})
	/home/ec2-user/lakefs-v80.1/lakeFS/pkg/auth/authenticator.go:47 +0x1cb
github.com/treeverse/lakefs/pkg/api.userByAuth({0x2ab6b30, 0xc001061830}, {0x2ac80b0, 0xc000594510}, {0x2aa8120?, 0xc000750888?}, {0x2aced38, 0xc0006f0180}, {0xc009bcb840, 0x14}, ...)
	/home/ec2-user/lakefs-v80.1/lakeFS/pkg/api/auth_middleware.go:234 +0x8a
github.com/treeverse/lakefs/pkg/api.checkSecurityRequirements(0xc000849e00, {0xc001101940, 0x4, 0xc000b2b600?}, {0x2ac80b0?, 0xc0007302b0?}, {0x2aa8120, 0xc000750888}, {0x2aced38, 0xc0006f0180}, ...)
	/home/ec2-user/lakefs-v80.1/lakeFS/pkg/api/auth_middleware.go:98 +0x3af
github.com/treeverse/lakefs/pkg/api.AuthMiddleware.func1.1({0x2ab3cd0, 0xc0005c0140}, 0xc000849e00)
	/home/ec2-user/lakefs-v80.1/lakeFS/pkg/api/auth_middleware.go:49 +0x16c
net/http.HandlerFunc.ServeHTTP(0x1aad9e0?, {0x2ab3cd0?, 0xc0005c0140?}, 0xc?)
	/usr/local/go/src/net/http/server.go:2122 +0x2f
github.com/treeverse/lakefs/pkg/httputil.DefaultLoggingMiddleware.func1.1({0x2ab5980?, 0xc000c34700}, 0xffffffffffffff01?)
	/home/ec2-user/lakefs-v80.1/lakeFS/pkg/httputil/logging.go:76 +0x665
net/http.HandlerFunc.ServeHTTP(0xc000849c00?, {0x2ab5980?, 0xc000c34700?}, 0xc00937bb70?)
	/usr/local/go/src/net/http/server.go:2122 +0x2f
github.com/treeverse/lakefs/pkg/api.OapiRequestValidatorWithOptions.func1.1({0x2ab5980, 0xc000c34700}, 0xc000849b00)
	/home/ec2-user/lakefs-v80.1/lakeFS/pkg/api/serve.go:176 +0x2d1
net/http.HandlerFunc.ServeHTTP(0xc009acd050?, {0x2ab5980?, 0xc000c34700?}, 0xc0004ef3f8?)
	/usr/local/go/src/net/http/server.go:2122 +0x2f
github.com/go-chi/chi/v5.(*ChainHandler).ServeHTTP(0x194ce60?, {0x2ab5980?, 0xc000c34700?}, 0xc000e91835?)
	/home/ec2-user/go/pkg/mod/github.com/go-chi/chi/v5@v5.0.0/chain.go:31 +0x2c
github.com/go-chi/chi/v5.(*Mux).routeHTTP(0xc000d811a0, {0x2ab5980, 0xc000c34700}, 0xc000849b00)
	/home/ec2-user/go/pkg/mod/github.com/go-chi/chi/v5@v5.0.0/mux.go:436 +0x1f9
net/http.HandlerFunc.ServeHTTP(0x2ab6a88?, {0x2ab5980?, 0xc000c34700?}, 0x3935390?)
	/usr/local/go/src/net/http/server.go:2122 +0x2f
github.com/go-chi/chi/v5.(*Mux).ServeHTTP(0xc000d811a0, {0x2ab5980, 0xc000c34700}, 0xc000849a00)
	/home/ec2-user/go/pkg/mod/github.com/go-chi/chi/v5@v5.0.0/mux.go:87 +0x32a
github.com/treeverse/lakefs/cmd/lakefs/cmd.glob..func8.3({0x2ab5980, 0xc000c34700}, 0xc000a83ad0?)
	/home/ec2-user/lakefs-v80.1/lakeFS/cmd/lakefs/cmd/run.go:353 +0x102
net/http.HandlerFunc.ServeHTTP(0xc000e91868?, {0x2ab5980?, 0xc000c34700?}, 0x77d794?)
	/usr/local/go/src/net/http/server.go:2122 +0x2f
net/http.serverHandler.ServeHTTP({0x2aae950?}, {0x2ab5980, 0xc000c34700}, 0xc000849a00)
	/usr/local/go/src/net/http/server.go:2936 +0x316
net/http.(*conn).serve(0xc000e98510, {0x2ab6b30, 0xc0005bd410})
	/usr/local/go/src/net/http/server.go:1995 +0x612
created by net/http.(*Server).Serve
	/usr/local/go/src/net/http/server.go:3089 +0x5ed
```
I was able to get it to occur with an invalid `LAKEFS_AUTH_ENCRYPT_SECRET_KEY`. Is there a good way to recover the value from the configuration at all?
n
Can you please paste here the first log lines of the lakeFS server you ran?
d
```
time="2024-08-30T15:14:08Z" level=info msg="Configuration file" func=<http://github.com/treeverse/lakefs/cmd/lakefs/cmd.initConfig|github.com/treeverse/lakefs/cmd/lakefs/cmd.initConfig> file="/home/ec2-user/lakefs-v80.1/lakeFS/cmd/lakefs/cmd/root.go:61" fields.file=/etc/lakefs/config.yaml file="/home/ec2-user/lakefs-v80.1/lakeFS/cmd/lakefs/cmd/root.go:61" phase=startup
time="2024-08-30T15:14:08Z" level=info msg="Config loaded" func=cmd/lakefs/cmd.initConfig file="-v80.1/lakeFS/cmd/lakefs/cmd/root.go:103" fields.file=/etc/lakefs/config.yaml file="-v80.1/lakeFS/cmd/lakefs/cmd/root.go:103" phase=startup
time="2024-08-30T15:14:08Z" level=info msg=Config func=cmd/lakefs/cmd.initConfig file="-v80.1/lakeFS/cmd/lakefs/cmd/root.go:111" actions.enabled=true auth.api.endpoint="" auth.api.supports_invites=false auth.api.token="" auth.cache.enabled=true auth.cache.jitter=3s auth.cache.size=1024 auth.cache.ttl=20s auth.encrypt.secret_key="******" auth.logout_redirect_url=/auth/login auth.oidc.authorize_endpoint_query_parameters="map[]" auth.oidc.callback_base_url="" auth.oidc.client_id="" auth.oidc.client_secret="" auth.oidc.default_initial_groups="[]" auth.oidc.enabled=false auth.oidc.friendly_name_claim_name="" auth.oidc.initial_groups_claim_name=initial_groups auth.oidc.is_default_login=false auth.oidc.url="" auth.oidc.validate_id_token_claims="map[]" blockstore.azure.auth_method=access-key blockstore.azure.storage_access_key="" blockstore.azure.storage_account="" blockstore.azure.try_timeout=10m0s blockstore.default_namespace_prefix="" blockstore.gs.credentials_file="" blockstore.gs.credentials_json="" blockstore.gs.s3_endpoint="<https://storage.googleapis.com>" blockstore.local.path="~/data/lakefs/block" blockstore.s3.credentials_file="" blockstore.s3.discover_bucket_region=true blockstore.s3.endpoint="" blockstore.s3.force_path_style=false blockstore.s3.max_retries=5 blockstore.s3.profile="" blockstore.s3.region=us-gov-west-1 blockstore.s3.streaming_chunk_size=1048576 blockstore.s3.streaming_chunk_timeout=1s blockstore.type=s3 committed.block_storage_prefix=lakefs-int committed.local_cache.dir="~/data/lakefs/cache" committed.local_cache.max_uploaders_per_writer=10 committed.local_cache.metarange_proportion=0.1 committed.local_cache.range_proportion=0.9 committed.local_cache.size_bytes=1073741824 committed.permanent.max_range_size_bytes=20971520 committed.permanent.min_range_size_bytes=0 committed.permanent.range_raggedness_entries=50000 committed.sstable.memory.cache_size_bytes=400000000 database.connection_max_lifetime=0s database.connection_string=------ database.drop_tables=false database.dynamodb.aws_access_key_id=------ database.dynamodb.aws_region="" database.dynamodb.aws_secret_access_key=------ database.dynamodb.endpoint="" database.dynamodb.read_capacity_units=0 database.dynamodb.scan_limit=0 database.dynamodb.table_name=kvstore database.dynamodb.write_capacity_units=0 database.max_idle_connections=0 database.max_open_connections=0 database.postgres.connection_max_lifetime=0s database.postgres.connection_string="******" database.postgres.max_idle_connections=0 database.postgres.max_open_connections=0 database.type=postgres email.burst=10 email.lakefs_base_url="<http://localhost:8000>" email.limit_every_duration=1m0s email.local_name="" email.password="" email.sender="" email.smtp_host="" email.smtp_port=0 email.use_ssl=false email.username="" fields.file=/etc/lakefs/config.yaml file="-v80.1/lakeFS/cmd/lakefs/cmd/root.go:111" gateways.s3.domain_name="[<http://s3.local.lakefs.io|s3.local.lakefs.io>]" gateways.s3.fallback_url="" gateways.s3.region=us-east-1 installation.fixed_id="" listen_address="0.0.0.0:8000" logging.audit_log_level=DEBUG logging.file_max_size_mb=0 logging.files_keep=100 logging.format=text logging.level=INFO logging.output="[-]" logging.trace_request_headers=false phase=startup security.audit_check_interval=12h0m0s security.audit_check_url="<https://audit.lakefs.io/audit>" stats.address="<https://stats.treeverse.io>" stats.enabled=true stats.flush_interval=30s ui.enabled=true ui.snippets="[]"
time="2024-08-30T15:14:08Z" level=info msg="lakeFS run" func=cmd/lakefs/cmd.glob..func8 file="-v80.1/lakeFS/cmd/lakefs/cmd/run.go:111" version=dev-d1873a0d.with.local.changes
time="2024-08-30T15:14:08Z" level=info msg="KV valid" func=pkg/kv.ValidateSchemaVersion file="-v80.1/lakeFS/pkg/kv/migration.go:59" version=4
time="2024-08-30T15:14:08Z" level=info msg="initialized Auth service" func=pkg/auth.NewKVAuthService file="-v80.1/lakeFS/pkg/auth/service.go:201" service=auth_service
time="2024-08-30T15:14:13Z" level=info msg="initialize blockstore adapter" func=pkg/block/factory.BuildBlockAdapter file="-v80.1/lakeFS/pkg/block/factory/build.go:40" type=s3
time="2024-08-30T15:14:13Z" level=info msg="initialized blockstore adapter" func=pkg/block/factory.buildS3Adapter file="-v80.1/lakeFS/pkg/block/factory/build.go:108" type=s3
time="2024-08-30T15:14:13Z" level=info msg="initialize blockstore adapter" func=pkg/block/factory.BuildBlockAdapter file="-v80.1/lakeFS/pkg/block/factory/build.go:40" type=s3
time="2024-08-30T15:14:13Z" level=info msg="initialized blockstore adapter" func=pkg/block/factory.buildS3Adapter file="-v80.1/lakeFS/pkg/block/factory/build.go:108" type=s3
time="2024-08-30T15:14:14Z" level=error msg="Audit check failed" func="pkg/version.(*AuditChecker).CheckAndLog" file="-v80.1/lakeFS/pkg/version/audit.go:106" check_url="<https://audit.lakefs.io/audit>" error="audit check request failed: 424 Failed Dependency (Status code: 424)" version=dev-d1873a0d.with.local.changes
time="2024-08-30T15:14:14Z" level=info msg="initialize OpenAPI server" func=pkg/api.Serve file="-v80.1/lakeFS/pkg/api/serve.go:63" service=api_gateway
time="2024-08-30T15:14:14Z" level=info msg="initialized S3 Gateway handler" func=pkg/gateway.NewHandler file="-v80.1/lakeFS/pkg/gateway/handler.go:122" s3_bare_domain="[<http://s3.local.lakefs.io|s3.local.lakefs.io>]" s3_region=us-east-1
time="2024-08-30T15:14:14Z" level=info msg="starting HTTP server" func=cmd/lakefs/cmd.glob..func8 file="-v80.1/lakeFS/cmd/lakefs/cmd/run.go:340" listen_address="0.0.0.0:8000"
     ██╗      █████╗ ██╗  ██╗███████╗███████╗███████╗
     ██║     ██╔══██╗██║ ██╔╝██╔════╝██╔════╝██╔════╝
     ██║     ███████║█████╔╝ █████╗  █████╗  ███████╗
     ██║     ██╔══██║██╔═██╗ ██╔══╝  ██╔══╝  ╚════██║
     ███████╗██║  ██║██║  ██╗███████╗██║     ███████║
     ╚══════╝╚═╝  ╚═╝╚═╝  ╚═╝╚══════╝╚═╝     ╚══════╝
│
│ If you're running lakeFS locally for the first time,
│     complete the setup process at http://127.0.0.1:8000/setup
│
│
│ For more information on how to use lakeFS,
│     check out the docs at https://docs.lakefs.io/quickstart/repository
│
│
time="2024-08-30T15:14:14Z" level=info msg="Up and running (^C to shutdown)..." func=cmd/lakefs/cmd.gracefulShutdown file="-v80.1/lakeFS/cmd/lakefs/cmd/run.go:499" version=dev-d1873a0d.with.local.changes
│ For support or any other question,
│     join our Slack channel https://docs.lakefs.io/slack
│
Version dev-d1873a0d.with.local.changes
```
n
Previously you said you were using a Helm chart for deployment, but from what I see in this log, you are running a dev version with local changes.
d
It's deployed from a Helm chart; I'll track down where the image was sourced from.
n
OK, in any case I suspect that this version is based on an old code base. Please try to reproduce this issue with the latest official release of lakeFS. If it also occurs with the latest release, please let us know.
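For example, pinning the deployment to an official release in the chart values would look roughly like this (value names assumed to follow the public lakeFS Helm chart; the tag is illustrative):
```yaml
# Illustrative values override pinning an official lakeFS image instead of a locally built one.
# image.repository / image.tag are assumed chart values; set the tag to the latest official release.
image:
  repository: treeverse/lakefs
  tag: "<latest-official-release>"
  pullPolicy: IfNotPresent
```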
d
Yes, it's based on 0.80.0 from what I can tell.
Is there a way to retrieve the `LAKEFS_AUTH_ENCRYPT_SECRET_KEY` that lakeFS is using from the database?
n
You won't be able to use the latest chart with this version. Please update your environment to use the latest lakeFS version.