# help
c
Hi everyone, I am running the lakefs binary directly on my computer. I downloaded the lakeFS binary for my OS, installed and configured PostgreSQL and created a configuration file. While I can compare and merge successfully between two branches, I can't merge between main and a branch. Any recommendations on this?
👀 1
b
Hi, could you send me the lakeFS log lines from the time of the merge? Also, can you check whether the file ..2572 (the one specified in the error) exists at that location?
c
Here's another one. The file in the specified error is present at the location (for both errors)
b
Can you use Explorer or the command line to go to <home>/data/lakefs/cache and check whether there is a folder named 'bucket-id'?
If there isn't one, create the folder and retry.
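The check-and-create step above can be sketched on the command line as follows. This assumes `<home>` from the chat means your home directory (`$HOME`); `bucket-id` is the folder name mentioned in the thread.

```shell
# Ensure the lakeFS local cache folder mentioned above exists.
# $HOME stands in for the "<home>" placeholder from the chat.
CACHE_DIR="$HOME/data/lakefs/cache/bucket-id"
mkdir -p "$CACHE_DIR"   # creates the folder (and parents) if missing; no-op otherwise
ls -ld "$CACHE_DIR"     # confirm it is now present
```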
c
the bucket-id and my-bucket folders already exist
b
Thanks for checking. I will try to reproduce it on a Windows environment. Are you using the latest version of lakeFS?
👍 1
c
Yes, I am
b
thanks, I'll try to update soon.
👍 1
Managed to reproduce the state and capture the information into this issue: https://github.com/treeverse/lakeFS/issues/2531
Will address this bug during the following week and update the issue.
c
Great! Thank you @Barak Amar
Hello @Barak Amar, the new release solved the error. I was wondering, is there any way for me to integrate any of the data frameworks like Trino while running lakeFS locally?
y
Hi @Chidumga, I'm glad to hear!
You can absolutely integrate any S3-compatible framework with lakeFS, both locally and in production, and that includes Trino.
Let me refer you to the documentation.
c
Great, that would be helpful. I had tried setting my storage namespace to my s3 bucket, but I got an error saying only local:// is allowed.
y
If you are running lakeFS with a "local" blockstore type, then your storage namespace should indeed start with "local://". Essentially it means that your data would be ephemeral and not persisted in the cloud, only on your local disk. You can still use Trino over such an installation. Do you think that can work for you or would you like to persist your data in S3?
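For reference, a minimal configuration for a local-blockstore lakeFS installation (of the kind described above, backed by PostgreSQL) might look like the sketch below. All values are illustrative placeholders, not taken from this thread:

```yaml
# config.yaml -- minimal sketch of a lakeFS local-blockstore setup.
# The connection string and paths are assumptions; adjust to your environment.
database:
  connection_string: "postgres://user:password@localhost:5432/postgres?sslmode=disable"
blockstore:
  type: local            # with this type, storage namespaces must start with local://
  local:
    path: ~/data/lakefs  # where object data is kept on your local disk
```

With this configuration, data lives only on the local disk and is not persisted to the cloud, matching the trade-off described above.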
c
I want to try using Trino and other data frameworks over local installation.
y
Great, then you can use a local storage namespace. You will then configure Trino on top of lakeFS as described in this doc: https://docs.lakefs.io/integrations/presto_trino.html
Wherever the docs say to use lakefs.example.com, use your local lakeFS server address instead (usually localhost:8000).
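As a rough sketch of what that substitution looks like, a Trino catalog file pointing the Hive connector at a local lakeFS S3 gateway could resemble the following. The metastore URI, connector name, and credentials are placeholders, assumptions for illustration only:

```properties
# etc/catalog/lakefs.properties -- illustrative values only
connector.name=hive-hadoop2
hive.metastore.uri=thrift://localhost:9083   # your Hive metastore
hive.s3.endpoint=http://localhost:8000       # local lakeFS server instead of lakefs.example.com
hive.s3.aws-access-key=<lakeFS access key id>
hive.s3.aws-secret-key=<lakeFS secret access key>
hive.s3.path-style-access=true
```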
c
Great! Thank you
y
We also have a very interesting blog post about how to set up the entire Trino stack on top of lakeFS, in a single docker-compose. I highly recommend trying this out: https://lakefs.io/the-docker-everything-bagel-spin-up-a-local-data-stack/