I can use https://s3.{region}.amazonaws....
# help
g
Since lakeFS is a version control tool structured around repositories and branches, that structure is reflected in the path you use to obtain files, rather than accessing an S3 bucket directly.
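For example, a minimal sketch (not from this thread) of how that looks when you point a standard S3 client at the lakeFS S3 gateway; the endpoint, credentials, repository and branch names below are placeholders:

```python
import boto3

# Point the client at the lakeFS server instead of s3.{region}.amazonaws.com,
# and authenticate with lakeFS access keys (not AWS keys).
s3 = boto3.client(
    "s3",
    endpoint_url="https://lakefs.example.com",
    aws_access_key_id="AKIAIOSFODNN7EXAMPLE",
    aws_secret_access_key="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
)

# Plain S3 would be: Bucket="my-bucket", Key="datasets/file.parquet".
# Through the lakeFS gateway, the "bucket" is the repository and the key
# starts with a ref (branch, tag, or commit):
resp = s3.get_object(Bucket="my-repo", Key="main/datasets/file.parquet")
data = resp["Body"].read()
```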
a
Hi @Guangdong Liu (liugddx), in addition to what @Offir Cohen said: this question comes up a lot, so I wrote a blog post that goes into the details of what happens and why. You do not need to read this blog post to use lakeFS! But it might help if you're curious or worried.
Or perhaps I misunderstood your question, and you would like to know your options for reading your data? The easiest, if you are using Python, is the high-level Python SDK; it is like Boto for lakeFS. Otherwise, you can use any of these:
• For Spark, lakeFSFS
• An OpenAPI-generated API client for Java or Python
• lakectl (best for manual usage; I would not recommend calling it from a program)
• The GUI
• The REST API directly, without a generated client -- as you did
• The S3 gateway API, which lets you access objects on lakeFS using any S3 client
If you would like more advice, could you provide more details on your programming environment?
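As an illustration of the first option, here is a minimal sketch (not from this thread) of reading an object with the high-level Python SDK; the repository, branch, and object names are placeholders, and connection details are assumed to come from your lakectl configuration or environment variables:

```python
import lakefs

# Assumes server endpoint and credentials are picked up from ~/.lakectl.yaml
# or LAKECTL_* environment variables; an explicit Client can also be passed.
repo = lakefs.repository("my-repo")

# Address an object by branch (or tag/commit) plus path, then read it.
obj = repo.branch("main").object("datasets/file.parquet")
with obj.reader(mode="rb") as fd:
    data = fd.read()
```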
g
Sorry for the late reply. I'm implementing a Rust SDK for lakeFS. The project is Apache OpenDAL: https://github.com/apache/opendal/tree/main/core/src/services/lakefs.
I originally wanted to use the S3 interface directly to fetch files from lakeFS, but I found that it did not work, so I had to implement a separate lakeFS-specific method instead.
a
We and many of our users use the S3 gateway successfully. Please provide details of what does not work; perhaps we can help?