# help
u
I just tried to use the AWS CLI to list the files in lakeFS, "aws s3 --profile lakefs --endpoint-url http://127.0.0.1/api/v1 ls s3://myexample/main/", but got an error "maximum recursion depth exceeded in comparison". Can anyone give me a clue about this?
u
Try adding
--max-depth
and setting it to a high number. Be aware, though, that this might take some time to complete.
u
My mistake, there is no such option. See this, though.
u
Hey @donald @Shimi Wieder! A couple of things worth looking at:
1. You should remove
/api/v1
from the endpoint (that path is used for the OpenAPI endpoint, not the S3 gateway).
2. I imagine you're not running lakeFS on port 80; the default is port 8000, so the correct S3 gateway address is likely
http://127.0.0.1:8000
instead of
http://127.0.0.1/api/v1
Hope this is helpful!
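If it helps, the end-to-end setup would look something like this (a sketch, assuming your lakeFS access key pair already exists; the profile name lakefs just follows the docs):
# store the lakeFS key pair under a dedicated AWS CLI profile
aws configure --profile lakefs
# enter your lakeFS access key ID and secret access key at the prompts, then:
aws s3 --profile lakefs --endpoint-url http://127.0.0.1:8000 ls s3://myexample/main/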
u
I still got the same error after changing the endpoint to "http://127.0.0.1:8000"
u
I can successfully run the lakeFS CLI to list files: "lakectl fs ls lakefs://aifactory/main/"
u
First, happy to hear that the lakectl command worked. I understand that your repo is
aifactory
? In that case you should use it in the S3 URI (you can read more here).
u
Yes, the repo is aifactory. I have followed the AWS CLI for lakeFS documentation step by step, and I still got that error.
u
So can you please try using this S3 URI:
s3://aifactory/main/
?
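For reference, the full listing command should look something like this (same profile and endpoint as above):
aws s3 --profile lakefs --endpoint-url http://127.0.0.1:8000 ls s3://aifactory/main/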
u
If I use the AWS CLI for Linux, I can successfully list the files. It seems that the AWS CLI for Windows has something wrong.
u
But I have another problem. It seems that the AWS CLI does not support batch copy when I run the command "aws --profile lakefs --endpoint-url http://127.0.0.1:8000 s3 cp s3://aifactory/main ./". Do you know how I can check out all the data under the main branch?
u
I have figured out how to copy the whole folder by just adding "--recursive"
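In case it helps anyone else, the working command looks like this (same profile and endpoint as before):
aws --profile lakefs --endpoint-url http://127.0.0.1:8000 s3 cp s3://aifactory/main/ ./ --recursive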
u
Thanks for the update! In case it's relevant, we also have the capability to export data from lakeFS.
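For example, one common way to pull a whole branch out is rclone pointed at the S3 gateway. A minimal sketch, assuming a remote named lakefs configured with the same credentials (the remote name and local path are just placeholders):
# ~/.config/rclone/rclone.conf
[lakefs]
type = s3
provider = Other
endpoint = http://127.0.0.1:8000
access_key_id = <your lakeFS access key ID>
secret_access_key = <your lakeFS secret access key>
# sync the main branch of the aifactory repo to a local directory
rclone sync lakefs:aifactory/main ./aifactory-main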