# help
b
Hey everyone. How do I upload multiple files at the same time with `lakectl`? I have files in the `data` folder and I want them uploaded to my repo at `lakefs://example@master/data`. But I always get an error when I do it like this on Git Bash:
a
Hi Bex, not sure you can do this with `lakectl fs`. I usually use `aws s3 cp`. Can you use that? If you can specify a use-case then let's open an issue.
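If it helps, an `aws s3 cp` through the lakeFS S3 gateway looks roughly like this; the endpoint URL below is just a placeholder for wherever your installation is served, and the repo and branch appear as bucket and prefix:
```
# Copy the local "data" folder into data/ on the master branch of the
# "example" repo, going through the lakeFS S3 gateway.
# Assumes your AWS CLI credentials are set to your lakeFS access key/secret.
aws s3 cp ./data s3://example/master/data/ \
    --recursive \
    --endpoint-url https://lakefs.example.com
```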
Would the CLI syntax at https://github.com/treeverse/lakeFS/pull/977 be suitable for you? Note that you do need to repeat the `-s` flag before each argument; however, this is probably a feature, given the trickiness of the way the `cp` command handles its arguments.
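To sketch what that would look like with the destination URI you used above (file names are placeholders, and the exact shape isn't final since the PR isn't merged):
```
# Hypothetical PR #977 syntax: repeat -s once per local file, all uploaded
# under the destination path on the branch.
lakectl fs upload lakefs://example@master/data/ \
    -s data/file1.csv \
    -s data/file2.csv
```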
An alternative PR, https://github.com/treeverse/lakeFS/pull/979, would provide a `--recursive` flag, which is often more convenient.
b
Thanks for your answers, I will try them.
a
Great! Please note that these are merely pull requests for now. If you need to try one in a hurry, please contact me; I shall pull one of them in today, and then we should be able to give you an executable sooner.
I pulled https://github.com/treeverse/lakeFS/pull/979, which adds a `--recursive` flag to `lakectl upload`. Please let me know if you need it quickly, and I'll do a small release with it.
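Once you have a build with it, the whole folder should go up in one call, roughly like this (the URI and `--source` flag follow the existing `lakectl fs upload` usage; treat the exact spelling as tentative until the release):
```
# Upload everything under the local data/ directory to data/ on the
# master branch of the "example" repo.
lakectl fs upload lakefs://example@master/data/ --source ./data --recursive
```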
b
Hey, thanks for the effort! But I still need a little more help: when I run the command with `--recursive`, it gives me an unknown command error.
Do I have to download and extract the binaries for the pull request to take effect in my repos?
a
I haven't released it yet. We'll release a version today. If you need it before then and you're comfortable building from source, clone the repo and run `make build`.
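For reference, that is roughly the following (assumes Git, Go and the other lakeFS build prerequisites are installed; the build includes `lakectl`):
```
# Clone the lakeFS repository and build it, lakectl included.
git clone https://github.com/treeverse/lakeFS.git
cd lakeFS
make build
```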
b
I think I will wait for the release, thanks.
Will I be able to upload multiple files at once?
a
@Barak Amar released this (thanks!): https://lakefs.slack.com/archives/C017S6YFFSP/p1607011495024000. This supports `--recursive`. If you want multiple files, I would recommend using the `aws s3 cp` client or even the `mc` client from MinIO; both are very full-featured S3 clients.
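For the `mc` route, a minimal sketch, assuming the same placeholder gateway endpoint and your lakeFS key pair (newer `mc` versions use `alias set`, older ones `config host add`):
```
# Register the lakeFS S3 gateway under an alias, then copy the folder recursively.
mc alias set lakefs https://lakefs.example.com <ACCESS_KEY_ID> <SECRET_ACCESS_KEY>
mc cp --recursive ./data lakefs/example/master/data/
```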
b
Hey, Ariel Shaqed. The new feature is working perfectly well!