# help
Neyu Smithus
Hi everyone! :) I'm from ArtemisTek (ay@artemistek.com), and we're looking for an alternative to DVC that can support large (100+ GB) files for data versioning. lakeFS looks like a gold mine šŸ‘šŸ’Æ Could someone kindly confirm the following, please?
• Can we work with unstructured data (videos, images, .wav files, ...)?
• Can we work in closed-loop server configurations? We use both Azure and local options. Example of a closed loop: a client's cameras transmit a video stream -> it passes through a commutator switch -> a GPU server runs inference, then stores and versions the large data (on lakeFS).
• Can we actually store the files somewhere on lakeFS? Is there a limit on file sizes?
Thank you very much for your time! šŸ™
Hey @Neyu Smithus, welcome aboard 😃 Happy to confirm:
1. lakeFS works with any type of data, including unstructured files such as videos, images, and .wav files.
2. lakeFS works in a closed-loop server setup. lakeFS is S3-compatible, so any use case that works with S3/MinIO will work with lakeFS.
3. The data is not stored on lakeFS itself; it is stored on a storage service of your choice (such as S3, Azure Blob Storage, or GCS). lakeFS manages the metadata, which is also saved on your object store, so file-size limits are effectively those of the underlying object store.
For further information, please see our docs and the Architecture Overview. I would be happy to answer any other questions. Thanks for reaching out!
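To make point 2 concrete, here is a minimal sketch (not an official lakeFS example) of writing a large file through the lakeFS S3 gateway with boto3. The endpoint URL, repository name (`camera-data`), branch (`main`), file path, and credentials are all placeholder assumptions; in the S3 gateway, the repository acts as the bucket and the branch is the first segment of the object key.

```python
# Sketch: upload a large unstructured file via the lakeFS S3 gateway.
# All names and credentials below are placeholders for illustration.
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client(
    "s3",
    endpoint_url="https://lakefs.example.com",  # your lakeFS server (assumption)
    aws_access_key_id="AKIA...",                # lakeFS access key
    aws_secret_access_key="...",                # lakeFS secret key
)

# boto3 switches to multipart upload above this threshold,
# which is what you want for 100+ GB video/audio files.
config = TransferConfig(multipart_threshold=64 * 1024 * 1024)

s3.upload_file(
    "inference_output.wav",            # local file produced by the GPU server
    "camera-data",                     # lakeFS repository (acts as the bucket)
    "main/audio/inference_output.wav", # branch + path within the repo
    Config=config,
)
```

Because the gateway speaks plain S3, the same approach works from any S3 SDK or from `aws s3 cp --endpoint-url ...`, which is what makes the closed-loop GPU-server setup workable without internet access to a public cloud.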