  • Mustafa Abdullah K. — 5 months ago
    I'm doing the sync with the following steps: create a branch, add all files, merge the main branch.
    12 replies
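A hypothetical sketch of that sync flow using lakectl (the repo name, branch name, and paths below are illustrative assumptions, not taken from the message):

```
# Illustrative lakectl workflow; replace my-repo/sync-branch/paths with real values
# 1. create a branch off main
lakectl branch create lakefs://my-repo/sync-branch --source lakefs://my-repo/main
# 2. upload the files to the new branch
lakectl fs upload --recursive --source ./data/ lakefs://my-repo/sync-branch/data/
# 3. commit the changes
lakectl commit lakefs://my-repo/sync-branch -m "sync data"
# 4. merge the branch back into main
lakectl merge lakefs://my-repo/sync-branch lakefs://my-repo/main
```

Note the commit step between upload and merge: only committed changes are merged.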
  • Tristan van der vlugt — 5 months ago
    Hi all, I'm trying to get a simple instance of lakeFS running on Azure Container Instances. I've got a container running lakeFS, using the command ["tail", "-f", "/dev/null"] to keep it running, with the environment variables set and the port set to 8000. How can I access the lakeFS interface as shown in https://docs.lakefs.io/setup/create-repo.html? What other components need to be set up? I'd appreciate any help I can get on this matter.
    23 replies
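For reference, a minimal lakeFS configuration sketch for an Azure blockstore (all values are illustrative placeholders; it assumes a reachable PostgreSQL instance, and that the container's entrypoint actually runs `lakefs run` rather than a placeholder command, so the UI is served on port 8000):

```yaml
# Illustrative lakeFS config — placeholder values, not working credentials
listen_address: "0.0.0.0:8000"
database:
  connection_string: "postgres://user:password@postgres-host:5432/lakefs?sslmode=disable"
auth:
  encrypt:
    secret_key: "some-random-string"
blockstore:
  type: azure
  azure:
    storage_account: "<storage-account-name>"
    storage_access_key: "<storage-account-key>"
```

With the server process running, the setup page would then be reachable at the container group's public IP on port 8000.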
  • Vino — 5 months ago
    Hi everyone! I'm trying to set up lakeFS locally and I'm puzzled as to how to move forward. Here's what I've done so far:
    1. Installed lakeFS using Docker Compose (following the steps in the documentation: https://docs.lakefs.io/quickstart/installing.html).
    2. Verified that I'm able to access the lakeFS UI at http://127.0.0.1:8000/setup.
    3. Created a new repo through the UI, using local as the storage namespace.
    4. Tried to access the repo programmatically (Python) instead of through the UI. Following the docs (https://docs.lakefs.io/integrations/python.html), I installed lakefs-client (v0.63.0), ran the snippet below, and hit this error:
    lakefs_client.exceptions.UnauthorizedException: (401)
    What am I missing? Isn't specifying the username and password sufficient to connect to a lakeFS instance running on Docker?
    17 replies
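Worth noting here: the credentials lakefs-client expects are the access key id and secret access key generated at first-time setup (not a UI display name), and the host should include the `/api/v1` suffix. A stdlib-only sketch of the Basic auth header the client ends up sending (the key values below are made-up placeholders):

```python
import base64

# Placeholder credentials: in lakeFS these are the access key id and secret
# access key shown once at first-time setup, not an arbitrary username/password.
access_key_id = "AKIAIOSFODNN7EXAMPLE"
secret_access_key = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"

# The client is pointed at the API root, including the /api/v1 suffix.
host = "http://127.0.0.1:8000/api/v1"

# The client authenticates with HTTP Basic auth over the key pair; a 401
# usually means the wrong pair was supplied (e.g. a UI login instead of keys).
token = base64.b64encode(f"{access_key_id}:{secret_access_key}".encode()).decode()
auth_header = f"Basic {token}"
```

If the header is built from the generated key pair and the host ends in `/api/v1`, the 401 should go away.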
  • Vino — 5 months ago
    Hi, for lakeFS running in a Docker instance, what is the file path one should use to access files (using Python)? I tried local://demo-repo/main/example.csv and lakefs://demo-repo/main/example.csv, but neither works. My storage namespace is local.
    18 replies
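For context, lakeFS addresses an object as repository/ref/path; the `local://` storage namespace is where the server stores data, not an address clients use. A tiny helper (hypothetical, not part of any lakeFS client) showing the `lakefs://` form:

```python
def lakefs_object_uri(repo: str, ref: str, path: str) -> str:
    """Build a lakeFS object address of the form lakefs://<repo>/<ref>/<path>.

    The same repo/ref/path triple is what client APIs typically take as
    separate arguments; the storage namespace never appears in the address.
    """
    return f"lakefs://{repo}/{ref}/{path}"

print(lakefs_object_uri("demo-repo", "main", "example.csv"))
# → lakefs://demo-repo/main/example.csv
```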
  • Jude — 5 months ago
    Hi all, I'm currently trying to automate data ingestion using Airflow. Basically, I want to use Spark to read a sample database and then write it to my lakeFS S3 bucket. Any advice on how I can achieve this?
    57 replies
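One common pattern (a sketch, assuming the Spark job talks to lakeFS through its S3-compatible gateway; the host name and credentials below are placeholders) is to point Hadoop's s3a connector at the lakeFS endpoint and address data as `s3a://<repo>/<branch>/<path>`:

```
# Illustrative spark-defaults style properties — placeholder values
spark.hadoop.fs.s3a.endpoint            http://lakefs-host:8000
spark.hadoop.fs.s3a.access.key          <lakefs-access-key-id>
spark.hadoop.fs.s3a.secret.key          <lakefs-secret-access-key>
spark.hadoop.fs.s3a.path.style.access   true
```

With that configuration, a write such as `df.write.parquet("s3a://demo-repo/main/raw/")` lands on the main branch of demo-repo, and an Airflow task can simply submit the Spark job.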
  • Vino — 4 months ago
    Hi all, quick question. I have two files, file_a and file_b, in the main branch. After I create a new branch with main as the source branch, why don't I see the two files under the new branch as well? I thought that would be the expected behavior (like git).
    4 replies
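In lakeFS, as in git, a new branch points at the source branch's head commit, so files committed on main should appear on it; files that were only uploaded but never committed belong to no commit and will not show up on the new branch. A toy model (not lakeFS code) of that distinction:

```python
# Toy model of branch creation: a branch is a pointer to a commit's snapshot.
commits = {"c1": {"file_a", "file_b"}}   # objects committed on main
branches = {"main": "c1"}
uncommitted = {"main": set()}            # uploaded-but-uncommitted objects

def create_branch(name: str, source: str) -> None:
    branches[name] = branches[source]    # share the source's head commit
    uncommitted[name] = set()

def list_objects(branch: str) -> set:
    return commits[branches[branch]] | uncommitted[branch]

create_branch("new-branch", "main")
print(sorted(list_objects("new-branch")))  # → ['file_a', 'file_b']
```

If the two files do not appear, the usual culprit is that they were never committed on main before the branch was created.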
  • donald — 4 months ago
    I am setting up lakeFS with Docker Compose. This Linux machine has an NFS storage mounted and mapped to /data. When I start lakeFS I get this error:
    level=fatal msg="failed to create catalog" func=cmd/lakefs/cmd.glob..func8 file="cmd/run.go:131" error="build block adapter: got error opening a local block adapter with path /home/lakefs: path provided is not writable"
    107 replies
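The error says the local block adapter is defaulting to a path inside the container (/home/lakefs) that isn't writable. A hypothetical docker-compose override (paths and values are illustrative) that mounts the NFS-backed /data into the container and points the adapter at it:

```yaml
# Illustrative override: mount the host's /data and store blocks there
services:
  lakefs:
    environment:
      - LAKEFS_BLOCKSTORE_TYPE=local
      - LAKEFS_BLOCKSTORE_LOCAL_PATH=/data/lakefs
    volumes:
      - /data:/data
```

The target directory also needs to be writable by the user the lakeFS process runs as inside the container.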
  • Shradheya Thakre — 4 months ago
    Hey everyone, I am trying to figure out the best way to test my local code changes. Currently this is what I did:
    1. Added a random log line in service.go which I want to see printed.
    2. Ran make build.
    3. Ran curl https://compose.lakefs.io | docker-compose -f - up and watched the logs.
    4. Ran ./lakectl commit lakefs://actions-stuff/main -m "added webhook again 2" (making sure to use the ./lakectl I just built).
    5. But I don't see the log line in the step 3 output. Not sure if I am missing anything.
    12 replies
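One thing to check (an assumption about the setup above): the `curl https://compose.lakefs.io | docker-compose -f - up` invocation runs the published lakeFS image, not the locally built binary, so a log line added in service.go would never reach those container logs. A sketch of running the freshly built server directly instead (connection string and secret are placeholders):

```
# Illustrative: run the locally built server so the service.go change takes effect
make build
LAKEFS_DATABASE_CONNECTION_STRING="postgres://..." \
LAKEFS_AUTH_ENCRYPT_SECRET_KEY="some-secret" \
LAKEFS_BLOCKSTORE_TYPE="local" \
./lakefs run
```

With the locally built server handling the request, the `./lakectl commit` from step 4 should produce the new log line in its output.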
  • Houssam Kherraz — 4 months ago
    Hi everyone! Now that I've tested some basic functionality with lakeFS, I wanted to ask: is there a beefier image I could pull, apart from the Alpine-based one (used with docker-compose)? The reason I'm asking is that I wanted to run a few commands using rclone/distcp, but it was painful to get things installed using ash and Alpine. It might be good to have a more substantial image with some of the common tools used with lakeFS installed, if it doesn't already exist. Thanks!
    4 replies
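Until such an image exists, one workaround is to extend the published image yourself. A sketch (assuming, per the message above, that the base image is Alpine-based and that the desired tools are available in Alpine's package repositories):

```dockerfile
# Hypothetical Dockerfile extending the lakeFS image with common tools;
# the image may run as a non-root user, hence the explicit USER switch.
FROM treeverse/lakefs:latest
USER root
RUN apk add --no-cache bash curl rclone
```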
  • Jude — 4 months ago
    Hi everyone, is there a way we can set up the BigDL tool on lakeFS?
    16 replies