Yoni Augarten
10/09/2022, 1:20 PM

Vaibhav Kumar
10/09/2022, 1:24 PM
curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.4.1/docker-compose.yaml'
Barak Amar
10/09/2022, 1:27 PM
Create a .env file in the same directory with the values you like:
_PIP_ADDITIONAL_REQUIREMENTS=airflow-provider-lakefs
echo -e "AIRFLOW_UID=$(id -u)" >> .env
based on the docs

Vaibhav Kumar
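Putting the two pieces above together, the setup can be sketched as a short shell snippet (a sketch based on this thread and the Airflow docker-compose quickstart; the docker commands at the end are the standard follow-up and are shown as comments only):

```shell
# Sketch of the .env setup described above (values from this thread).
# _PIP_ADDITIONAL_REQUIREMENTS makes the Airflow containers pip-install
# the lakeFS provider at startup; AIRFLOW_UID avoids permission problems
# on the mounted folders.
workdir="$(mktemp -d)"   # stand-in for the directory holding docker-compose.yaml
cd "$workdir"
printf '%s\n' '_PIP_ADDITIONAL_REQUIREMENTS=airflow-provider-lakefs' > .env
echo "AIRFLOW_UID=$(id -u)" >> .env
cat .env
# then, per the Airflow docs:
#   docker compose up airflow-init
#   docker compose up
```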
10/09/2022, 1:29 PM

Barak Amar
10/09/2022, 1:30 PM

Vaibhav Kumar
10/09/2022, 2:31 PM

Yoni Augarten
10/09/2022, 2:41 PM

Vaibhav Kumar
10/09/2022, 3:04 PM

Yoni Augarten
10/09/2022, 5:09 PM

Vaibhav Kumar
10/09/2022, 5:12 PM
https://github.com/treeverse/airflow-provider-lakeFS/blob/main/lakefs_provider/example_dags/lakefs-dag.py
Yoni Augarten
10/09/2022, 5:13 PM

Vaibhav Kumar
10/09/2022, 5:17 PM

Yoni Augarten
10/09/2022, 5:19 PM

Amit Kesarwani
10/11/2022, 4:21 PM

Vaibhav Kumar
10/11/2022, 5:13 PM

Amit Kesarwani
10/11/2022, 5:34 PM

Vaibhav Kumar
10/14/2022, 2:00 PM
command: AWS_ACCESS_KEY_ID=${{ env.KEY }} AWS_SECRET_ACCESS_KEY=${{ env.SECRET }} aws s3 cp --endpoint-url=http://s3.local.lakefs.io:8000 s3://example-repo/main/path/to/_SUCCESS -
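The command above reads an object straight through lakeFS's S3 gateway. Generalized as a tiny helper (a sketch: the endpoint, repo, and path are the ones from the message; LAKEFS_KEY and LAKEFS_SECRET are hypothetical variable names for your lakeFS credentials):

```shell
# Hedged sketch: read a lakeFS object through the S3 gateway.
# In the gateway's addressing scheme, the "bucket" is the repository and
# the first path segment is the branch, e.g. example-repo/main/path/to/_SUCCESS.
read_lakefs_object() {
  local repo_branch_path="$1"
  AWS_ACCESS_KEY_ID="$LAKEFS_KEY" AWS_SECRET_ACCESS_KEY="$LAKEFS_SECRET" \
    aws s3 cp --endpoint-url=http://s3.local.lakefs.io:8000 \
    "s3://${repo_branch_path}" -
}
# usage: read_lakefs_object example-repo/main/path/to/_SUCCESS
```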
Niro
10/15/2022, 4:11 PM

Vaibhav Kumar
10/15/2022, 4:17 PM

Niro
10/15/2022, 4:22 PM

Vaibhav Kumar
10/15/2022, 5:20 PM

Niro
10/15/2022, 5:24 PM
…the main branch. Once this happens, the copy command succeeds and we continue to the next step.
You can follow the lakeFS DAG code to understand the flow better.
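The "wait until the file appears, then continue" step can be sketched as a generic retry loop (an illustration only, not the provider's actual sensor; the probe command is passed in as arguments so the aws/lakeFS specifics stay out of the loop):

```shell
# Hedged sketch of the wait-then-continue flow: retry a probe command
# until it succeeds or the attempts run out.
wait_for_object() {
  local tries="$1"; shift
  local i
  for i in $(seq "$tries"); do
    if "$@"; then
      return 0          # probe succeeded: the object exists, move on
    fi
    sleep 1             # object not there yet: back off and retry
  done
  return 1              # gave up
}
# hypothetical usage against the gateway from the thread:
#   wait_for_object 30 aws s3 ls --endpoint-url=http://s3.local.lakefs.io:8000 \
#     s3://example-repo/main/path/to/_SUCCESS
```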