# help
i
We are looking into connecting Unity Catalog to lakeFS. What's the best way to approach this? I'm trying to wrap my head around how others would know that they should use our custom S3 endpoint to look for the data.
Damn... "Currently, the Unity Catalog export feature exclusively supports AWS S3". Should have checked this earlier.
a
Azure Blob is also supported.
i
Are there any docs on this? The docs I can find say only AWS S3 is supported.
a
I will publish an Azure Databricks sample notebook in our samples repo and will let you know.
Meanwhile you can read this blog to understand the concept: https://lakefs.io/blog/lakefs-unity-catalog-integration-tutorial/
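To make the moving parts concrete, here's a rough sketch of what the post-commit Lua hook in that tutorial boils down to. It's based on the lakeFS catalog-export packages (delta_exporter / unity_exporter); exact module names and function signatures can differ between lakeFS versions, and the args (AWS credentials, table_defs, databricks_host, databricks_token, warehouse_id) come from the hook's action YAML, so treat this as an outline rather than copy-paste:
```lua
-- Sketch of a post-commit Lua hook: export Delta tables, then register them in Unity Catalog.
local aws = require("aws")
local formats = require("formats")
local databricks = require("databricks")
local delta_export = require("lakefs/catalogexport/delta_exporter")
local unity_export = require("lakefs/catalogexport/unity_exporter")

local table_descriptors_path = "_lakefs_tables"

-- Client used to write the exported Delta log to the export bucket.
local sc = aws.s3_client(args.aws.access_key_id, args.aws.secret_access_key, args.aws.region)

-- Export the Delta Lake log of the tables listed in args.table_defs.
local delta_client = formats.delta_client(args.lakefs.access_key_id, args.lakefs.secret_access_key, args.aws.region)
local delta_table_details = delta_export.export_delta_log(action, args.table_defs, sc.put_object, delta_client, table_descriptors_path)

-- Register the exported tables in Unity Catalog via the Databricks client.
local databricks_client = databricks.client(args.databricks_host, args.databricks_token)
local registration_statuses = unity_export.register_tables(action, table_descriptors_path, delta_table_details, databricks_client, args.warehouse_id)

for t, status in pairs(registration_statuses) do
    print("Unity Catalog registration for table \"" .. t .. "\": " .. status .. "\n")
end
```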
i
Is it also possible without Lua hooks?
The reason is that a platform team does the registration of the tables; we don't get the access ourselves to do this.
i
Following our email correspondence on additional features: potentially, in the future, we could run Python code in hooks. Let's discuss on our call! (FYI @Oz Katz)
🤟 1
i
It might also be interesting to connect with the Databricks Unity Catalog product team together?
a
@Ion I released the Azure Databricks sample notebook for the Unity Catalog integration demo. There are two notebooks: Unity Catalog Integration Demo.ipynb and unityCatalogIntegrationDemoSetup.ipynb.
i
@Amit Kesarwani hey, thanks for the material! I went through it and the guide, but I think the setup might not work for us.
• Table registration is done by a different team; we can only tell them what the paths are.
• We are not allowed to use OAuth tokens.
Is it possible to only use the table exporter, not do the registration with the hook, and let this be done by the other team?
Ah, reading the docs, I guess I just need to use the "Delta Lake table exporter" only :)
a
Yes, you can use only the table exporter.
Just comment out the table registration line in the Lua script.
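Roughly, the trimmed hook keeps only the exporter part and comments out the registration calls, something like the sketch below (same caveat as before: module names, signatures, and return values depend on the lakeFS version, so check against the script in the sample):
```lua
-- Trimmed hook: export the Delta log only; Unity Catalog registration is commented out
-- because the platform team registers the exported paths themselves.
local aws = require("aws")
local formats = require("formats")
local delta_export = require("lakefs/catalogexport/delta_exporter")

local table_descriptors_path = "_lakefs_tables"
local sc = aws.s3_client(args.aws.access_key_id, args.aws.secret_access_key, args.aws.region)

local delta_client = formats.delta_client(args.lakefs.access_key_id, args.lakefs.secret_access_key, args.aws.region)
local delta_table_details = delta_export.export_delta_log(action, args.table_defs, sc.put_object, delta_client, table_descriptors_path)

-- Registration step disabled; hand the exported locations to the platform team instead:
-- local databricks = require("databricks")
-- local unity_export = require("lakefs/catalogexport/unity_exporter")
-- local databricks_client = databricks.client(args.databricks_host, args.databricks_token)
-- unity_export.register_tables(action, table_descriptors_path, delta_table_details, databricks_client, args.warehouse_id)

for t, _ in pairs(delta_table_details) do
    print("Exported Delta log for table \"" .. t .. "\"\n")
end
```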
🤟 1