LakeFS file system error when trying to connect to...
# help
Hi everyone, I'm planning to give a demo next week to some developers in my company, since I'm really enthusiastic about the possibilities of lakeFS + Databricks! However, I'm still running into the same issue as before when trying to connect to my storage account with the lakefs:// file system. For some more context, I'm trying to follow this guide: https://lakefs.io/blog/databricks-lakefs-integration-tutorial/
  • I've tried with a simple single-line CSV file
  • I've added all the necessary Spark config variables to the Compute cluster's advanced options
  • My storage account has no firewall in place
  • The Databricks subnet has no firewall in place
  • I've added the latest versions of all required packages (see attached screenshot)
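For reference, the config keys I set follow the pattern from the guide; the values below are placeholders, not my real endpoint or keys:

```properties
# Spark config (cluster > Advanced options > Spark) -- placeholder values
spark.hadoop.fs.lakefs.impl        io.lakefs.LakeFSFileSystem
spark.hadoop.fs.lakefs.access.key  <my-lakefs-access-key>
spark.hadoop.fs.lakefs.secret.key  <my-lakefs-secret-key>
spark.hadoop.fs.lakefs.endpoint    https://<my-lakefs-host>/api/v1
```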
The client seems to work fine, so I don't think my credentials are the issue, but the file system keeps throwing the following error:

`Caused by: io.lakefs.hadoop.shade.api.ApiException: Content type "text/html; charset=utf-8" is not supported for type: class io.lakefs.hadoop.shade.api.model.ObjectStats`

Any ideas as to what I could try, or what might be causing this? Any help would be really appreciated!