# help
Hi, I'm new to lakeFS. I'm trying to run lakeFS locally and use Spark to write an Iceberg table, but the write fails with:
```
exception org.apache.spark.sql.connector.catalog.CatalogNotFoundException: Catalog 'lakefs' plugin class not found: spark.sql.catalog.lakefs is not defined
        at org.apache.spark.sql.errors.QueryExecutionErrors$.catalogPluginClassNotFoundError(QueryExecutionErrors.scala:1904)
```
Any suggestions? Thanks!
My Spark conf:
```python
# Packages: Iceberg Spark runtime, lakeFS Iceberg catalog, S3A support, lakeFS Hadoop FS
conf.set("spark.jars.packages", "org.apache.iceberg:iceberg-spark-runtime-3.3_2.12:1.3.0,io.lakefs:lakefs-iceberg:v0.1.2,org.apache.hadoop:hadoop-aws:3.3.3,io.lakefs:hadoop-lakefs-assembly:0.1.13")
# lakeFS Iceberg catalog
conf.set("spark.sql.catalog.lakefs", "org.apache.iceberg.spark.SparkCatalog")
conf.set("spark.sql.catalog.lakefs.catalog-impl", "io.lakefs.iceberg.LakeFSCatalog")
conf.set("spark.sql.catalog.lakefs.warehouse", "lakefs://quickstart")
conf.set("spark.sql.catalog.lakefs.uri", "http://127.0.0.1:8000")
conf.set("spark.sql.catalog.lakefs.cache-enabled", "false")
conf.set("spark.sql.defaultCatalog", "lakefs")
# S3A pointed at the local lakeFS S3 gateway
conf.set("spark.hadoop.fs.s3.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
conf.set("spark.hadoop.fs.s3a.endpoint", "http://127.0.0.1:8000")
conf.set("spark.hadoop.fs.s3a.access.key", "xxx")
conf.set("spark.hadoop.fs.s3a.secret.key", "xxx")
conf.set("spark.hadoop.fs.s3a.path.style.access", "true")
```