From 9f8f5b3bb3c3d8c55470116317898a9d7ef645fa Mon Sep 17 00:00:00 2001
From: Jonny Dixon
Date: Wed, 6 Nov 2024 09:08:39 +0000
Subject: [PATCH] Update README.md

---
 metadata-integration/java/spark-lineage-legacy/README.md | 7 ++++---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/metadata-integration/java/spark-lineage-legacy/README.md b/metadata-integration/java/spark-lineage-legacy/README.md
index d0120d4c6a3f1..c163fe485c74c 100644
--- a/metadata-integration/java/spark-lineage-legacy/README.md
+++ b/metadata-integration/java/spark-lineage-legacy/README.md
@@ -96,9 +96,10 @@ The Spark agent can be configured using Databricks Cluster [Spark configuration]
 - Open Databricks Cluster configuration page. Click the **Advanced Options** toggle. Click the **Spark** tab. Add below configurations under `Spark Config`.
 
   ```text
-  spark.extraListeners datahub.spark.DatahubSparkListener
-  spark.datahub.rest.server http://localhost:8080
-  spark.datahub.databricks.cluster cluster-name
+  spark.extraListeners datahub.spark.DatahubSparkListener
+  spark.datahub.rest.server http://localhost:8080
+  spark.datahub.stage_metadata_coalescing true
+  spark.datahub.databricks.cluster cluster-name
   ```
 
 - Click the **Init Scripts** tab. Set cluster init script as `dbfs:/datahub/init.sh`.
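For reference, the same settings the patch documents can also be supplied when building a Spark session directly, rather than through the Databricks cluster Spark Config UI. A minimal PySpark sketch, assuming the legacy DataHub Spark agent jar `io.acryl:datahub-spark-lineage` is resolvable from your package repository (the version below is a placeholder):

```python
from pyspark.sql import SparkSession

# Minimal sketch: the configuration from the patched README applied to a
# plain SparkSession instead of the Databricks cluster Spark Config UI.
spark = (
    SparkSession.builder
    .appName("datahub-lineage-example")
    # Pull in the legacy DataHub Spark lineage agent (version is a placeholder).
    .config("spark.jars.packages", "io.acryl:datahub-spark-lineage:0.8.23")
    # Register the DataHub listener, as in the cluster Spark Config above.
    .config("spark.extraListeners", "datahub.spark.DatahubSparkListener")
    # DataHub GMS endpoint that lineage events are emitted to.
    .config("spark.datahub.rest.server", "http://localhost:8080")
    # The setting this patch adds to the documented configuration.
    .config("spark.datahub.stage_metadata_coalescing", "true")
    .getOrCreate()
)
```

The Databricks-specific `spark.datahub.databricks.cluster` key is omitted here, since it only applies when the listener runs on a Databricks cluster.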