fixed the documentation of DLT pipeline
Divyansh-db committed Jul 29, 2024
1 parent 2e499f5 commit ec3148c
Showing 3 changed files with 36 additions and 8 deletions.
26 changes: 25 additions & 1 deletion docs/resources/pipeline.md
@@ -76,13 +76,26 @@ The following arguments are supported:
* `library` blocks - Specifies pipeline code and required artifacts. The syntax resembles the [library](cluster.md#library-configuration-block) configuration block, with the addition of the special `notebook` & `file` library types that should have the `path` attribute. *Right now only the `notebook` & `file` types are supported.*
* `cluster` blocks - [Clusters](cluster.md) to run the pipeline. If none is specified, the pipeline will automatically select a default cluster configuration. *Please note that DLT pipeline clusters support only a subset of attributes, as described in the [documentation](https://docs.databricks.com/data-engineering/delta-live-tables/delta-live-tables-api-guide.html#pipelinesnewcluster).* Also note that the `autoscale` block is extended with the `mode` parameter, which controls the autoscaling algorithm (possible values are `ENHANCED` for the new, enhanced autoscaling algorithm, or `LEGACY` for the old algorithm); see the cluster sketch after this list.
* `continuous` - A flag indicating whether to run the pipeline continuously. The default value is `false`.
-* `development` - A flag indicating whether to run the pipeline in development mode. The default value is `true`.
+* `development` - A flag indicating whether to run the pipeline in development mode. The default value is `false`.
* `photon` - A flag indicating whether to use Photon engine. The default value is `false`.
* `serverless` - An optional flag indicating if serverless compute should be used for this DLT pipeline. Requires `catalog` to be set, as it can be used only with Unity Catalog.
* `catalog` - The name of the catalog in Unity Catalog. *Change of this parameter forces recreation of the pipeline.* (Conflicts with `storage`).
* `target` - The name of a database (in either the Hive metastore or in a UC catalog) for persisting pipeline output data. Configuring the target setting allows you to view and query the pipeline output data from the Databricks UI.
* `edition` - optional name of the [product edition](https://docs.databricks.com/data-engineering/delta-live-tables/delta-live-tables-concepts.html#editions). Supported values are: `CORE`, `PRO`, `ADVANCED` (default). Not required when `serverless` is set to `true`.
* `channel` - optional name of the release channel for Spark version used by DLT pipeline. Supported values are: `CURRENT` (default) and `PREVIEW`.
* `allow_duplicate_names` - Optional boolean flag. If `false`, deployment will fail if the name conflicts with that of another pipeline. The default value is `false`.
* `deployment` - Deployment type of this pipeline. Supports the following attributes:
  * `kind` - The deployment method that manages the pipeline.
  * `metadata_file_path` - The path to the file containing metadata about the deployment.
* `filters` - Filters on which Pipeline packages to include in the deployed graph. This block consists of the following attributes:
  * `include` - Paths to include.
  * `exclude` - Paths to exclude.
* `gateway_definition` - The definition of a gateway pipeline to support CDC (see the gateway example after this list). Consists of the following attributes:
  * `connection_id` - Immutable. The Unity Catalog connection this gateway pipeline uses to communicate with the source.
  * `gateway_storage_catalog` - Required, Immutable. The name of the catalog for the gateway pipeline's storage location.
  * `gateway_storage_name` - Required. The Unity Catalog-compatible naming for the gateway storage location. This is the destination to use for the data that is extracted by the gateway. The Delta Live Tables system will automatically create the storage location under the catalog and schema.
  * `gateway_storage_schema` - Required, Immutable. The name of the schema for the gateway pipeline's storage location.
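
For example, a `cluster` block that uses the enhanced autoscaling algorithm might look like the sketch below (the label and worker counts are placeholders):

```hcl
cluster {
  label = "default"

  autoscale {
    min_workers = 1
    max_workers = 5
    mode        = "ENHANCED" # or "LEGACY" for the old algorithm
  }
}
```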

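As a rough illustration of the new `gateway_definition` block (the connection ID, catalog, and schema names are placeholders, and additional arguments such as `catalog` may be required depending on your workspace setup), a CDC gateway pipeline could be declared like this:

```hcl
resource "databricks_pipeline" "cdc_gateway" {
  name = "cdc-gateway"

  gateway_definition {
    connection_id           = "<unity-catalog-connection-id>" # UC connection to the source database
    gateway_storage_catalog = "main"                          # catalog for the gateway's storage location
    gateway_storage_schema  = "cdc_staging"                   # schema for the gateway's storage location
    gateway_storage_name    = "cdc_gateway_storage"           # storage location created automatically by DLT
  }
}
```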

### notification block

@@ -95,6 +108,17 @@ DLT allows you to specify one or more notification blocks to get notifications about the following events:
* `on-update-fatal-failure` - a pipeline update fails with a non-retryable (fatal) error.
* `on-flow-failure` - a single data flow fails.
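
For instance, a `notification` block that emails an operations alias about fatal update failures and flow failures might look like the following sketch (the recipient address is a placeholder; `email_recipients` and `alerts` are assumed to be the block's attribute names):

```hcl
notification {
  email_recipients = ["ops-team@example.com"]

  alerts = [
    "on-update-fatal-failure",
    "on-flow-failure",
  ]
}
```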

### ingestion_definition block

The configuration for a managed ingestion pipeline. These settings cannot be used with the `library`, `target`, or `catalog` settings. This block consists of the following attributes:

* `connection_name` - Immutable. The Unity Catalog connection this ingestion pipeline uses to communicate with the source. Specify either `ingestion_gateway_id` or `connection_name`.
* `ingestion_gateway_id` - Immutable. Identifier for the ingestion gateway used by this ingestion pipeline to communicate with the source. Specify either `ingestion_gateway_id` or `connection_name`.
* `objects` - Required. Settings specifying tables to replicate and the destination for the replicated tables.
* `table_configuration` - Configuration settings to control the ingestion of tables. These settings are applied to all tables in the pipeline.
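
A minimal sketch of a managed ingestion pipeline is shown below. The connection, schema, and table names are placeholders, and the nested `objects` and `table_configuration` fields are illustrative assumptions modeled on the pipelines API rather than a definitive layout:

```hcl
resource "databricks_pipeline" "managed_ingestion" {
  name = "salesforce-ingestion"

  ingestion_definition {
    # Use either connection_name or ingestion_gateway_id, not both.
    connection_name = "my_salesforce_connection"

    objects {
      table {
        source_schema       = "objects"
        source_table        = "Account"
        destination_catalog = "main"
        destination_schema  = "salesforce_raw"
      }
    }

    table_configuration {
      scd_type = "SCD_TYPE_1" # applied to all tables in the pipeline
    }
  }
}
```

Because `ingestion_definition` conflicts with `library`, `target`, and `catalog`, none of those settings appear in the sketch.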



## Attribute Reference

In addition to all arguments above, the following attributes are exported:
6 changes: 0 additions & 6 deletions exporter/importables.go
@@ -32,8 +32,6 @@ import (
"github.com/databricks/terraform-provider-databricks/jobs"
"github.com/databricks/terraform-provider-databricks/mws"
"github.com/databricks/terraform-provider-databricks/permissions"

// "github.com/databricks/terraform-provider-databricks/pipelines"
"github.com/databricks/terraform-provider-databricks/repos"
tfsharing "github.com/databricks/terraform-provider-databricks/sharing"
tfsql "github.com/databricks/terraform-provider-databricks/sql"
@@ -1952,9 +1950,6 @@ var resourcesMap map[string]importable = map[string]importable{
pipelinesList, err := w.Pipelines.ListPipelinesAll(ic.Context, pipelines.ListPipelinesRequest{
MaxResults: 50,
})

// api := pipelines.NewPipelinesAPI(ic.Context, ic.Client)
// pipelinesList, err := api.List(50, "")
if err != nil {
return err
}
@@ -2026,7 +2021,6 @@ var resourcesMap map[string]importable = map[string]importable{
ID: cluster.PolicyId,
})
}
// ic.emitInitScriptsLegacy(cluster.InitScripts)
ic.emitInitScripts(cluster.InitScripts)
ic.emitSecretsFromSecretsPathMap(cluster.SparkConf)
ic.emitSecretsFromSecretsPathMap(cluster.SparkEnvVars)
12 changes: 11 additions & 1 deletion pipelines/resource_pipeline.go
@@ -47,7 +47,7 @@ func Create(w *databricks.WorkspaceClient, ctx context.Context, d *schema.Resour
if d.Get("dry_run").(bool) {
id = createdPipeline.EffectiveSettings.Id
d.SetId(id)
-return nil
+return fmt.Errorf("dry run succeeded; pipeline %s was not created", id)
} else {
id = createdPipeline.PipelineId
}
@@ -205,6 +205,11 @@ func (Pipeline) CustomizeSchema(s *common.CustomizableSchema) *common.Customizab
// ForceNew fields
s.SchemaPath("storage").SetForceNew()
s.SchemaPath("catalog").SetForceNew()
s.SchemaPath("gateway_definition", "connection_id").SetForceNew()
s.SchemaPath("gateway_definition", "gateway_storage_catalog").SetForceNew()
s.SchemaPath("gateway_definition", "gateway_storage_schema").SetForceNew()
s.SchemaPath("ingestion_definition", "connection_name").SetForceNew()
s.SchemaPath("ingestion_definition", "ingestion_gateway_id").SetForceNew()

// Computed fields
s.SchemaPath("id").SetComputed()
@@ -245,6 +250,7 @@ func (Pipeline) CustomizeSchema(s *common.CustomizableSchema) *common.Customizab
// ConflictsWith fields
s.SchemaPath("storage").SetConflictsWith([]string{"catalog"})
s.SchemaPath("catalog").SetConflictsWith([]string{"storage"})
s.SchemaPath("ingestion_definition", "connection_name").SetConflictsWith([]string{"ingestion_definition.0.ingestion_gateway_id"})

// MinItems fields
s.SchemaPath("library").SetMinItems(1)
@@ -259,6 +265,10 @@ func (Pipeline) CustomizeSchema(s *common.CustomizableSchema) *common.Customizab
s.SchemaPath("channel").SetValidateFunc(validation.StringInSlice([]string{"current", "preview"}, true))
s.SchemaPath("edition").SetValidateFunc(validation.StringInSlice([]string{"pro", "core", "advanced"}, true))

// RequiredWith fields
s.SchemaPath("gateway_definition").SetRequiredWith([]string{"gateway_definition.0.gateway_storage_name", "gateway_definition.0.gateway_storage_catalog", "gateway_definition.0.gateway_storage_schema"})
s.SchemaPath("ingestion_definition").SetRequiredWith([]string{"ingestion_definition.0.objects"})

return s
}

