[Feature] Add databricks_alert resource to replace databricks_sql_alert #4051

Merged: 24 commits, Oct 18, 2024
Changes from 17 commits (24 commits total)

Commits:
355a791
[Feature] Add `databricks_alert` resource to replace `databricks_sql_…
alexott Sep 27, 2024
5910dfb
Started to add unit tests
alexott Oct 3, 2024
12d5984
read and delete tests
alexott Oct 3, 2024
c9c9822
finish with the unit tests
alexott Oct 4, 2024
1ed658e
Add deprecation message to sql alert resource
alexott Oct 4, 2024
8b8e71c
Added integration test
alexott Oct 4, 2024
2ab3db1
Fix integration test template
alexott Oct 4, 2024
17dd2cc
Merge branch 'main' into new-alert-resource
alexott Oct 8, 2024
1a0ecf1
Add suppress diff for workspace files
alexott Oct 8, 2024
06bf71c
Add docs
alexott Oct 8, 2024
eabde7a
Tune schema customization
alexott Oct 8, 2024
03243fc
Add setting the owner on creation
alexott Oct 8, 2024
85d0c09
Add another custom diff for `parent_path`
alexott Oct 8, 2024
f42bc9b
Add migration guide
alexott Oct 9, 2024
81af2f8
Add an example to the migration guide
alexott Oct 9, 2024
6688eee
Merge branch 'main' into new-alert-resource
alexott Oct 14, 2024
9b4fc63
Add an example of permissions for alerts
alexott Oct 14, 2024
a6ee435
Update docs/resources/alert.md
alexott Oct 14, 2024
a9a747b
Start to address review comments
alexott Oct 14, 2024
0aebe5a
Update migration instructions for TF 1.7+
alexott Oct 15, 2024
7f3e132
reimplement permissions test
alexott Oct 15, 2024
0df79b2
Fix integration test
alexott Oct 15, 2024
2b9b5e0
Merge branch 'main' into new-alert-resource
alexott Oct 16, 2024
7147e71
Add support for `notify_on_ok` field
alexott Oct 16, 2024
9 changes: 8 additions & 1 deletion common/resource.go
@@ -443,13 +443,20 @@ func genericDatabricksData[T, P, C any](
// WorkspacePathPrefixDiffSuppress suppresses diffs for workspace paths where both sides
// may or may not include the `/Workspace` prefix.
//
// This is the case for dashboards where at create time, the user may include the `/Workspace`
// This is the case for dashboards, alerts and queries where at create time, the user may include the `/Workspace`
// prefix for the `parent_path` field, but the read response will not include the prefix.
func WorkspacePathPrefixDiffSuppress(k, old, new string, d *schema.ResourceData) bool {
const prefix = "/Workspace"
return strings.TrimPrefix(old, prefix) == strings.TrimPrefix(new, prefix)
}

// WorkspaceOrEmptyPathPrefixDiffSuppress is similar to WorkspacePathPrefixDiffSuppress but also suppresses diffs
// when the new value is empty (not specified by the user).
func WorkspaceOrEmptyPathPrefixDiffSuppress(k, old, new string, d *schema.ResourceData) bool {
const prefix = "/Workspace"
return (old != "" && new == "") || strings.TrimPrefix(old, prefix) == strings.TrimPrefix(new, prefix)
}

func EqualFoldDiffSuppress(k, old, new string, d *schema.ResourceData) bool {
if strings.EqualFold(old, new) {
log.Printf("[INFO] Suppressing diff on %s", k)
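For context, the new suppress function targets configurations like the following sketch (the directory path and query reference are hypothetical): the user writes a `/Workspace`-prefixed `parent_path`, the read response returns the path without the prefix, and without the suppression every plan would show a spurious change.

```hcl
resource "databricks_alert" "example" {
  query_id     = databricks_sql_query.this.id
  display_name = "Example alert"

  # Written with the /Workspace prefix; the API returns "/Shared/Alerts",
  # so the diff is suppressed instead of showing a perpetual change.
  parent_path = "/Workspace/Shared/Alerts"

  condition {
    # IS_NULL does not require a threshold block.
    op = "IS_NULL"
    operand {
      column {
        name = "value"
      }
    }
  }
}
```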
9 changes: 9 additions & 0 deletions common/resource_test.go
@@ -187,6 +187,15 @@ func TestWorkspacePathPrefixDiffSuppress(t *testing.T) {
assert.False(t, WorkspacePathPrefixDiffSuppress("k", "/Workspace/1", "/Workspace/2", nil))
}

func TestWorkspaceOrEmptyPathPrefixDiffSuppress(t *testing.T) {
assert.True(t, WorkspaceOrEmptyPathPrefixDiffSuppress("k", "/Workspace/foo/bar", "/Workspace/foo/bar", nil))
assert.True(t, WorkspaceOrEmptyPathPrefixDiffSuppress("k", "/Workspace/foo/bar", "/foo/bar", nil))
assert.True(t, WorkspaceOrEmptyPathPrefixDiffSuppress("k", "/foo/bar", "/Workspace/foo/bar", nil))
assert.True(t, WorkspaceOrEmptyPathPrefixDiffSuppress("k", "/foo/bar", "/foo/bar", nil))
assert.True(t, WorkspaceOrEmptyPathPrefixDiffSuppress("k", "/foo/bar", "", nil))
assert.False(t, WorkspaceOrEmptyPathPrefixDiffSuppress("k", "/Workspace/1", "/Workspace/2", nil))
}

func TestEqualFoldDiffSuppress(t *testing.T) {
assert.True(t, EqualFoldDiffSuppress("k", "A", "a", nil))
assert.False(t, EqualFoldDiffSuppress("k", "A", "A2", nil))
162 changes: 162 additions & 0 deletions docs/resources/alert.md
@@ -0,0 +1,162 @@
---
subcategory: "Databricks SQL"
---
# databricks_alert Resource

This resource allows you to manage [Databricks SQL Alerts](https://docs.databricks.com/en/sql/user/alerts/index.html). It supersedes the [databricks_sql_alert](sql_alert.md) resource; see the migration guide below for more details.

## Example Usage

```hcl
resource "databricks_directory" "shared_dir" {
path = "/Shared/Queries"
}

# This will be replaced with the new databricks_query resource
resource "databricks_sql_query" "this" {
data_source_id = databricks_sql_endpoint.example.data_source_id
name = "My Query Name"
query = "SELECT 42 as value"
parent = "folders/${databricks_directory.shared_dir.object_id}"
}

resource "databricks_alert" "alert" {
query_id = databricks_sql_query.this.id
display_name = "TF new alert"
parent_path = databricks_directory.shared_dir.path
condition {
op = "GREATER_THAN"
operand {
column {
name = "value"
}
}
threshold {
value {
double_value = 42
}
}
}
}
```

## Argument Reference

The following arguments are available:

* `query_id` - (Required, String) ID of the query evaluated by the alert.
* `display_name` - (Required, String) Name of the alert.
* `condition` - (Required) Trigger conditions of the alert. Block consists of the following attributes:
  * `op` - (Required, String Enum) Operator used for comparison in alert evaluation. (Enum: `GREATER_THAN`, `GREATER_THAN_OR_EQUAL`, `LESS_THAN`, `LESS_THAN_OR_EQUAL`, `EQUAL`, `NOT_EQUAL`, `IS_NULL`)
  * `operand` - (Required, Block) Name of the column from the query result to use for comparison in alert evaluation:
    * `column` - (Required, Block) Block describing the column from the query result to use for comparison in alert evaluation:
      * `name` - (Required, String) Name of the column.
  * `threshold` - (Optional for the `IS_NULL` operation, Block) Threshold value used for comparison in alert evaluation:
    * `value` - (Required, Block) Actual value used in the comparison; exactly one of the following attributes is required (a string-comparison sketch follows this list):
      * `string_value` - String value to compare against string results.
      * `double_value` - Double value to compare against integer and double results.
      * `bool_value` - Boolean value (`true` or `false`) to compare against boolean results.
  * `empty_result_state` - (Optional, String Enum) Alert state if the result is empty (`UNKNOWN`, `OK`, `TRIGGERED`).
* `custom_subject` - (Optional, String) Custom subject of the alert notification, if it exists. This includes the email subject, Slack notification header, etc. See [Alerts API reference](https://docs.databricks.com/en/sql/user/alerts/index.html) for custom templating instructions.
* `custom_body` - (Optional, String) Custom body of the alert notification, if it exists. See [Alerts API reference](https://docs.databricks.com/en/sql/user/alerts/index.html) for custom templating instructions.
* `parent_path` - (Optional, String) The path to a workspace folder containing the alert. The default is the user's home folder. If changed, the alert will be recreated.
* `seconds_to_retrigger` - (Optional, Integer) Number of seconds an alert must wait after being triggered before it can rearm itself. After rearming, it can be triggered again. If 0 or not specified, the alert will not be triggered again.
* `owner_user_name` - (Optional, String) Alert owner's username.
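
For illustration, a condition comparing a string column, with a `string_value` threshold and an explicit `empty_result_state`, might look like this sketch (the query reference and column name are hypothetical):

```hcl
resource "databricks_alert" "status_alert" {
  query_id     = databricks_sql_query.status.id
  display_name = "Status alert"

  condition {
    op = "EQUAL"
    operand {
      column {
        name = "status"
      }
    }
    threshold {
      value {
        # Compared against string results of the query.
        string_value = "FAILED"
      }
    }
    # Treat an empty result as OK rather than UNKNOWN.
    empty_result_state = "OK"
  }
}
```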

## Attribute Reference

In addition to all the arguments above, the following attributes are exported:

* `id` - unique ID of the Alert.
* `lifecycle_state` - The workspace state of the alert. Used for tracking trashed status. (Possible values are `ACTIVE` or `TRASHED`).
* `state` - Current state of the alert's trigger status (`UNKNOWN`, `OK`, `TRIGGERED`). This field is set to `UNKNOWN` if the alert has not yet been evaluated or encountered an error during its last evaluation.
* `create_time` - The timestamp string indicating when the alert was created.
* `update_time` - The timestamp string indicating when the alert was updated.
* `trigger_time` - The timestamp string indicating when the alert was last triggered, if the alert has been triggered before.

## Migrating from `databricks_sql_alert` resource

Under the hood, the new resource uses the same data as `databricks_sql_alert`, but exposed via a different API. This means that existing alerts can be migrated without recreating them. The migration is done in a few steps:

* Record the ID of the existing `databricks_sql_alert`, for example by executing the `terraform state show databricks_sql_alert.alert` command.
* Create the code for the new implementation, performing the following changes:
  * the `name` attribute is now named `display_name`
  * the `parent` attribute (if present) is renamed to `parent_path`, and its value should be converted from `folders/object_id` to the actual path.
  * the `options` block is converted into the `condition` block with the following changes:
    * the value of the `op` attribute should be converted from a mathematical operator to a string name, e.g. `>` becomes `GREATER_THAN`, `==` becomes `EQUAL`, etc.
    * the `column` attribute becomes the `operand` block
    * the `value` attribute becomes the `threshold` block. **Please note that the old implementation always used strings, so you may see changes after import if you use `double_value` or `bool_value` inside the block.**
  * the `rearm` attribute is renamed to `seconds_to_retrigger`.

For example, if we have the original `databricks_sql_alert` defined as:

```hcl
resource "databricks_sql_alert" "alert" {
query_id = databricks_sql_query.this.id
name = "My Alert"
parent = "folders/${databricks_directory.shared_dir.object_id}"
options {
column = "value"
op = ">"
value = "42"
muted = false
}
}
```

we'll have a new resource defined as:

```hcl
resource "databricks_alert" "alert" {
query_id = databricks_sql_query.this.id
display_name = "My Alert"
parent_path = databricks_directory.shared_dir.path
condition {
op = "GREATER_THAN"
operand {
column {
name = "value"
}
}
threshold {
value {
double_value = 42
}
}
}
}
```

* Remove the old resource from the state with the `terraform state rm databricks_sql_alert.alert` command.
* Import the new resource with the `terraform import databricks_alert.alert <alert-id>` command (on Terraform 1.7+ both steps can be done in configuration instead; see the sketch after this list).
* Run the `terraform plan` command to check for possible changes, such as value type changes.
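
As an alternative to the CLI commands above, Terraform 1.7+ supports expressing the move declaratively with `removed` and `import` blocks; a minimal sketch, with the resource addresses and alert ID as placeholders:

```hcl
# Drop the old resource from state without destroying the underlying alert.
removed {
  from = databricks_sql_alert.alert

  lifecycle {
    destroy = false
  }
}

# Import the same alert into the new resource type.
import {
  to = databricks_alert.alert
  id = "<alert-id>"
}
```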

## Access Control

[databricks_permissions](permissions.md#sql-alert-usage) can control which groups or individual users can *Manage*, *Edit*, *Run* or *View* individual alerts.

```hcl
resource "databricks_permissions" "alert_usage" {
sql_alert_id = databricks_alert.alert.id
access_control {
group_name = "users"
permission_level = "CAN_RUN"
}
}
```

## Import

This resource can be imported using alert ID:

```bash
terraform import databricks_alert.this <alert-id>
```

## Related Resources

The following resources are often used in the same context:

* [databricks_sql_query](sql_query.md) to manage Databricks SQL [Queries](https://docs.databricks.com/sql/user/queries/index.html).
* [databricks_sql_endpoint](sql_endpoint.md) to manage Databricks SQL [Endpoints](https://docs.databricks.com/sql/admin/sql-endpoints.html).
* [databricks_directory](directory.md) to manage directories in [Databricks Workspace](https://docs.databricks.com/workspace/workspace-objects.html).
12 changes: 12 additions & 0 deletions docs/resources/sql_alert.md
@@ -58,6 +58,18 @@ In addition to all arguments above, the following attributes are exported:

* `id` - unique ID of the SQL Alert.

## Access Control

[databricks_permissions](permissions.md#sql-alert-usage) can control which groups or individual users can *Manage*, *Edit*, *Run* or *View* individual alerts.

## Import

This resource can be imported using alert ID:

```bash
terraform import databricks_sql_alert.this <alert-id>
```

## Related Resources

The following resources are often used in the same context:
76 changes: 76 additions & 0 deletions internal/acceptance/alert_test.go
@@ -0,0 +1,76 @@
package acceptance

import (
"testing"
)

func TestAccAlert(t *testing.T) {
WorkspaceLevel(t, Step{
Template: `
resource "databricks_sql_query" "this" {
data_source_id = "{env.TEST_DEFAULT_WAREHOUSE_DATASOURCE_ID}"
name = "tf-{var.RANDOM}"
query = "SELECT 1 AS p1, 2 as p2"
}

resource "databricks_permissions" "alert_usage" {
sql_alert_id = databricks_alert.alert.id
access_control {
group_name = "users"
permission_level = "CAN_RUN"
}
}

resource "databricks_alert" "alert" {
query_id = databricks_sql_query.this.id
display_name = "tf-alert-{var.RANDOM}"
condition {
op = "EQUAL"
operand {
column {
name = "p2"
}
}
threshold {
value {
double_value = 2
}
}
}
}
`,
}, Step{
Template: `
resource "databricks_sql_query" "this" {
data_source_id = "{env.TEST_DEFAULT_WAREHOUSE_DATASOURCE_ID}"
name = "tf-{var.RANDOM}"
query = "SELECT 1 AS p1, 2 as p2"
}

resource "databricks_permissions" "alert_usage" {
sql_alert_id = databricks_alert.alert.id
access_control {
group_name = "users"
permission_level = "CAN_RUN"
}
}

resource "databricks_alert" "alert" {
query_id = databricks_sql_query.this.id
display_name = "tf-alert-{var.RANDOM}"
condition {
op = "GREATER_THAN"
operand {
column {
name = "p2"
}
}
threshold {
value {
double_value = 3
}
}
}
}`,
})
}
2 changes: 1 addition & 1 deletion internal/acceptance/sql_alert_test.go
@@ -4,7 +4,7 @@ import (
"testing"
)

func TestAccAlert(t *testing.T) {
func TestAccSqlAlert(t *testing.T) {
WorkspaceLevel(t, Step{
Template: `
resource "databricks_sql_query" "this" {
1 change: 1 addition & 0 deletions internal/providers/sdkv2/sdkv2.go
@@ -128,6 +128,7 @@ func DatabricksProvider() *schema.Provider {
},
ResourcesMap: map[string]*schema.Resource{ // must be in alphabetical order
"databricks_access_control_rule_set": permissions.ResourceAccessControlRuleSet().ToResource(),
"databricks_alert": sql.ResourceAlert().ToResource(),
"databricks_artifact_allowlist": catalog.ResourceArtifactAllowlist().ToResource(),
"databricks_aws_s3_mount": storage.ResourceAWSS3Mount().ToResource(),
"databricks_azure_adls_gen1_mount": storage.ResourceAzureAdlsGen1Mount().ToResource(),