
[Feature] Add databricks_alert resource to replace databricks_sql_alert #4051

Merged
merged 24 commits into main from new-alert-resource
Oct 18, 2024

Conversation

alexott
Contributor

@alexott commented Sep 27, 2024

Changes

The new resource uses the new Alerts API instead of the legacy one that will be deprecated. Since the new resource has a slightly different set of parameters, it was decided to create a new resource and deprecate the old one.

This resource uses the old TF SDK to stay compatible with the TF exporter (until #4050 is implemented).
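For context, a minimal sketch (not the resource's actual code) of what creating an alert through the new Alerts API looks like with the Go SDK. The struct and field names below are assumptions based on the generated `service/sql` package and the linked API documentation; only the update-mask field names and `sql.UpdateAlertRequestAlert` appear verbatim in this PR.

```go
package main

import (
	"context"
	"fmt"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/sql"
)

func main() {
	ctx := context.Background()
	// Assumes workspace authentication is already configured (env vars or a profile).
	w := databricks.Must(databricks.NewWorkspaceClient())

	// Unlike the legacy endpoint used by databricks_sql_alert, the new API
	// nests the alert definition under a single "alert" object.
	created, err := w.Alerts.Create(ctx, sql.CreateAlertRequest{
		Alert: &sql.CreateAlertRequestAlert{
			DisplayName: "CPU is too busy", // display_name
			QueryId:     "<query-id>",      // query_id (placeholder)
			Condition: &sql.AlertCondition{ // condition
				Op:      sql.AlertOperatorGreaterThan,
				Operand: &sql.AlertConditionOperand{Column: &sql.AlertOperandColumn{Name: "cpu_pct"}},
				Threshold: &sql.AlertConditionThreshold{
					Value: &sql.AlertOperandValue{DoubleValue: 90},
				},
			},
			SecondsToRetrigger: 600, // seconds_to_retrigger
		},
	})
	if err != nil {
		panic(err)
	}
	fmt.Println("created alert", created.Id)
}
```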

TODOs:

  • Need to discuss how to handle permissions: `sql_alert` permissions appear to work, but it's unclear whether we should continue to use that API
  • Support in the exporter will be in a separate PR

Tests

  • `make test` run locally
  • relevant change in `docs/` folder
  • covered with integration tests in `internal/acceptance`
  • relevant acceptance tests are passing
  • using Go SDK

…alert`

The new resource uses the [new Alerts API](https://docs.databricks.com/api/workspace/alerts/create)
instead of the legacy one that will be deprecated. The new resource has a slightly different set of
parameters, so it was decided to create a new resource and deprecate the old one.

This resource uses the old TF SDK to stay compatible with the TF exporter (until #4050 is implemented).
@alexott marked this pull request as ready for review October 8, 2024 10:42
@alexott requested review from a team as code owners October 8, 2024 10:42
@alexott requested review from parthban-db and mgyucht and removed the request for a team October 8, 2024 10:42
alexott added a commit that referenced this pull request Oct 14, 2024
…query`

This PR is built on top of #4051, which should be merged first.

The new resource uses the new [Queries API](https://docs.databricks.com/api/workspace/queries/create) instead of the legacy one that will be deprecated. Since the new resource has a slightly different set of parameters, it was decided to create a new resource and deprecate the old one.

This resource uses old TF SDK to be compatible with TF exporter (until #4050 is implemented).

TODOs:

- Add documentation
- Need to discuss how to handle permissions: `sql_query` permissions appear to work, but it's unclear whether we should continue to use that API
- Support in the exporter will be in a separate PR
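As an aside, the `databricks_query` counterpart follows the same shape. A rough, hypothetical sketch of a create call against the new Queries API via the Go SDK, reusing the imports and client from the alert sketch above (struct and field names are assumed, not taken from this PR):

```go
// Hypothetical sketch only: struct and field names are assumed from the
// generated service/sql package, not taken from this PR.
func createQuery(ctx context.Context, w *databricks.WorkspaceClient) (string, error) {
	q, err := w.Queries.Create(ctx, sql.CreateQueryRequest{
		Query: &sql.CreateQueryRequestQuery{
			DisplayName: "Hourly CPU report",
			WarehouseId: "<warehouse-id>", // placeholder
			QueryText:   "SELECT avg(cpu_pct) FROM metrics",
		},
	})
	if err != nil {
		return "", err
	}
	return q.Id, nil
}
```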
Contributor
@mgyucht left a comment

A few comments, but otherwise mostly LGTM. Did you try to use the plugin framework, or was it just easier to implement this as an SDK resource because the original resource used the SDK?

Review threads on: docs/resources/alert.md, internal/acceptance/alert_test.go, sql/resource_alert.go
alexott and others added 2 commits October 14, 2024 17:29
Co-authored-by: Miles Yucht <miles@databricks.com>
@alexott requested a review from mgyucht October 15, 2024 06:47
Contributor
@mgyucht left a comment

LGTM, one small question about the update mask, but otherwise it's great.

sql/resource_alert.go:

```go
		return err
	}
	var a sql.UpdateAlertRequestAlert
	updateMask := "display_name,query_id,seconds_to_retrigger,condition,custom_body,custom_subject"
```
Contributor

Do we use updateMask anywhere else in the TF provider? I wonder if we should always set it for all fields, even if a field hasn't changed, since the `*schema.ResourceData` has the desired values for all of those fields. Mainly asking from the perspective of whether we need logic around setting update masks or whether we can avoid it to keep things simpler.

Contributor Author

That's a good question - we have updateMask in the settings resources and in the new alerts/queries resources.
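For reference, a minimal sketch of how the update path around the quoted snippet presumably wires the mask into the SDK call. Apart from `sql.UpdateAlertRequestAlert` and the mask string, which are visible in the diff, the request shape and surrounding wiring are assumptions:

```go
// Sketch of the update path; only sql.UpdateAlertRequestAlert and the mask
// string come from the diff itself, the rest is assumed wiring.
func updateAlert(ctx context.Context, w *databricks.WorkspaceClient, id string, a *sql.UpdateAlertRequestAlert) error {
	// A static mask listing every mutable field means the request always carries
	// the full desired state from *schema.ResourceData, so no per-field change
	// detection is needed, which is the simplification discussed above.
	updateMask := "display_name,query_id,seconds_to_retrigger,condition,custom_body,custom_subject"
	_, err := w.Alerts.Update(ctx, sql.UpdateAlertRequest{
		Id:         id,
		UpdateMask: updateMask,
		Alert:      a,
	})
	return err
}
```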

@alexott added this pull request to the merge queue Oct 18, 2024
Merged via the queue into main with commit b9fb47c Oct 18, 2024
7 checks passed
@alexott deleted the new-alert-resource branch October 18, 2024 08:46
alexott added a commit that referenced this pull request Oct 18, 2024
…query`

This PR is built on top of #4051, which should be merged first.

The new resource uses the new [Queries API](https://docs.databricks.com/api/workspace/queries/create) instead of the legacy one that will be deprecated. Since the new resource has a slightly different set of parameters, it was decided to create a new resource and deprecate the old one.

This resource uses old TF SDK to be compatible with TF exporter (until #4050 is implemented).

TODOs:

- Add documentation
- Need to discuss how to handle permissions: `sql_query` permissions appear to work, but it's unclear whether we should continue to use that API
- Support in the exporter will be in a separate PR
alexott added a commit that referenced this pull request Oct 19, 2024
…query`

This PR is built on top of #4051, which should be merged first.

The new resource uses the new [Queries API](https://docs.databricks.com/api/workspace/queries/create) instead of the legacy one that will be deprecated. Since the new resource has a slightly different set of parameters, it was decided to create a new resource and deprecate the old one.

This resource uses old TF SDK to be compatible with TF exporter (until #4050 is implemented).

TODOs:

- Add documentation
- Need to discuss how to handle permissions: `sql_query` permissions appear to work, but it's unclear whether we should continue to use that API
- Support in the exporter will be in a separate PR
rauchy added a commit that referenced this pull request Oct 23, 2024
### New Features and Improvements

 * Add `databricks_alert` resource to replace `databricks_sql_alert` ([#4051](#4051)).
 * Added resource `databricks_custom_app_integration` ([#4124](#4124)).
 * Handle `schema` attribute in `databricks_pipeline` ([#4137](#4137)).

### Bug Fixes

 * Change repo used in test ([#4122](#4122)).

### Documentation

 * Clarify that `graviton` option of `databricks_node_type` could be used on Azure ([#4125](#4125)).
 * Fix argument in example for `databricks_custom_app_integration` ([#4132](#4132)).
 * Fix for UC on AWS guide - use `databricks_aws_unity_catalog_assume_role_policy` where necessary ([#4109](#4109)).

### Exporter

 * **Breaking change**: Move `databricks_workspace_file` to a separate service ([#4118](#4118)).
 * Exclude some system schemas from export ([#4121](#4121)).
 * Use `List` + iteration instead of call to `ListAll` ([#4123](#4123)).
github-merge-queue bot pushed a commit that referenced this pull request Oct 23, 2024
…query` (#4103)

## Changes

This PR is built on top of #4051 which should be merged first.

The new resource uses the new [Queries
API](https://docs.databricks.com/api/workspace/queries/create) instead
of the legacy one that will be deprecated. Since the new resource has a
slightly different set of parameters, it was decided to create a new
resource and deprecate the old one.

This resource uses old TF SDK to be compatible with TF exporter (until
#4050 is implemented).

TODOs:

- Need to discuss how to handle permissions: `sql_query` permissions appear to work, but it's unclear whether we should continue to use that API
- Support in the exporter will be in a separate PR


## Tests

- [x] `make test` run locally
- [x] relevant change in `docs/` folder
- [x] covered with integration tests in `internal/acceptance`
- [x] relevant acceptance tests are passing
- [x] using Go SDK
rauchy added a commit that referenced this pull request Oct 24, 2024
### New Features and Improvements

 * Add `databricks_alert` resource to replace `databricks_sql_alert` ([#4051](#4051)).
 * Add `databricks_query` resource instead of `databricks_sql_query` ([#4103](#4103)).
 * Added resource `databricks_custom_app_integration` ([#4124](#4124)).
 * Handle `schema` attribute in `databricks_pipeline` ([#4137](#4137)).

### Bug Fixes

 * Change repo used in test ([#4122](#4122)).

### Documentation

 * Clarify that `graviton` option of `databricks_node_type` could be used on Azure ([#4125](#4125)).
 * Fix argument in example for `databricks_custom_app_integration` ([#4132](#4132)).
 * Fix for UC on AWS guide - use `databricks_aws_unity_catalog_assume_role_policy` where necessary ([#4109](#4109)).

### Exporter

 * **Breaking change**: Move `databricks_workspace_file` to a separate service ([#4118](#4118)).
 * Exclude some system schemas from export ([#4121](#4121)).
 * Use `List` + iteration instead of call to `ListAll` ([#4123](#4123)).
github-merge-queue bot pushed a commit that referenced this pull request Oct 24, 2024
### New Features and Improvements

 * Add `databricks_alert` resource to replace `databricks_sql_alert` ([#4051](#4051)).
 * Add `databricks_query` resource instead of `databricks_sql_query` ([#4103](#4103)).
 * Added resource `databricks_custom_app_integration` ([#4124](#4124)).
 * Handle `schema` attribute in `databricks_pipeline` ([#4137](#4137)).

### Bug Fixes

 * Change repo used in test ([#4122](#4122)).

### Documentation

 * Clarify that `graviton` option of `databricks_node_type` could be used on Azure ([#4125](#4125)).
 * Fix argument in example for `databricks_custom_app_integration` ([#4132](#4132)).
 * Fix for UC on AWS guide - use `databricks_aws_unity_catalog_assume_role_policy` where necessary ([#4109](#4109)).

### Exporter

 * **Breaking change**: Move `databricks_workspace_file` to a separate service ([#4118](#4118)).
 * Exclude some system schemas from export ([#4121](#4121)).
 * Use `List` + iteration instead of call to `ListAll` ([#4123](#4123)).

Co-authored-by: Omer Lachish <rauchy@users.noreply.github.com>