[Internal] Rebase and Fixed merge conflicts with main branch. Also fixed notification_destination acceptance tests. #3892

Merged: 99 commits, Aug 15, 2024

Commits
483eb99
added isolation mode support for `databricks_external_location` & `da…
nkvuong Jun 28, 2024
03c71d0
Add terraform support for periodic triggers (#3700)
alexmos-db Jun 28, 2024
f42e1fb
Release v1.48.2 (#3722)
hectorcast-db Jun 28, 2024
d669d7a
remove references to basic auth (#3720)
nkvuong Jun 30, 2024
0c252d4
Fix invalid priviledges in grants.md (#3716)
rayalex Jun 30, 2024
1ba1772
Bump github.com/hashicorp/hcl/v2 from 2.20.1 to 2.21.0 (#3684)
dependabot[bot] Jul 1, 2024
df210b2
Refactored `databricks_cluster(s)` data sources to Go SDK (#3685)
nkvuong Jul 1, 2024
fc889cc
Renamed `databricks_catalog_workspace_binding` to `databricks_workspa…
nkvuong Jul 2, 2024
c6f949c
Exporter: fix generation of `run_as` blocks in `databricks_job` (#3724)
alexott Jul 2, 2024
ff837ab
Adds `databricks_volume` as data source (#3211)
karolusz Jul 3, 2024
0d943ea
Make the schedule.pause_status field read-only (#3692)
touchida Jul 3, 2024
75236a6
Added support for binding storage credentials and external locations …
nkvuong Jul 3, 2024
411f85c
Exporter: use Go SDK structs for `databricks_job` resource (#3727)
alexott Jul 4, 2024
e864065
Change TF registry ownership (#3736)
mgyucht Jul 4, 2024
701b5e5
Run goreleaser action in snapshot mode from merge queue (#3646)
pietern Jul 4, 2024
61b17e7
Bump golang.org/x/mod from 0.18.0 to 0.19.0 (#3739)
dependabot[bot] Jul 5, 2024
ac06fdb
Update cluster.md: add data_security_mode parameters `NONE` and `NO_I…
mkubicek Jul 5, 2024
d24adbd
Add `databricks_schema` data source (#3732)
alexott Jul 8, 2024
a55dc7f
Exporter: export libraries specified as `requirements.txt` (#3649)
alexott Jul 9, 2024
9c9bf2b
Exporter: Emit directories during the listing only if they are explic…
alexott Jul 9, 2024
ccad28f
Add new APIErrorBody struct and update deps (#3745)
renaudhartert-db Jul 9, 2024
77fc0b4
Upgrade databricks-sdk-go (#3743)
tanmay-db Jul 9, 2024
c16345e
[Internal] Improve Changelog by grouping changes (#3747)
hectorcast-db Jul 9, 2024
47b84b5
[Internal] Add Release tag (#3748)
hectorcast-db Jul 9, 2024
364f9ec
[Internal] Upgrade Go SDK to v0.43.2 (#3750)
tanmay-db Jul 9, 2024
8a7a8f0
fixed broken links in documentation (#3746)
nkvuong Jul 9, 2024
3010734
[Feature] Lakeview dashboard resource created (#3729)
Divyansh-db Jul 10, 2024
790bccc
[Exporter] clarify use of `-listing` and `-services` options (#3755)
alexott Jul 11, 2024
751719c
fix clustesr (#3760)
mgyucht Jul 11, 2024
4cd76c1
[Fix] Divyansh db/divyansh lakeview dashboard (#3763)
Divyansh-db Jul 12, 2024
9c54c38
[Fix] Tolerate OAuth errors in `databricks_mws_workspaces` when manag…
mgyucht Jul 12, 2024
e2d29e8
[Doc] Update resources diagram (#3765)
alexott Jul 15, 2024
c25414d
[Feature] Permissions for `databricks_dashboard` resource (#3762)
alexott Jul 15, 2024
6c35500
[Internal] clear stale go.sum values (#3768)
tanmay-db Jul 15, 2024
cbbac6b
Add "Owner" tag to test cluster in acceptance test (#3771)
pietern Jul 15, 2024
71e5492
[Internal] Updated Changelog for Release v1.48.3 (#3770)
tanmay-db Jul 15, 2024
c23d713
Fix integration test for restrict workspace admins setting (#3772)
pietern Jul 15, 2024
f970829
Add "Owner" tag to test SQL endpoint in acceptance test (#3774)
pietern Jul 16, 2024
b571c55
[Internal] Move PR message validation to a separate workflow (#3777)
hectorcast-db Jul 16, 2024
9e8fd30
[Exporter] Improve code generation for SQL Endpoints (#3764)
alexott Jul 16, 2024
65d1570
[Internal] Trigger the validate workflow in the merge queue (#3782)
hectorcast-db Jul 17, 2024
7c00820
[Fix] Update properties for managed SQL table on latest DBR (#3784)
pietern Jul 17, 2024
6bbaa21
[Fix] Add "Owner" tag to test SQL endpoint in acceptance test (#3785)
pietern Jul 17, 2024
abdb417
[Fix] Ignore managed property for liquid clustering integration test …
pietern Jul 17, 2024
165cda6
[Internal] Fix processing of `quoted` titles (#3790)
hectorcast-db Jul 18, 2024
ab4fd40
[Fix] Fix model serving resource (#3690)
arpitjasa-db Jul 19, 2024
86e0e35
[Doc] Improve docs for Network Connectivity Config (#3794)
alexott Jul 19, 2024
184a92d
[Doc] Document `serverless` flag in `databricks_pipeline` (#3797)
alexott Jul 19, 2024
af09787
[Exporter] Fix to support Serverless DLT (#3796)
alexott Jul 19, 2024
5d3e362
[Doc] Add description of `environment` block to `databricks_job` (#3798)
alexott Jul 19, 2024
733c998
[Release] v1.49.0 (#3752)
tanmay-db Jul 19, 2024
0e32851
[Exporter] Add support for exporting of Lakeview dashboards (#3779)
alexott Jul 20, 2024
d58f89f
[Fix] don't update `databricks_metastore` during creation if not requ…
nkvuong Jul 20, 2024
b138c0b
[Doc] Use correct names for isolation mode for storage credentials an…
alexott Jul 23, 2024
34bac74
[Internal] Refactored `databricks_zones` and `databricks_spark_versio…
nkvuong Jul 23, 2024
8fb39fb
[Doc] Clarified schedule block in `databricks_job` (#3805)
nkvuong Jul 23, 2024
165d129
[Fix] Fixed reading of permissions for SQL objects (#3800)
alexott Jul 23, 2024
324ac5a
[Doc] Fix incomplete note in `databricks_workspace_binding` resource …
alexott Jul 24, 2024
0a4a5be
[Exporter] Adding more retries for SCIM API calls (#3807)
alexott Jul 24, 2024
b3cea48
[Release] Release v1.49.1 (#3810)
tanmay-db Jul 24, 2024
01be651
[Internal] Update Go SDK (#3808)
hectorcast-db Jul 24, 2024
0111483
[Doc] Document missing task attributes in `databricks_job` resource (…
alexott Jul 26, 2024
39f96b2
[Doc] Add troubleshooting instructions for `databricks OAuth is not s…
alexott Jul 26, 2024
06c761a
[Exporter] Generate `databricks_workspace_binding` instead of legacy …
alexott Jul 26, 2024
c1ad31c
[Fix] Update Go SDK (#3826)
Divyansh-db Jul 26, 2024
a75696a
[Exporter] Add support for `databricks_online_table` (#3816)
alexott Jul 26, 2024
1df8285
[Exporter] Improve exporting of `databricks_model_serving` (#3821)
alexott Jul 26, 2024
01854e6
[Fix] reading `databricks_metastore_assignment` when importing resour…
alexott Jul 27, 2024
90a113f
[Feature] Mark attributes as sensitive in `databricks_mlflow_webhook`…
alexott Jul 27, 2024
ce00b97
[Fix] Corrected kms arn format in `data_aws_unity_catalog_policy` (#3…
nkvuong Jul 27, 2024
5bc75c6
[Feature] Add `active` attribute to `databricks_user` data source (#3…
840 Jul 28, 2024
9539a31
[Internal] Refactored `client.ClientForHost` to use Go SDK method (#3…
nkvuong Jul 28, 2024
7825dd7
[Fix] cluster key update for `databricks_sql_table` should not force …
nkvuong Jul 28, 2024
1a309c8
[Doc] clarified `spot_bid_max_price` option for `databricks_cluster` …
nkvuong Jul 29, 2024
4a5e52c
[Feature] added support for `cloudflare_api_token` in `databricks_sto…
nkvuong Jul 30, 2024
154ea38
[Doc] marked `databricks_sql_dashboard` as legacy (#3836)
nkvuong Jul 30, 2024
25b2725
[Internal] Rewriting DLT pipelines using SDK (#3792)
Divyansh-db Jul 31, 2024
cca2965
[Internal] Revert "Rewriting DLT pipelines using SDK" (#3838)
Divyansh-db Jul 31, 2024
8d628d1
[Internal] Rewrite DLT pipelines using SDK (#3839)
Divyansh-db Jul 31, 2024
5258611
[Feature] Notification Destination resource (#3820)
Divyansh-db Aug 1, 2024
7a28f57
[Exporter] Don't export model serving endpoints with foundational mod…
alexott Aug 2, 2024
9e5f71b
[Doc] Fixed documentation for `databricks_schemas` data source and `d…
nkvuong Aug 4, 2024
560e753
[Fix] Tolerate `databricks_workspace_conf` deletion failures (#3737)
mgyucht Aug 5, 2024
0f6278d
[Fix] Fixed read method of `databricks_entitlements` resource (#3858)
nkvuong Aug 7, 2024
55db1a7
[Exporter] Refactoring: remove legacy code (#3864)
alexott Aug 8, 2024
af366dc
[Exporter] Add support for Vector Search assets (#3828)
alexott Aug 8, 2024
fed7307
[Exporter] Add support for `databricks_notification_destination` (#3861)
alexott Aug 8, 2024
7342fa5
[Exporter] Ignore DLT pipelines deployed via DABs (#3857)
alexott Aug 8, 2024
fb7e4ef
[Fix] Automatically assign `IS_OWNER` permission to sql warehouse if …
nkvuong Aug 9, 2024
0564da9
[Dependency] Bump github.com/zclconf/go-cty from 1.14.4 to 1.15.0 (#3…
dependabot[bot] Aug 9, 2024
a9293e9
[Internal] Refactor exporter: split huge files into smaller ones (#3870)
alexott Aug 9, 2024
f1dc449
[Exporter] Fix generation of `autotermination_minutes = 0` (#3881)
alexott Aug 12, 2024
9037cee
[Doc] Clarify setting of permissions for workspace objects (#3884)
alexott Aug 13, 2024
9490aa8
[Fix] Fix crash when destroying `databricks_compliance_security_profi…
alexott Aug 13, 2024
81be591
[Fix] Retry cluster update on "INVALID_STATE" (#3890)
hectorcast-db Aug 13, 2024
2b9f03a
[Internal] Fixed merge conflicts with main branch
tanmay-db Aug 13, 2024
f61c56f
Merge branch 'terraform-plugin-framework' into rebase-13aug
tanmay-db Aug 13, 2024
537db9f
-
tanmay-db Aug 13, 2024
d76955e
fix test
tanmay-db Aug 13, 2024
11 changes: 9 additions & 2 deletions .github/workflows/message.yml
@@ -3,20 +3,27 @@ name: Validate Commit Message
on:
pull_request:
types: [opened, synchronize, edited]
merge_group:
types: [checks_requested]

jobs:

validate:
runs-on: ubuntu-latest
# GitHub required checks are shared between PRs and the Merge Queue.
# Since there is no PR title on Merge Queue, we need to trigger and
# skip this test for Merge Queue to succeed.
if: github.event_name == 'pull_request'
steps:
- name: Checkout
uses: actions/checkout@v3
with:
fetch-depth: 0

- name: Validate Tag
env:
TITLE: ${{ github.event.pull_request.title }}
run: |
TAG=$(echo ${{ github.event.pull_request.title }} | sed -ne 's/\[\(.*\)\].*/\1/p')
TAG=$(echo "$TITLE" | sed -ne 's/\[\(.*\)\].*/\1/p')
if grep -q "tag: \"\[$TAG\]\"" .codegen/changelog_config.yml; then
echo "Valid tag found: [$TAG]"
else
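The workflow change above stops interpolating the PR title directly into the shell command and instead passes it through the `TITLE` environment variable, so special characters in an untrusted title are no longer expanded by the shell. The tag extraction itself mirrors the `sed` expression; a minimal Go sketch of that logic (hypothetical helper, not part of the provider):

```go
package main

import (
	"fmt"
	"regexp"
)

// extractTag mimics `sed -ne 's/\[\(.*\)\].*/\1/p'`: it returns the text
// inside the bracketed prefix of a PR title, or "" when no tag is present.
func extractTag(title string) string {
	m := regexp.MustCompile(`\[(.*)\].*`).FindStringSubmatch(title)
	if m == nil {
		return ""
	}
	return m[1]
}

func main() {
	fmt.Println(extractTag("[Internal] Add Release tag (#3748)")) // Internal
	fmt.Println(extractTag("fix clusters"))                       // no tag: prints an empty line
}
```

The extracted tag is then checked against `.codegen/changelog_config.yml`, which is why the changelog sections in this PR are grouped by those same tags.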
73 changes: 73 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,78 @@
# Version changelog

## 1.49.1

### Bug Fixes
* Fixed reading of permissions for SQL objects ([#3800](https://github.com/databricks/terraform-provider-databricks/pull/3800)).
* don't update `databricks_metastore` during creation if not required ([#3783](https://github.com/databricks/terraform-provider-databricks/pull/3783)).

### Documentation
* Clarified schedule block in `databricks_job` ([#3805](https://github.com/databricks/terraform-provider-databricks/pull/3805)).
* Use correct names for isolation mode for storage credentials and external locations ([#3804](https://github.com/databricks/terraform-provider-databricks/pull/3804)).
* Fix incomplete note in databricks_workspace_binding resource ([#3806](https://github.com/databricks/terraform-provider-databricks/pull/3806))

### Internal Changes
* Refactored `databricks_zones` and `databricks_spark_versions` data sources to Go SDK ([#3687](https://github.com/databricks/terraform-provider-databricks/pull/3687)).

### Exporter
* Add support for exporting of Lakeview dashboards ([#3779](https://github.com/databricks/terraform-provider-databricks/pull/3779)).
* Adding more retries for SCIM API calls ([#3807](https://github.com/databricks/terraform-provider-databricks/pull/3807))


## 1.49.0

### New Features and Improvements
* Added `databricks_dashboard` resource ([#3729](https://github.com/databricks/terraform-provider-databricks/pull/3729)).
* Added `databricks_schema` data source ([#3732](https://github.com/databricks/terraform-provider-databricks/pull/3732)).
* Added support for binding storage credentials and external locations to specific workspaces ([#3678](https://github.com/databricks/terraform-provider-databricks/pull/3678)).
* Added `databricks_volume` as data source ([#3211](https://github.com/databricks/terraform-provider-databricks/pull/3211)).
* Make the `schedule.pause_status` field read-only ([#3692](https://github.com/databricks/terraform-provider-databricks/pull/3692)).
* Renamed `databricks_catalog_workspace_binding` to `databricks_workspace_binding` ([#3703](https://github.com/databricks/terraform-provider-databricks/pull/3703)).
* Make `cluster_name_contains` optional in `databricks_clusters` data source ([#3760](https://github.com/databricks/terraform-provider-databricks/pull/3760)).
* Tolerate OAuth errors in databricks_mws_workspaces when managing tokens ([#3761](https://github.com/databricks/terraform-provider-databricks/pull/3761)).
* Permissions for `databricks_dashboard` resource ([#3762](https://github.com/databricks/terraform-provider-databricks/pull/3762)).
* Fix model serving resource ([#3690](https://github.com/databricks/terraform-provider-databricks/pull/3690))

### Exporter
* Emit directories during the listing only if they are explicitly configured in `-listing` ([#3673](https://github.com/databricks/terraform-provider-databricks/pull/3673)).
* Export libraries specified as `requirements.txt` ([#3649](https://github.com/databricks/terraform-provider-databricks/pull/3649)).
* Fix generation of `run_as` blocks in `databricks_job` ([#3724](https://github.com/databricks/terraform-provider-databricks/pull/3724)).
* Use Go SDK structs for `databricks_job` resource ([#3727](https://github.com/databricks/terraform-provider-databricks/pull/3727)).
* Clarify use of `-listing` and `-services` options ([#3755](https://github.com/databricks/terraform-provider-databricks/pull/3755)).
* Improve code generation for SQL Endpoints ([#3764](https://github.com/databricks/terraform-provider-databricks/pull/3764))
* Fix to support Serverless DLT ([#3796](https://github.com/databricks/terraform-provider-databricks/pull/3796))

### Documentation
* Fix invalid priviledges in grants.md ([#3716](https://github.com/databricks/terraform-provider-databricks/pull/3716)).
* Update cluster.md: add data_security_mode parameters `NONE` and `NO_ISOLATION` ([#3740](https://github.com/databricks/terraform-provider-databricks/pull/3740)).
* Remove references to basic auth ([#3720](https://github.com/databricks/terraform-provider-databricks/pull/3720)).
* Update resources diagram ([#3765](https://github.com/databricks/terraform-provider-databricks/pull/3765)).
* Improve docs for Network Connectivity Config ([#3794](https://github.com/databricks/terraform-provider-databricks/pull/3794))
* Document serverless flag in databricks_pipeline ([#3797](https://github.com/databricks/terraform-provider-databricks/pull/3797))
* Add description of environment block to databricks_job ([#3798](https://github.com/databricks/terraform-provider-databricks/pull/3798))


### Internal Changes
* Add Release tag ([#3748](https://github.com/databricks/terraform-provider-databricks/pull/3748)).
* Improve Changelog by grouping changes ([#3747](https://github.com/databricks/terraform-provider-databricks/pull/3747)).
* Change TF registry ownership ([#3736](https://github.com/databricks/terraform-provider-databricks/pull/3736)).
* Refactored `databricks_cluster(s)` data sources to Go SDK ([#3685](https://github.com/databricks/terraform-provider-databricks/pull/3685)).
* Upgrade databricks-sdk-go ([#3743](https://github.com/databricks/terraform-provider-databricks/pull/3743)).
* Run goreleaser action in snapshot mode from merge queue ([#3646](https://github.com/databricks/terraform-provider-databricks/pull/3646)).
* Make `dashboard_name` random in integration tests for `databricks_dashboard` resource ([#3763](https://github.com/databricks/terraform-provider-databricks/pull/3763)).
* Clear stale go.sum values ([#3768](https://github.com/databricks/terraform-provider-databricks/pull/3768)).
* Add "Owner" tag to test cluster in acceptance test ([#3771](https://github.com/databricks/terraform-provider-databricks/pull/3771)).
* Fix integration test for restrict workspace admins setting ([#3772](https://github.com/databricks/terraform-provider-databricks/pull/3772)).
* Add "Owner" tag to test SQL endpoint in acceptance test ([#3774](https://github.com/databricks/terraform-provider-databricks/pull/3774)).
* Move PR message validation to a separate workflow ([#3777](https://github.com/databricks/terraform-provider-databricks/pull/3777)).
* Trigger the validate workflow in the merge queue ([#3782](https://github.com/databricks/terraform-provider-databricks/pull/3782)).
* Update properties for managed SQL table on latest DBR ([#3784](https://github.com/databricks/terraform-provider-databricks/pull/3784)).
* Add "Owner" tag to test SQL endpoint in acceptance test ([#3785](https://github.com/databricks/terraform-provider-databricks/pull/3785)).
* Ignore managed property for liquid clustering integration test ([#3786](https://github.com/databricks/terraform-provider-databricks/pull/3786))
* Fix processing of quoted titles ([#3790](https://github.com/databricks/terraform-provider-databricks/pull/3790))



## 1.48.3

### Internal Changes
2 changes: 1 addition & 1 deletion access/resource_sql_permissions.go
@@ -272,7 +272,7 @@ func (ta *SqlPermissions) initCluster(ctx context.Context, d *schema.ResourceDat
}

func (ta *SqlPermissions) getOrCreateCluster(clustersAPI clusters.ClustersAPI) (string, error) {
sparkVersion := clustersAPI.LatestSparkVersionOrDefault(clusters.SparkVersionRequest{
sparkVersion := clusters.LatestSparkVersionOrDefault(clustersAPI.Context(), clustersAPI.WorkspaceClient(), compute.SparkVersionRequest{
Latest: true,
})
nodeType := clustersAPI.GetSmallestNodeType(compute.NodeTypeRequest{LocalDisk: true})
28 changes: 14 additions & 14 deletions access/resource_sql_permissions_test.go
@@ -184,20 +184,20 @@ var createHighConcurrencyCluster = []qa.HTTPFixture{
{
Method: "GET",
ReuseRequest: true,
Resource: "/api/2.0/clusters/spark-versions",
Response: clusters.SparkVersionsList{
SparkVersions: []clusters.SparkVersion{
Resource: "/api/2.1/clusters/spark-versions",
Response: compute.GetSparkVersionsResponse{
Versions: []compute.SparkVersion{
{
Version: "7.1.x-cpu-ml-scala2.12",
Description: "7.1 ML (includes Apache Spark 3.0.0, Scala 2.12)",
Key: "7.1.x-cpu-ml-scala2.12",
Name: "7.1 ML (includes Apache Spark 3.0.0, Scala 2.12)",
},
},
},
},
{
Method: "GET",
ReuseRequest: true,
Resource: "/api/2.0/clusters/list-node-types",
Resource: "/api/2.1/clusters/list-node-types",
Response: compute.ListNodeTypesResponse{
NodeTypes: []compute.NodeType{
{
@@ -222,7 +222,7 @@ var createHighConcurrencyCluster = []qa.HTTPFixture{
AutoterminationMinutes: 10,
ClusterName: "terraform-table-acl",
NodeTypeID: "Standard_F4s",
SparkVersion: "7.3.x-scala2.12",
SparkVersion: "11.3.x-scala2.12",
CustomTags: map[string]string{
"ResourceClass": "SingleNode",
},
@@ -261,20 +261,20 @@ var createSharedCluster = []qa.HTTPFixture{
{
Method: "GET",
ReuseRequest: true,
Resource: "/api/2.0/clusters/spark-versions",
Response: clusters.SparkVersionsList{
SparkVersions: []clusters.SparkVersion{
Resource: "/api/2.1/clusters/spark-versions",
Response: compute.GetSparkVersionsResponse{
Versions: []compute.SparkVersion{
{
Version: "7.1.x-cpu-ml-scala2.12",
Description: "7.1 ML (includes Apache Spark 3.0.0, Scala 2.12)",
Key: "7.1.x-cpu-ml-scala2.12",
Name: "7.1 ML (includes Apache Spark 3.0.0, Scala 2.12)",
},
},
},
},
{
Method: "GET",
ReuseRequest: true,
Resource: "/api/2.0/clusters/list-node-types",
Resource: "/api/2.1/clusters/list-node-types",
Response: compute.ListNodeTypesResponse{
NodeTypes: []compute.NodeType{
{
@@ -299,7 +299,7 @@ var createSharedCluster = []qa.HTTPFixture{
AutoterminationMinutes: 10,
ClusterName: "terraform-table-acl",
NodeTypeID: "Standard_F4s",
SparkVersion: "7.3.x-scala2.12",
SparkVersion: "11.3.x-scala2.12",
CustomTags: map[string]string{
"ResourceClass": "SingleNode",
},
12 changes: 6 additions & 6 deletions aws/data_aws_unity_catalog_policy.go
@@ -5,6 +5,7 @@ import (
"encoding/json"
"fmt"
"regexp"
"strings"

"github.com/databricks/terraform-provider-databricks/common"
"github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema"
@@ -44,16 +45,18 @@ func generateReadContext(ctx context.Context, d *schema.ResourceData, m *common.
},
}
if kmsKey, ok := d.GetOk("kms_name"); ok {
kmsArn := fmt.Sprintf("arn:aws:kms:%s", kmsKey)
if strings.HasPrefix(kmsKey.(string), "arn:aws") {
kmsArn = kmsKey.(string)
}
policy.Statements = append(policy.Statements, &awsIamPolicyStatement{
Effect: "Allow",
Actions: []string{
"kms:Decrypt",
"kms:Encrypt",
"kms:GenerateDataKey*",
},
Resources: []string{
fmt.Sprintf("arn:aws:kms:%s", kmsKey),
},
Resources: []string{kmsArn},
})
}
policyJSON, err := json.MarshalIndent(policy, "", " ")
@@ -73,9 +76,6 @@ func validateSchema() map[string]*schema.Schema {
"kms_name": {
Type: schema.TypeString,
Optional: true,
ValidateFunc: validation.StringMatch(
regexp.MustCompile(`^[0-9a-zA-Z/_-]+$`),
"must contain only alphanumeric, hyphens, forward slashes, and underscores characters"),
},
"bucket_name": {
Type: schema.TypeString,
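The `data_aws_unity_catalog_policy` change above lets `kms_name` be either a bare key name or a full ARN; the old regex validator rejected ARNs, which is presumably why it was dropped. A standalone sketch restating the diff's normalization logic (illustrative function name, not the provider's actual API):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeKmsArn restates the diff's logic: a kms_name that is already a
// full ARN is used verbatim; anything else is prefixed with "arn:aws:kms:".
func normalizeKmsArn(kmsKey string) string {
	if strings.HasPrefix(kmsKey, "arn:aws") {
		return kmsKey
	}
	return fmt.Sprintf("arn:aws:kms:%s", kmsKey)
}

func main() {
	// Both inputs yield the same resource ARN for the policy statement.
	fmt.Println(normalizeKmsArn("us-west-2:111122223333:key/databricks-kms"))
	fmt.Println(normalizeKmsArn("arn:aws:kms:us-west-2:111122223333:key/databricks-kms"))
}
```

The `TestDataAwsUnityCatalogPolicyFullKms` test added below exercises exactly the full-ARN branch.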
57 changes: 57 additions & 0 deletions aws/data_aws_unity_catalog_policy_test.go
@@ -65,6 +65,63 @@ func TestDataAwsUnityCatalogPolicy(t *testing.T) {
compareJSON(t, j, p)
}

func TestDataAwsUnityCatalogPolicyFullKms(t *testing.T) {
d, err := qa.ResourceFixture{
Read: true,
Resource: DataAwsUnityCatalogPolicy(),
NonWritable: true,
ID: ".",
HCL: `
aws_account_id = "123456789098"
bucket_name = "databricks-bucket"
role_name = "databricks-role"
kms_name = "arn:aws:kms:us-west-2:111122223333:key/databricks-kms"
`,
}.Apply(t)
assert.NoError(t, err)
j := d.Get("json").(string)
p := `{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:GetObject",
"s3:PutObject",
"s3:DeleteObject",
"s3:ListBucket",
"s3:GetBucketLocation"
],
"Resource": [
"arn:aws:s3:::databricks-bucket/*",
"arn:aws:s3:::databricks-bucket"
]
},
{
"Effect": "Allow",
"Action": [
"sts:AssumeRole"
],
"Resource": [
"arn:aws:iam::123456789098:role/databricks-role"
]
},
{
"Effect": "Allow",
"Action": [
"kms:Decrypt",
"kms:Encrypt",
"kms:GenerateDataKey*"
],
"Resource": [
"arn:aws:kms:us-west-2:111122223333:key/databricks-kms"
]
}
]
}`
compareJSON(t, j, p)
}

func TestDataAwsUnityCatalogPolicyWithoutKMS(t *testing.T) {
d, err := qa.ResourceFixture{
Read: true,
2 changes: 1 addition & 1 deletion catalog/bindings/bindings.go
@@ -8,7 +8,7 @@ import (
"github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema"
)

func AddCurrentWorkspaceBindings(ctx context.Context, d *schema.ResourceData, w *databricks.WorkspaceClient, securableName string, securableType string) error {
func AddCurrentWorkspaceBindings(ctx context.Context, d *schema.ResourceData, w *databricks.WorkspaceClient, securableName string, securableType catalog.UpdateBindingsSecurableType) error {
if d.Get("isolation_mode") != "ISOLATED" && d.Get("isolation_mode") != "ISOLATION_MODE_ISOLATED" {
return nil
}
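The signature change above replaces a free-form `securableType string` with the Go SDK's `catalog.UpdateBindingsSecurableType`, so callers such as `resource_catalog.go` and `resource_external_location.go` pass the SDK's exported constants instead of hand-typed literals. A generic sketch of this typed-string-enum pattern (illustrative names, not the SDK's actual declarations):

```go
package main

import "fmt"

// SecurableType is a typed string, in the style databricks-sdk-go uses for
// generated enums (these names are illustrative).
type SecurableType string

const (
	SecurableTypeCatalog          SecurableType = "catalog"
	SecurableTypeExternalLocation SecurableType = "external_location"
)

// addCurrentWorkspaceBindings takes the typed enum. Using the exported
// constants keeps the wire value defined in exactly one place; the fixture
// change in this PR from "external-location" to "external_location" is the
// kind of drift between hand-written strings and the API that this avoids.
func addCurrentWorkspaceBindings(securable SecurableType, name string) string {
	return fmt.Sprintf("bind %s to %s", name, securable)
}

func main() {
	fmt.Println(addCurrentWorkspaceBindings(SecurableTypeExternalLocation, "abc"))
}
```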
4 changes: 3 additions & 1 deletion catalog/permissions/permissions.go
@@ -28,7 +28,9 @@ func NewUnityCatalogPermissionsAPI(ctx context.Context, m any) UnityCatalogPermi

func (a UnityCatalogPermissionsAPI) GetPermissions(securable catalog.SecurableType, name string) (list *catalog.PermissionsList, err error) {
if securable.String() == "share" {
list, err = a.client.Shares.SharePermissions(a.context, sharing.SharePermissionsRequest{name})
list, err = a.client.Shares.SharePermissions(a.context, sharing.SharePermissionsRequest{
Name: name,
})
return
}
list, err = a.client.Grants.GetBySecurableTypeAndFullName(a.context, securable, name)
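The `SharePermissionsRequest` change above is behavior-neutral: an unkeyed composite literal becomes a keyed one. Unkeyed literals must list every field, so they break the build whenever the struct gains a field, and `go vet` flags them for struct types imported from other packages. A minimal illustration with a stand-in type (not the SDK's actual declaration):

```go
package main

import "fmt"

// sharePermissionsRequest stands in for sharing.SharePermissionsRequest.
type sharePermissionsRequest struct {
	Name string
}

func main() {
	// Unkeyed form: sharePermissionsRequest{"my-share"} compiles today,
	// but adding any field to the struct later makes it a compile error
	// ("too few values"), and `go vet` warns on unkeyed literals of
	// imported struct types. The keyed form stays stable:
	req := sharePermissionsRequest{
		Name: "my-share",
	}
	fmt.Println(req.Name)
}
```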
4 changes: 2 additions & 2 deletions catalog/resource_catalog.go
@@ -100,7 +100,7 @@ func ResourceCatalog() common.Resource {
}

// Bind the current workspace if the catalog is isolated, otherwise the read will fail
return bindings.AddCurrentWorkspaceBindings(ctx, d, w, ci.Name, "catalog")
return bindings.AddCurrentWorkspaceBindings(ctx, d, w, ci.Name, catalog.UpdateBindingsSecurableTypeCatalog)
},
Read: func(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
w, err := c.WorkspaceClient()
@@ -166,7 +166,7 @@
d.SetId(ci.Name)

// Bind the current workspace if the catalog is isolated, otherwise the read will fail
return bindings.AddCurrentWorkspaceBindings(ctx, d, w, ci.Name, "catalog")
return bindings.AddCurrentWorkspaceBindings(ctx, d, w, ci.Name, catalog.UpdateBindingsSecurableTypeCatalog)
},
Delete: func(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
w, err := c.WorkspaceClient()
4 changes: 2 additions & 2 deletions catalog/resource_external_location.go
@@ -75,7 +75,7 @@ func ResourceExternalLocation() common.Resource {
}

// Bind the current workspace if the external location is isolated, otherwise the read will fail
return bindings.AddCurrentWorkspaceBindings(ctx, d, w, el.Name, "external-location")
return bindings.AddCurrentWorkspaceBindings(ctx, d, w, el.Name, catalog.UpdateBindingsSecurableTypeExternalLocation)
},
Read: func(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
w, err := c.WorkspaceClient()
@@ -134,7 +134,7 @@
return err
}
// Bind the current workspace if the external location is isolated, otherwise the read will fail
return bindings.AddCurrentWorkspaceBindings(ctx, d, w, updateExternalLocationRequest.Name, "external-location")
return bindings.AddCurrentWorkspaceBindings(ctx, d, w, updateExternalLocationRequest.Name, catalog.UpdateBindingsSecurableTypeExternalLocation)
},
Delete: func(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
force := d.Get("force_destroy").(bool)
2 changes: 1 addition & 1 deletion catalog/resource_external_location_test.go
@@ -92,7 +92,7 @@ func TestCreateIsolatedExternalLocation(t *testing.T) {
}, nil)
w.GetMockWorkspaceBindingsAPI().EXPECT().UpdateBindings(mock.Anything, catalog.UpdateWorkspaceBindingsParameters{
SecurableName: "abc",
SecurableType: "external-location",
SecurableType: "external_location",
Add: []catalog.WorkspaceBinding{
{
WorkspaceId: int64(123456789101112),
11 changes: 7 additions & 4 deletions catalog/resource_metastore.go
@@ -67,6 +67,10 @@ func ResourceMetastore() common.Resource {
common.DataToStructPointer(d, s, &create)
common.DataToStructPointer(d, s, &update)
updateForceSendFields(&update)
emptyRequest, err := common.IsRequestEmpty(update)
if err != nil {
return err
}
return c.AccountOrWorkspaceRequest(func(acc *databricks.AccountClient) error {
mi, err := acc.Metastores.Create(ctx,
catalog.AccountsCreateMetastore{
@@ -75,10 +79,6 @@ func ResourceMetastore() common.Resource {
if err != nil {
return err
}
emptyRequest, err := common.IsRequestEmpty(update)
if err != nil {
return err
}
d.SetId(mi.MetastoreInfo.MetastoreId)
if emptyRequest {
return nil
@@ -97,6 +97,9 @@ func ResourceMetastore() common.Resource {
return err
}
d.SetId(mi.MetastoreId)
if emptyRequest {
return nil
}
update.Id = mi.MetastoreId
_, err = w.Metastores.Update(ctx, update)
if err != nil {
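The `ResourceMetastore` change above hoists the `IsRequestEmpty` check before the create call and adds the same short-circuit to the workspace-level path, so neither path issues a follow-up `Update` when no updatable fields were set. A simplified sketch of that flow (stand-in types, not the provider's code):

```go
package main

import "fmt"

// updateRequest stands in for the SDK's update struct (illustrative).
type updateRequest struct {
	Owner string
}

// isRequestEmpty reports whether there is anything to send in an Update.
func isRequestEmpty(u updateRequest) bool { return u == (updateRequest{}) }

// createMetastore returns the sequence of API calls made: the emptiness
// check happens once, up front, and the follow-up Update is skipped when
// there is nothing to change.
func createMetastore(u updateRequest) []string {
	calls := []string{"Metastores.Create"}
	if isRequestEmpty(u) {
		return calls
	}
	return append(calls, "Metastores.Update")
}

func main() {
	fmt.Println(createMetastore(updateRequest{}))                    // create only
	fmt.Println(createMetastore(updateRequest{Owner: "uc-admins"})) // create, then update
}
```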
2 changes: 2 additions & 0 deletions catalog/resource_metastore_assignment.go
@@ -76,6 +76,8 @@ func ResourceMetastoreAssignment() common.Resource {
return err
}
d.Set("metastore_id", ma.MetastoreId)
d.Set("default_catalog_name", ma.DefaultCatalogName)
d.Set("workspace_id", workspaceId)
return nil
})
},