Replies: 3 comments 5 replies
-
Could you share your entire script, so we can see what you are doing to prep the SQL Server database and the D365FO services? I have never seen any issues with this, and the noexpand warning / error seems a bit off. But on another note: if you really just need to move a database copy from one Tier1 to another Tier1, there are far better and faster tools available than the bacpac export / import. Let me know if you need some pointers.
-
So, a few comments. It seems that you are running this from Azure DevOps pipelines, based on the several "$()" expressions where you read a pipeline variable and persist it into a PowerShell variable. Right?

It seems that you download the azcopy tool by hand. Why is that? We have packaged that for you with: https://github.com/d365collaborative/d365fo.tools/blob/master/docs/Invoke-D365InstallAzCopy.md And to help you build a simple command that copies files using AzCopy, we have packaged that for you as well.

A comment on this part:

```powershell
$source_account = "$(d365_rsatstaging_accountname)"
$source_key = "$(d365_rsatstaging_accountkey)"
$target_account = "$(d365_rsat_accountname)"
$target_key = "$(d365_rsat_accountkey)"
$source_context = New-AzureStorageContext -StorageAccountName "$source_account" -StorageAccountKey "$source_key"
$source_token = New-AzureStorageAccountSASToken -Context $source_context -Service Blob,File,Table,Queue -ResourceType Service,Container,Object -Permission "racwdlup" -Protocol HttpsOnly
$source_path = "https://" + $source_account + ".blob.core.windows.net/documents/" + $source_token
$target_context = New-AzureStorageContext -StorageAccountName "$target_account" -StorageAccountKey "$target_key"
$target_token = New-AzureStorageAccountSASToken -Context $target_context -Service Blob,File,Table,Queue -ResourceType Service,Container,Object -Permission "racwdlup" -Protocol HttpsOnly
$target_path = "https://" + $target_account + ".blob.core.windows.net/documents/" + $target_token
```

I get a feeling that you are storing the AccountKey, either directly in the pipeline variables or fetching it from an Azure KeyVault earlier in the pipeline. But having that at hand at this point in the pipeline seems counterintuitive. If you know that you need read and write access for your different storage accounts and blob containers, then create a SAS key/token that is valid for 1 year and store that instead (in KeyVault or a pipeline variable). Then all of the above code would be obsolete.

I don't know how fast or slow the Import-D365Bacpac -ImportModeTier1 operation runs for you, but I would argue it could be up to 10x faster to run with old-fashioned SQL bak files instead. That goes for exporting the bacpac vs. generating a SQL bak file from the source system, and the same goes for importing the bacpac file vs. restoring a SQL bak file into the target system. www.dbatools.io would be the best option for that. They have the following cmdlets available:
Backup-DbaDatabase and Restore-DbaDatabase
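To make the suggestions above concrete, here is a rough, untested sketch of what a slimmed-down pipeline step could look like. It assumes long-lived SAS tokens are already stored as pipeline variables (the variable names `d365_source_sas` and `d365_target_sas`, the account names, and the share path are all invented placeholders), and uses `Invoke-D365AzCopyTransfer` from d365fo.tools plus `Backup-DbaDatabase` / `Restore-DbaDatabase` from dbatools. Treat it as a starting point, not a drop-in replacement.

```powershell
# Sketch only - account names, share paths and $(…) variables are hypothetical placeholders.

# 1. Copy blobs between storage accounts with the packaged AzCopy wrapper,
#    using pre-generated, long-lived SAS tokens instead of account keys.
Invoke-D365InstallAzCopy
Invoke-D365AzCopyTransfer `
    -SourceUri "https://sourceaccount.blob.core.windows.net/documents/?$(d365_source_sas)" `
    -DestinationUri "https://targetaccount.blob.core.windows.net/documents/?$(d365_target_sas)"

# 2. Move the database as a bak file instead of a bacpac (dbatools).
Backup-DbaDatabase -SqlInstance "SourceTier1" -Database "AxDB" -Path "\\share\backups"
Restore-DbaDatabase -SqlInstance "TargetTier1" -Path "\\share\backups\AxDB.bak" -WithReplace
```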
Without knowing all the finer details of your environment and how things are configured, I would argue that it shouldn't be necessary to handle the "axdbadmin" & "$(d365_rsat_axdbadmin_password)" details like you do. If the VM that you are importing/restoring the database into is a fully working D365 DEVBOX, be that a Onebox (VHD from LCS) or an LCS-deployed VM, the module will auto-load "axdbadmin" and its password whenever you import the module. I would argue that you should be able to utilize Invoke-D365SqlScript instead of your current usage of "Invoke-Sqlcmd"; that is mostly to help you make sure the script uses the correct user account, and it will automatically hit the AXDB database.

For the comment about you not being able to fix the Azure Blobs - I take it that you want to override the current DocumentHandling configuration, to make sure the environment is referencing the correct Azure Blobs for these things. Right? Did you try to update the details directly in D365FO while running a SQL Profiler trace, to see what statements are executed? Does it work if you update these details directly in D365FO? If yes - and I take it that you are running RSAT tests - wouldn't it be easier to simply extend the RSAT run with a test case that configures these details, before running the rest of the test cases?
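For reference, a minimal sketch of swapping Invoke-Sqlcmd for Invoke-D365SqlScript. On a working devbox the module picks up the axdbadmin credentials and targets AXDB automatically; the SQL statement and file path below are purely illustrative placeholders.

```powershell
# Illustrative statement only - replace with the real DocumentHandling fix-up SQL.
$sql = "SELECT TOP 10 RECID FROM DOCUVALUE"
Set-Content -Path "C:\Temp\fix-docuvalue.sql" -Value $sql

# Runs the script against AXDB using the credentials the module
# auto-loaded from the environment's configuration.
Invoke-D365SqlScript -FilePath "C:\Temp\fix-docuvalue.sql"
```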
-
We support:

- specifying a new administrator of the environment
- adding any user, from any tenant
- enabling / disabling users

What else in terms of users are you missing? Are you talking about system users like axdbadmin and stuff like that?
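The user-management capabilities listed above map to cmdlets along these lines, going by the d365fo.tools docs; the e-mail addresses are placeholders, so double-check parameter names against the published cmdlet documentation.

```powershell
# Make a (placeholder) account the environment administrator.
Set-D365Admin "admin@contoso.com"

# Import a user from Azure AD - this can be a user from another tenant.
Import-D365AadUser -Users "jane.doe@fabrikam.com"

# Enable / disable a user.
Enable-D365User -Email "jane.doe@fabrikam.com"
Disable-D365User -Email "jane.doe@fabrikam.com"
```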
All the scripts that we run, you can execute yourself. All SQL scripts are available online for reference, but they are also available locally on the machine, and it should be straightforward to locate them and use them in combination with Invoke-D365SqlScript. But could you elaborate on which system tables you feel should be truncated?

As a tool / framework that takes a lot of the effort out of automating some of the more cumbersome things, I feel we are close to what you are looking for. You might have special needs / use cases that you need to support, but that is where you, as the partner for the customer, add value: being able to develop a set of steps / tasks that fits their needs. For every single thing that you feel should be part of the module, please create an issue and share as much detail as you can about the need. If you can solve the need in some way but don't feel comfortable with PRs and adding it to the module, you can share it directly with me and I'll make sure to take up the changes and make them part of the module. That is, if I see that the broader community can gain from it - which is the case more often than not 😉

As a last note: I get a sense that the source Tier1 and target Tier1 aren't fully aligned, since your concerns focus on making sure the different things are in place. If the source Tier1 has been working and configured correctly, and the versions of the source D365FO and the target D365FO aren't several versions apart, I would argue that a backup / restore (bak file) approach should solve most of the mentioned things.
-
Hi
I'm exporting from a tier 1 environment with:

```powershell
New-D365Bacpac -ExportModeTier1 -BacpacFile C:\Temp\Bacpac\AxDB_staging.bacpac -ExportOnly -MaxParallelism 32
```
And importing the bacpac in another tier 1 environment:
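The actual import command was not captured in this thread; based on the Tier1 import mode discussed in the replies above, it presumably looks something like the following (the database name is a placeholder):

```powershell
# Hypothetical reconstruction - the original command was not captured.
# Import the bacpac into a new database, then swap it in as the active AXDB.
Import-D365Bacpac -ImportModeTier1 -BacpacFile "C:\Temp\Bacpac\AxDB_staging.bacpac" -NewDatabaseName "AxDB_staging"
Switch-D365ActiveDatabase -NewDatabaseName "AxDB_staging"
```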