Can't deploy DacPac on Azure DevOps CD

I am building a DacPac file from a database project using Visual Studio. In the Advanced Build Settings on the project's Debug tab, I also enabled the following drop options:
Do not drop credentials
Do not drop database roles
Do not drop database scoped credentials
Do not drop logins
Do not drop permissions
Do not drop role membership
Do not drop users
Do not drop server role membership
Still, I get the following error in the CD SQL Deploy step (error screenshot omitted):
EDIT
Based on the advice of Krzysztof Madej, I added the following additional arguments (they were working on another project), but both for the argument he proposed and for these ones I got this error:
is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

In additionalArguments you have to add:
/p:BlockOnPossibleDataLoss=false
But be aware that this can delete your data.
If you use a classic release pipeline, it goes in the task's Additional Arguments field (screenshot omitted). In YAML:
- task: SqlAzureDacpacDeployment@1
  displayName: 'Execute Azure SQL : DacpacTask'
  inputs:
    azureSubscription: '<Azure service connection>'
    ServerName: '<Database server name>'
    DatabaseName: '<Database name>'
    SqlUsername: '<SQL user name>'
    SqlPassword: '<SQL user password>'
    DacpacFile: '<Location of Dacpac file in $(Build.SourcesDirectory) after compilation>'
    additionalArguments: '/p:BlockOnPossibleDataLoss=false'
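For context, the SqlAzureDacpacDeployment task passes these additional arguments through to SqlPackage.exe. A minimal sketch of the equivalent manual publish, assuming SqlPackage.exe is on PATH (server, database, and credential values are placeholders):

    SqlPackage.exe /Action:Publish /SourceFile:"MyDatabase.dacpac" /TargetServerName:"myserver.database.windows.net" /TargetDatabaseName:"MyDatabase" /TargetUser:"sqladmin" /TargetPassword:"<password>" /p:BlockOnPossibleDataLoss=false

Running this locally makes it easy to reproduce a failing deployment before changing the pipeline.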

Related

Unable to locate executable file: 'bash'. Please verify either the file path exists or the file can be found within a directory specified by the PATH

I got the following error while printing a group variable in Azure DevOps, in both YAML and classic pipelines, using a Microsoft-hosted agent:
##[error]Unable to locate executable file: 'bash'. Please verify either the file path exists or the file can be found within a directory specified by the PATH environment variable.
The YAML code is below:
trigger:
- master

variables:
- group: myvargroup

pool:
  vmImage: ubuntu-latest

stages:
- stage: "Test"
  jobs:
  - job:
    steps:
    - script: echo $(fname)
      displayName: 'Run a one-line script'
The job parameters and the resulting error were shown in screenshots (images omitted).
I tested your sample, and it works well now. There was a recent availability degradation event in Azure DevOps which could have affected these services; it has since been resolved (details were posted in the linked status update). You can try again to see if the problem still exists.

sqitch deploy command fails when deploying changes to azure

Hi guys, I am trying to run the deploy command against a database hosted on Azure. Nevertheless, I get the following error:
sqitch deploy db:pg://cmurcia%40dataplatform:*****@dataplatform.postgres.database.azure.com:5432/dataplatform_metadata_service
Adding registry tables to db:pg://cmurcia%40dataplatform:@dataplatform.postgres.database.azure.com:5432/dataplatform_metadata_service
psql:/usr/share/perl5/App/Sqitch/Engine/pg.sql:4: ERROR: permission denied for database dataplatform_metadata_service
"/usr/bin/psql" unexpectedly returned exit value 3
I tested with psql and I can both log in and modify tables in the database accessed with the mentioned URI (db:pg://cmurcia%40dataplatform:*****@dataplatform.postgres.database.azure.com:5432/dataplatform_metadata_service).
I also tried
sqitch deploy -t postgresql://cmurcia%40dataplatform:Welcome0518%21@dataplatform.postgres.database.azure.com:5432/dataplatform_metadata_service
Adding registry tables to db:postgresql://cmurcia%40dataplatform:@dataplatform.postgres.database.azure.com:5432/dataplatform_metadata_service
psql:/usr/share/perl5/App/Sqitch/Engine/pg.sql:4: ERROR: permission denied for database dataplatform_metadata_service
"/usr/bin/psql" unexpectedly returned exit value 3
I would like to ask if you have any hints about how to solve this. Thank you!
FYI, I am running the command from an Ubuntu Linux VM hosted on Azure, where I installed Sqitch; Sqitch works locally.
The first thing Sqitch does when it connects to a database is create the registry if it does not yet exist. Usually this is a schema named sqitch. Have a look at the Postgres registry script. Be sure you have permission to create the schema. If you don't, have someone else create it and give you permission to create objects in it, as well as in your project schema.
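For example, a minimal sketch of the grants an administrator could run, assuming the registry schema is named sqitch and the deploying role inside the database is cmurcia (both assumptions based on the URI above; ADMIN_URI is a placeholder for the administrator's connection URI):

    # grant the deploying role permission to create the registry schema itself
    psql "$ADMIN_URI" -c 'GRANT CREATE ON DATABASE dataplatform_metadata_service TO cmurcia;'
    # or pre-create the registry schema and hand over object-creation rights
    psql "$ADMIN_URI" -c 'CREATE SCHEMA IF NOT EXISTS sqitch;'
    psql "$ADMIN_URI" -c 'GRANT USAGE, CREATE ON SCHEMA sqitch TO cmurcia;'

The "permission denied for database" error at pg.sql line 4 is consistent with the registry script's CREATE SCHEMA statement failing: in PostgreSQL, CREATE SCHEMA requires the CREATE privilege on the database.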

Azure DevOps ThirdParty Tools for build / Deployment

pipelines:
  default:
    - step:
        name: Push changes to Commerce Cloud
        script:
          - dcu --putAll $OCCS_CODE_LOCATION --node $OCCS_ADMIN_URL --applicationKey $OCCS_APPLICATION_KEY
    - step:
        name: Publish changes Live Storefront
        image: Python 3.5.1
        script:
          - python publishDCUAuthoredChanges.py -u $OCCS_ADMIN_URL -k $OCCS_APPLICATION_KEY

Environment variables:
$OCCS_CODE_LOCATION: path to the location of all OCCS code
$OCCS_ADMIN_URL: URL for the administration interface on the target Commerce Cloud instance
$OCCS_APPLICATION_KEY: application key used to log in to the target Commerce Cloud administration interface
So I want to use an Azure DevOps repository for CI/CD. In the above code block you can see I have specified dcu and Python code in two tasks. dcu is a third-party Node.js Oracle tool that has to be used to migrate code to the cloud system; I want to know how to use that tool in Azure DevOps. Second, there is the Python (or Node.js) code that I want to invoke a REST API with to publish the changes. So where do I place those files, and how do I invoke them?
*********** Update **************
I set up a self-hosted pool agent and am able to access the system. I just started executing basic bash code, but I ran into two issues:
1) When git extracts files from the repository, they go to _work/1/s; I am not sure how that path is decided. How can I change that location?
2) I ran 'pwd' and I am in the correct path, but the 'dcu' command fails. I tried npm and a few other commands and they fail too, yet commands like mkdir and rmdir create and remove folders correctly in the desired path. When I run the 'dcu' command manually from a terminal on the same system, it works fine as expected.
You can follow the steps below to use the DCU tool and Python in Azure Pipelines.
1. Create an Azure git repo that includes the dcu zip file and your .py files. You can follow the steps in this thread to create an Azure git repo and push local files to it.
2. Create an Azure build pipeline. Please check here for how to create a YAML pipeline. Here is a good tutorial to get started.
To create a classic UI pipeline, choose "Use the classic editor" in the pipeline setup wizard, and choose "Empty job" to start with an empty pipeline and add your own steps. (I will use a classic UI pipeline in the example below.)
3. Click "+" and search for the Extract files task to unzip the DCU zip file. Click the 3 dots on the Destination folder field to select a destination folder for the extracted dcu files, e.g. $(agent.builddirectory). Please check my answer in this thread for more information about predefined variables.
4. Click "+" to add a PowerShell task. Run the script below (shown in the screenshot) to install dcu and run the dcu command. For environment variables (like $OCCS_CODE_LOCATION), click the Variables tab (shown in the screenshot) to define them:
cd $(agent.builddirectory)  # the folder where the unzipped dcu files reside
npm install -g
.\dcu.cmd --putAll $(OCCS_CODE_LOCATION) --node $(OCCS_ADMIN_URL) --applicationKey $(OCCS_APPLICATION_KEY)
5. Add a Use Python version task to define a Python version to execute your .py file.
6. Add a Python script task to run your .py file. Click the 3 dots on the Script path field to locate your publishDCUAuthoredChanges.py file (this .py file and the dcu zip file were pushed to the Azure git repo in step 1 above).
You should then be able to run the script from the question above in the Azure DevOps pipeline. A rough YAML equivalent of these classic UI steps is sketched below.
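This is a sketch only; the archive name, script path, and variable names are assumptions about your repo layout:

steps:
- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: 'dcu.zip'   # assumed name of the DCU zip in the repo
    destinationFolder: '$(Agent.BuildDirectory)/dcu'
- powershell: |
    cd $(Agent.BuildDirectory)/dcu
    npm install -g
    .\dcu.cmd --putAll $(OCCS_CODE_LOCATION) --node $(OCCS_ADMIN_URL) --applicationKey $(OCCS_APPLICATION_KEY)
  displayName: 'Install and run DCU'
- task: UsePythonVersion@0
  inputs:
    versionSpec: '3.x'
- task: PythonScript@0
  inputs:
    scriptSource: 'filePath'
    scriptPath: 'publishDCUAuthoredChanges.py'
    arguments: '-u $(OCCS_ADMIN_URL) -k $(OCCS_APPLICATION_KEY)'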
Update:
_work/1/s is the default working folder for the agent; you cannot change it. Though there are ways to change the location where the source code is cloned from git, the tasks' working directory still defaults to that folder.
However, you can change the working directory inside the tasks (see the sketch after this list), and there are predefined variables you can use to refer to places on the agent. For example:
$(Agent.BuildDirectory) is mapped to c:\agent\_work\1
$(Build.ArtifactStagingDirectory) is mapped to c:\agent\_work\1\a
$(Build.BinariesDirectory) is mapped to c:\agent\_work\1\b
$(Build.SourcesDirectory) is mapped to c:\agent\_work\1\s
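For instance, a minimal sketch of pointing a bash task at one of these locations via its workingDirectory input (the script name is a placeholder):

steps:
- bash: ./run-dcu.sh   # placeholder script checked into the repo
  workingDirectory: '$(Build.SourcesDirectory)'
  displayName: 'Run from the cloned sources'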
The .sh scripts in the _temp folder are generated automatically by the agent; they contain the scripts from the bash tasks.
For the above 'dcu command not found' error: you can try adding the dcu command's path to the system-level Path variable in your local machine's environment variables. (Paths in the user-level variables cannot be found by agent jobs, because the agent uses a different user account to connect to the local machine.)
Or you can use the physical path to the dcu command in the bash task. For example, say dcu.cmd is at c:\dcu\dcu.cmd on the local machine. Then in the bash task, use the script below to run the dcu command:
c:/dcu/dcu.cmd --putAll ...

custom action fails to execute

During the installation, I run a custom action file called ConfigurationUtility.exe which should create a SQL database with some parameters. It should run some scripts from the \scripts directory where the utility is copied. But I get this error in the event log: "Action ConfigurationUtility.exe, location: C:\Windows\Installer\MSI4724.tmp, command: -dbname NewDB -username sa -password .....
I think this happens because the installer tries to run it from C:\Windows\Installer\MSI4724.tmp and not from the installation folder.
Setup package is built with Advanced Installer.
How can I fix it?
Thanks.
You have not configured the custom action correctly.
If you want it to run scripts from the installation folder where it is placed, you should call the EXE using the custom action "Launch installed file". You should not launch it as an attached-file custom action (only that type of custom action gets extracted as a temp file and launched, as in your example).
Also, since this is an EXE, I recommend you give it full admin rights to run; otherwise the system might stop it from running. To do this, configure the custom action to run "When the system is being modified" and "Run under the LocalSystem account with full privileges", and make sure it is scheduled after the "Add Resources" group (where Advanced Installer adds it by default).

VSDBCMD remotely

Can I run the VSDBCMD command remotely, i.e. without copying the files to the SQL server? I am trying to create a .dbschema file to use as a reference in a database project.
I tried to run the command on my machine, and I get the following error: "TSD An error was received from SQL Server while attempting to reverse engineer elements of type Microsoft.Data.Schema.Sql.SchemaModel.ISql100DatabaseEncryptionKey: The user does not have permission to perform this action. An unexpected failure occurred: Access to the path 'C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\Pivotal_dev_ed.dbschema' is denied."
Do I need special permissions on the SQL server?
I found the answer: it seems you can run it remotely; you just have to specify the path to the folder where you want the schema to be saved. I got the error mentioned above because I didn't have permission to write on the server, but specifying a path to a folder where I could write solved the problem.
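For reference, a minimal sketch of such an import run from a workstation; the connection string and output path are placeholders:

    VSDBCMD.exe /a:Import /dsp:Sql /cs:"Data Source=MyServer;Initial Catalog=Pivotal_dev_ed;Integrated Security=true" /model:"C:\Schemas\Pivotal_dev_ed.dbschema"

Pointing /model at a writable local folder avoids the access-denied error from the example above.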
