I've read many questions on this such as:
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/da4bdb11-fe42-49db-bb8d-288dd1bb72a2/sqlcmd-vars-in-create-table-script?forum=ssdt
and
How to run different pre and post SSDT publish scripts depending on the deploy profile
What I'm trying to achieve is a way of defining a set of scripts based on the environment being deployed to. The idea is that the environment is passed in as a SQLCMD variable from the Azure DevOps pipeline into a variable called $(ServerName), which I've set up in the SQL Server database project under Properties with a default of 'DEV'.
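For reference, the variable definition in the .sqlproj ends up looking roughly like this (the <Value> element is how SSDT wires the variable up internally; the exact contents may differ in your project):

<ItemGroup>
  <SqlCmdVariable Include="ServerName">
    <DefaultValue>DEV</DefaultValue>
    <Value>$(SqlCmdVar__1)</Value>
  </SqlCmdVariable>
</ItemGroup>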
This is then used in the post deployment script like this:
:r .\PostDeploymentScripts\$(ServerName)\index.sql
This should therefore pick up the correct index.sql file based on the $(ServerName) variable. When I tested this by publishing, entering 'QA' for the $(ServerName) variable and generating the script, it was still including the 'DEV' scripts, even though the top of the generated script showed the variable had been set correctly.
How do I get the post deployment script to reference the $(ServerName) variable correctly so I can dynamically set the correct reference path?
Contrary to this otherwise helpful post (https://stackoverflow.com/a/54482350/11035005), it appears that the :r directive is evaluated at build time and the referenced file is inlined into the DACPAC before the XML publish profiles are even evaluated, so this is not possible as described there.
The values used for the :r path are the defaults or local values from the build configuration, and they can only be controlled from there.
I posted this in the Jenkins users Google group, but thought I'd post it here too.
I have a Jenkins Pipeline job, and in its Configuration page, I use "Pipeline script from SCM" as my pipeline. One of this block's parameters is "Branch to build", of course. How can I use an environment variable for that text field? I tried, for example, $branchToBuild, ${branchToBuild} and "${branchToBuild}", and it just takes those as literal values and does not interpolate the string. I do have that variable defined and use it in other jobs.
Someone suggested using ${env.branchToBuild}, so I tried env.branchToBuild, $env.branchToBuild, ${env.branchToBuild}, and "${env.branchToBuild}", all to no avail; they are also just taken as literal strings and not interpolated.
Is it just not possible to do this?
You have to uncheck the Lightweight checkout box in order to use a variable as the branch name to build.
It's a known Jenkins issue; here is more information: How to pass project parameter as branch name to build in Jenkins
Apparently the code path is very different if you are using the lightweight checkout, and that has not been resolved, apparently.
Another source: https://cleverbuilder.com/notes/jenkins-dynamic-git-branch/
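For example, with a string parameter named branchToBuild defined on the job (the name here is just the one from the question), the SCM section of the configuration ends up looking like:

Branch Specifier: */${branchToBuild}
Lightweight checkout: unchecked

With the Lightweight checkout box still ticked, the same ${branchToBuild} value is taken literally instead of being expanded.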
I searched a lot on the web; almost all links say to define custom JVM variables in jvm.options, so I placed them in ${server.config.dir}/jvm.options. For example, I added a variable called -DAPP_ENV=PROD. But it comes back as null after server startup.
Any idea?
It looks like you want to define an environment variable, so you have two options.
1. Use an Environment variable
In this case, you can define an environment variable (like $PATH) and load it in your app. Note this is not a JVM argument; it will be set by the bin/server shell script used to start the server.
In the file: ${server.config.dir}/server.env
Add the following line: APP_ENV=PROD
Access the value with:
System.getenv("APP_ENV"); -> PROD
2. Use a System property
This is what you are trying to do, so I am not sure why it doesn't work for you, but here's how:
In the file: ${server.config.dir}/jvm.options
Add the following line: -DAPP_ENV=PROD
Access the value with:
System.getProperty("APP_ENV"); -> PROD
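A quick way to sanity-check both mechanisms from application code (the class name is just illustrative):

public class EnvCheck {
    public static void main(String[] args) {
        // set via ${server.config.dir}/server.env (environment variable)
        String fromEnv = System.getenv("APP_ENV");
        // set via ${server.config.dir}/jvm.options (-DAPP_ENV=PROD system property)
        String fromProperty = System.getProperty("APP_ENV");
        System.out.println("env: " + fromEnv + ", property: " + fromProperty);
    }
}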
Note that in both cases these values are set at server start-up and are not changed dynamically (most Liberty configuration is dynamic). The JVM options and environment are sourced and set by the start script, so a restart is required if you want to change either one.
My personal recommendation is to go the server.env route - it's more generic and (to me) feels more appropriate, since you are trying to influence the execution environment of the process rather than defining behaviors or configuration of the JVM.
I'm trying to understand variables in TeamCity. My understanding is that there are 3 kinds of variables (system, env, config).
But in the JetBrains documentation I saw more variables; it looks like agent variables and server-side variables are separate.
In TeamCity's Parameters section, however, when I select the "kind" (config, system or env), all kinds of values are populated (I expected only the relevant values to appear).
I'm not really clear on when to use which variable. Does TeamCity have 6 kinds of parameters (server-side: env, sys, config and agent: env, sys, config)?
There are three types of parameters; they differ in the way they can be used in a build:
env parameters are passed to the build process (spawned by TeamCity) as environment variables
sys parameters set tool-specific variables (and are therefore passed to the build scripts of the supported runners)
config parameters are meant to be used for build configuration customization
There are predefined parameters exposing server build properties, agent properties, agent build properties, etc. These parameters are passed to the build as system parameters; some of them are also copied to environment variables.
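For example, in a command-line build step each kind is referenced with the %...% syntax (the parameter names below are made up); env parameters are additionally visible to the spawned process as ordinary environment variables:

echo "config param: %my.deploy.target%"
echo "system param: %system.deploy.target%"
echo "env param:    %env.DEPLOY_TARGET%"
echo "same env param, read by the shell: $DEPLOY_TARGET"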
In addition, parameters might be defined
for a certain build via the "Run Custom Build" dialog
in the Parameters section of a build configuration/project or a build configuration/project template
in buildAgent.properties file on the agent
More details might be found in the docs.
Is it possible to deploy different sets of seed data for different publish profiles using visual studio Sql Server Data tools database project?
We know you can deploy seed data using a post deployment script.
We know you can deploy to different environments using the publish profiles facility.
What we don't know is how you can deploy different seed data to the different environments.
Why would we want to do this?
We want to be able to do this so we can have a small explicit set of seed data for unit testing against.
We need a wider set of data to deploy to the test team's environment for the test team to test the whole application against.
We need a specific set of seed data for the pre-prod environment.
There are a few ways you can achieve this. The first approach is to check for the environment in the post-deploy script, such as:
IF @@SERVERNAME = 'dev_server'
BEGIN
    -- insert data here
END
A slightly cleaner version is to have a different script file for each environment and include them via the :r SQLCMD directive, so you could have:
PostDeploy.sql
DevServer.sql
QAServer.sql
then
IF @@SERVERNAME = 'dev_server'
BEGIN
:r DevServer.sql
END

IF @@SERVERNAME = 'qa_server'
BEGIN
:r QAServer.sql
END
You will need to make sure the paths to the .sql files are correct and that you copy them with the dacpac.
You don't have to use @@SERVERNAME; you can use SQLCMD variables and pass them in for each environment, which again is a little cleaner than hardcoded server names.
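A rough sketch of that variant (the $(DeployEnv) variable name is made up; you would define it in the project and give it a value per publish profile or on the SqlPackage command line):

IF '$(DeployEnv)' = 'QA'
BEGIN
:r QAServer.sql
END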
The second approach is to modify the dacpac to replace the post-deploy script with your environment-specific one. This is my preferred approach and works best as part of a CI build; my process is:
Check-in changes
Build server builds the dacpac
Build takes the dacpac and copies it to the dev, qa, prod, etc. environment folders
Build replaces the post-deploy script in each with the environment-specific script
I call the scripts PostDeploy.dev.sql, PostDeploy.Qa.sql, etc. and set their Build action to "None", or they are added as "Script, Not in Build".
To replace the post-deploy script you just need to use the .NET Packaging API; for some examples, take a look at my Dir2Dac demo, which does that and more:
https://github.com/GoEddie/Dir2Dac
more specifically:
https://github.com/GoEddie/Dir2Dac/blob/master/src/Dir2Dac/DacCreator.cs
// create a package part to hold the new post-deploy script
var part = package.CreatePart(new Uri("/postdeploy.sql", UriKind.Relative), "text/plain");

// copy the environment-specific script into that part
using (var reader = new StreamReader(_postDeployScript))
{
    reader.BaseStream.CopyTo(part.GetStream(FileMode.OpenOrCreate, FileAccess.ReadWrite));
}
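For context, a rough self-contained sketch of the same idea (not the exact Dir2Dac code; the method and variable names are made up) using System.IO.Packaging, since a dacpac is just an Open Packaging Conventions (zip) package:

using System;
using System.IO;
using System.IO.Packaging; // reference WindowsBase / the System.IO.Packaging package

public static class PostDeploySwapper
{
    public static void Replace(string dacpacPath, string envScriptPath)
    {
        using (var package = Package.Open(dacpacPath, FileMode.Open, FileAccess.ReadWrite))
        {
            var partUri = new Uri("/postdeploy.sql", UriKind.Relative);
            if (package.PartExists(partUri))
                package.DeletePart(partUri); // drop the script baked in at build time
            var part = package.CreatePart(partUri, "text/plain");
            using (var source = File.OpenRead(envScriptPath))
            using (var target = part.GetStream(FileMode.Create, FileAccess.Write))
            {
                source.CopyTo(target);
            }
        }
    }
}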
I have solved this by writing a PowerShell script that is executed automatically when publishing, via an Exec Command in the project file.
It creates a script file, which includes all scripts found in a folder in the project (the folder is named like the target Environment).
This script is then included in the post-deploy script.
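A minimal sketch of what such a script could look like (the folder layout, file names and parameter are assumptions, not the author's actual script):

param([string]$TargetEnv = "DEV")

# collect every .sql file in the folder named after the target environment
$folder = Join-Path $PSScriptRoot $TargetEnv
$output = Join-Path $PSScriptRoot "EnvironmentScripts.sql"

Get-ChildItem -Path $folder -Filter *.sql |
    ForEach-Object { ":r .\$TargetEnv\$($_.Name)" } |
    Set-Content -Path $output

# the post-deploy script then contains a single line:  :r .\EnvironmentScripts.sql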
I have a Test::Unit run configuration and, at runtime, I would like a prompt in which I enter an environment (e.g. platform, staging or production) so that the test runs in the specified environment. The related links for all environments are in environments.yml in the codebase.
Right now, if I manually add an environment variable called DOMAIN and assign it the value 'platform', the test runs in the specified environment.
To do what I need, I have tried the following so far:
Created a shell script which sets the DOMAIN env variable (see the sketch after these steps)
In Before launch in IDEA, I added an external tool
That tool calls the script created in step 1. A prompt is created which takes in the environment to run the test against, and that environment is passed as a parameter to the shell script.
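For illustration, the script from step 1 is presumably something along these lines (names are assumptions):

#!/bin/sh
# expose the environment entered at the prompt as DOMAIN
export DOMAIN="$1"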
The problem is, IDEA does not pick up the change. Is there an easy way of doing this?