In BW6 profile files (.substvar) we need to use substitution parameters that the CI/CD platform replaces just before deployment to any environment.
For example, if we have three environments (dev, test, production), each with its own SFTP access, we will have three profile files, one per environment, and the developer needs to put placeholder values such as #sftpHost# or #port# in the profiles so the CI/CD platform can replace them for each environment.
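For context, the replacement step on the CI/CD side can be a plain token substitution over the .substvar file. Here is a minimal sketch in Python; the file name, placeholder names, and values are hypothetical, not taken from the question:

import re
import sys

# Hypothetical environment-specific values for the #name# placeholders.
values = {"sftpHost": "sftp.dev.example.com", "port": "22"}

def substitute(path):
    text = open(path, encoding="utf-8").read()
    # Replace each #name# token that has a value; leave unknown tokens untouched.
    text = re.sub(r"#(\w+)#", lambda m: values.get(m.group(1), m.group(0)), text)
    open(path, "w", encoding="utf-8").write(text)

substitute(sys.argv[1])  # e.g. python substitute.py dev.substvar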
My problem is with non-string properties, for example integers or passwords: how do I deal with them? For example, I can't use #port# in an integer field in Business Studio; when I open the profile as XML it works normally, but in Business Studio I can't set it.
Is there any best practice to deal with this?
If you click on the value, you should see a "config" icon. Click on Container Configuration.
Once you click on it, the field will convert to "String" and you can enter the name of the variable.
In Quarkus, you can define environment variables that act as configuration for your Quarkus app. For example:
DATABASE_PASSWORD=test123
Outside the Quarkus app, there is a file that contains a password. The content of this file (the password) needs to be set as a property in the Quarkus app.
Would it be possible to define an environment variable (i.e. a property) whose value comes from running a 'cat' command on that file (the Quarkus app runs in a Linux environment), so that the file's content is stored in that environment variable?
e.g.
DATABASE_PASSWORD=`cat /var/common/database_secret.txt`
Is this possible? If not, would anyone be able to suggest an alternative solution (where we need a property whose value comes from the contents of a file)?
Many thanks.
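One way the idea in the question could be realized is with a small launcher that reads the file and starts the app with the variable set. A hypothetical sketch in Python (the secret path is the question's example; the jar path assumes a standard Quarkus fast-jar build and is not from the question):

import os
import subprocess

# Read the secret from the file and strip the trailing newline.
with open("/var/common/database_secret.txt") as f:
    secret = f.read().strip()

# Start the Quarkus app with DATABASE_PASSWORD present in its environment.
env = dict(os.environ, DATABASE_PASSWORD=secret)
subprocess.run(["java", "-jar", "target/quarkus-app/quarkus-run.jar"], env=env, check=True)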
We currently set the path of a properties file containing the secret/access key in the Credentials File property of AWSCredentialsProviderControllerService. The issue is that we change this properties path for prod and non-prod each time we run the NiFi workflow. We are trying to arrive at a setup with no change to the Credentials File configuration, so that the access/secret key is read regardless of prod or non-prod.

Since the Credentials File property does not support NiFi Expression Language, we tried to use ACCESS KEY/SECRET properties like ${ENV:equalsIgnoreCase("prod"):ifElse(${ACCESS_PROD},${ACCESS_NONPROD})}. The issue we are facing is that we are not able to store these access/secret keys in the registry, hence we are unable to implement this change.

Is there any way to read the access/secret key regardless of environment in NiFi? Currently we use one property file for non-prod NiFi and a second property file for prod. In this setup we have to manually change the Credentials File path when switching between prod and non-prod. We are trying to make this work seamlessly, without changing the path of the credentials file. Is there any way to make this happen?
The processor that uses the AWSCredentialsProviderControllerService does not support parameters or variables, but the controller service's "Credentials File" property does support Parameter Context entries; make use of this for your solution.
Example:
Trigger something --> RouteOnAttribute --> if prod (run ExecuteStreamCommand and change the Parameter Context value to point to the prod credentials file), else if dev (run ExecuteStreamCommand and change the Parameter Context value to point to the dev credentials file) --> then run your AWS processor.
You can use the toolkit client to edit the parameter context, or even the nipyapi Python module. It will not be fast though.
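As an illustration, here is a rough sketch of flipping that Parameter Context entry with the nipyapi Python module (assuming nipyapi 0.14+ with its parameters helpers; the endpoint, context name, parameter name, and file paths are hypothetical):

import nipyapi

# Point the client at the NiFi REST API (adjust URL and auth for your cluster).
nipyapi.utils.set_endpoint("http://localhost:8080/nifi-api")

# Look up the Parameter Context that holds the credentials-file path.
context = nipyapi.parameters.get_parameter_context("aws-params", identifier_type="name")

# Upsert the parameter so the controller service picks up the prod file.
param = nipyapi.parameters.prepare_parameter(
    "credential_file_path", "/opt/nifi/conf/prod_credentials.properties")
nipyapi.parameters.upsert_parameter_to_context(context, param)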
I've read many questions on this such as:
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/da4bdb11-fe42-49db-bb8d-288dd1bb72a2/sqlcmd-vars-in-create-table-script?forum=ssdt
and
How to run different pre and post SSDT publish scripts depending on the deploy profile
What I'm trying to achieve is a way of defining a set of scripts based on the environment being deployed to. The idea is that the environment is passed in as a SQLCMD variable from the Azure DevOps pipeline into a variable called $(ServerName), which I've set up in the SQL Server database project properties with a default of 'DEV'.
This is then used in the post deployment script like this:
:r .\PostDeploymentScripts\$(ServerName)\index.sql
This should therefore pick up the correct index.sql file based on the $(ServerName) variable. When testing this by publishing, entering 'QA' for the $(ServerName) variable, and generating the script, it still contained the 'DEV' scripts, even though the top of the generated script showed the variable had been set correctly.
How do I get the post deployment script to reference the $(ServerName) variable correctly so I can dynamically set the correct reference path?
Contrary to this nice post: https://stackoverflow.com/a/54482350/11035005, it appears that the :r directive is evaluated at compile time and inserted into the DACPAC before the XML publish profiles are even evaluated, so this is not possible as explained.
The values used are the defaults or locals from the build configuration and can only be controlled from there.
I am trying to use the Google Cloud Natural Language API.
I already have a running Google Cloud account.
I enabled the Cloud Natural Language API service, generated service account keys, and downloaded them locally.
I am using Google's default sample program:
import com.google.cloud.language.v1.Document;
import com.google.cloud.language.v1.Document.Type;
import com.google.cloud.language.v1.LanguageServiceClient;
import com.google.cloud.language.v1.Sentiment;

// Creating the client is where Application Default Credentials are resolved.
LanguageServiceClient language = LanguageServiceClient.create();
// The text to analyze
String text = "My stay at this hotel was not so good";
Document doc = Document.newBuilder().setContent(text).setType(Type.PLAIN_TEXT).build();
// Detects the sentiment of the text
Sentiment sentiment = language.analyzeSentiment(doc).getDocumentSentiment();
System.out.printf("Text: %s%n", text);
System.out.printf("Sentiment: %s, %s%n", sentiment.getScore(), sentiment.getMagnitude());
I am using Eclipse as my IDE on a Mac.
When I run the application I get this error:
java.io.IOException: The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.
I even added GOOGLE_APPLICATION_CREDENTIALS as an export in the Terminal, and "printenv" shows the path like this:
GOOGLE_APPLICATION_CREDENTIALS=/Users/temp/Downloads/Sentiment-0e556940c1d8.json
Still it wasn't working. By trial and error I found out that in Eclipse we can configure the run:
there I added the environment variable, and after that the program works fine.
Now my problem is that I am implementing that code inside a J2EE project, and the EAR file is deployed to WildFly.
I am again getting the same error, and I don't know where to set the environment variable in WildFly.
Finally I found a way to set up GOOGLE_APPLICATION_CREDENTIALS as an environment variable inside WildFly.
If you are running the server through Eclipse:
Open the WildFly server settings by double-clicking your server in the Servers tab.
Click "Open Launch Configuration".
Move to the "Environment" tab and add the new variable as a key-value pair, e.g.
GOOGLE_APPLICATION_CREDENTIALS /Users/temp/Downloads/Sentiment-0e556940c1d8.json
If you are running the server from the terminal:
By default WildFly reads additional settings from the standalone.conf file.
Just open wildfly/bin/standalone.conf and add the following line (the file is sourced as a shell script, so the variable must be exported to reach the JVM):
export GOOGLE_APPLICATION_CREDENTIALS=/Users/temp/Downloads/Sentiment-0e556940c1d8.json
That's it. You are good to go.
Is it possible to deploy different sets of seed data for different publish profiles using a Visual Studio SQL Server Data Tools database project?
We know you can deploy seed data using a post deployment script.
We know you can deploy to different environments using the publish profiles facility.
What we don't know is how you can deploy different seed data to the different environments.
Why would we want to do this?
We want a small, explicit set of seed data to unit test against.
We need a wider set of data in the test team's environment so they can test the whole application.
We need a specific set of seed data for the pre-prod environment.
There are a few ways you can achieve this. The first approach is to check for the environment in the post-deploy script, such as:
if @@servername = 'dev_server'
begin
-- insert data here
end
A slightly cleaner version is to have different script files for each environment and import them via the :r SQLCMD include directive, so you could have:
PostDeploy.sql
DevServer.sql
QAServer.sql
then
if @@servername = 'dev_server'
begin
:r DevServer.sql
end
if @@servername = 'qa_server'
begin
:r QAServer.sql
end
You will need to make sure the paths to the .sql files are correct and that you copy them alongside the dacpac.
You don't have to use @@servername; you can use SQLCMD variables and pass one in for each environment (e.g. if '$(DeployEnv)' = 'dev'), which again is a little cleaner than hardcoding server names.
The second approach is to modify the dacpac to replace the post-deploy script with your environment-specific one. This is my preferred approach and works best as part of a CI build; my process is:
Check-in changes
Build Server builds dacpac
Build takes the dacpac and copies it to the dev, qa, prod, etc. environment folders
Build replaces the post-deploy script in each with the env specific script
I call the scripts PostDeploy.dev.sql, PostDeploy.Qa.sql, etc. and set their Build Action to "None" (or they are added as "Script, Not in Build").
To replace the post-deploy script you just need to use the .NET Packaging API; for some examples take a look at my Dir2Dac demo, which does that and more:
https://github.com/GoEddie/Dir2Dac
more specifically:
https://github.com/GoEddie/Dir2Dac/blob/master/src/Dir2Dac/DacCreator.cs
// Create the package part that holds the post-deploy script inside the dacpac.
var part = package.CreatePart(new Uri("/postdeploy.sql", UriKind.Relative), "text/plain");
using (var reader = new StreamReader(_postDeployScript))
{
    // Copy the environment-specific script into the new part.
    reader.BaseStream.CopyTo(part.GetStream(FileMode.OpenOrCreate, FileAccess.ReadWrite));
}
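Since a dacpac is a zip-style package, the same replacement can also be scripted outside .NET. A minimal sketch in Python, assuming the post-deploy part is stored as postdeploy.sql (as in the snippet above); the file names are illustrative:

import zipfile

def replace_post_deploy(dacpac_path, script_path, out_path):
    # Copy every part except the old post-deploy script, then write the
    # environment-specific script in its place.
    with zipfile.ZipFile(dacpac_path) as src, zipfile.ZipFile(out_path, "w") as dst:
        for item in src.infolist():
            if item.filename.lower() != "postdeploy.sql":
                dst.writestr(item, src.read(item.filename))
        dst.write(script_path, "postdeploy.sql")

replace_post_deploy("MyDb.dacpac", "PostDeploy.Qa.sql", "MyDb.Qa.dacpac")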
I solved this by writing a PowerShell script that is executed automatically when publishing, via an Exec Command in the project file.
It creates a script file that includes all scripts found in a folder in the project (the folder is named after the target environment).
This script is then included in the post-deploy script.
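The answer used PowerShell; purely as an illustration of the concatenation step, here is a hypothetical sketch in Python (the folder layout and file names are assumptions, not from the answer):

import sys
from pathlib import Path

# Concatenate every .sql script in the folder named after the target
# environment into one file that the post-deploy script pulls in via :r.
env = sys.argv[1]  # e.g. "QA"
scripts = sorted(Path("SeedData", env).glob("*.sql"))
combined = "\n".join(s.read_text() for s in scripts)
Path("SeedData", "EnvironmentSeed.sql").write_text(combined)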