Using environment variables in flywayMigrate with IntelliJ + Gradle

I configured Flyway to use environment variables like so:
flyway.url=${JDBC_DATABASE_URL}
flyway.locations=filesystem:db/migrations
Running JDBC_DATABASE_URL=... ./gradlew flywayMigrate works as expected. Similarly, set -a && source dev.env && ./gradlew flywayMigrate works as expected.
The challenge is using IntelliJ. I've set the environment variables with the command set -a && source dev.env && idea. When I run the flywayMigrate Gradle task, it is not able to read the env variables.
However the following code is working as expected
public class TestEnv {
    public static void main(String[] args) {
        System.out.println(System.getenv("JDBC_DATABASE_URL"));
    }
}
What am I missing?
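For context, the kind of ${VAR} substitution that config entries like flyway.url=${JDBC_DATABASE_URL} rely on can be sketched in plain Java. This is a toy resolver for illustration only, not Flyway's actual implementation:

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PlaceholderDemo {
    // Resolve ${NAME} placeholders against a map of environment variables.
    // Unknown placeholders are left untouched.
    static String resolve(String value, Map<String, String> env) {
        Matcher m = Pattern.compile("\\$\\{([^}]+)\\}").matcher(value);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String replacement = env.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(sb, Matcher.quoteReplacement(replacement));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> env = Map.of("JDBC_DATABASE_URL", "jdbc:postgresql://localhost/dev");
        System.out.println(resolve("${JDBC_DATABASE_URL}", env));
        // prints: jdbc:postgresql://localhost/dev
    }
}
```

The point is that substitution happens against the environment of the process that runs Gradle, which is why launching IntelliJ from a shell that has sourced dev.env matters.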

Related

How can I run my grails-5 app's cucumber tests as part of ./gradlew check?

I am writing a Grails-5 app and I am using Cucumber for BDD. I've followed the tutorial at: https://www.baeldung.com/java-cucumber-gradle
I can run my unit tests using:
$ ./gradlew check
And I can run my cucumber BDD tests by starting the server in one shell:
$ ./gradlew server:bootRun
And invoking the tests in another:
$ ./gradlew cucumberCli
Is it possible to configure build.gradle in such a way as to have ./gradlew check run the unit tests, then start the server, then run the cucumber tests, and finally bring the server back down?
If at all possible, it would be even better if the cucumber infrastructure could start and stop the server between each test. That way each test would start in a known state.
I managed to get this working by adding @Before and @After hooks to my StepDefinitions.groovy file:
def serverProcess

private String getBaseUrl() {
    return "http://localhost/"
}

@Before
public void startServer() {
    try {
        serverProcess = Runtime.getRuntime().exec("../gradlew bootRun")
    } catch (IOException e) {
        e.printStackTrace()
    }
    def done = false
    while (!done) {
        try {
            done = new URL(getBaseUrl()).openConnection().getResponseCode() == 200
        } catch (Exception e) {
            Thread.sleep(500)
        }
    }
}

@After
public void stopServer() {
    serverProcess.destroy()
}
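One caveat: the busy-wait above loops forever if the server never comes up. A bounded variant could look like the following plain-Java sketch (the port and timeout are illustrative, not taken from the question):

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class ReadinessProbe {
    // Poll a URL until it answers 200 OK or the deadline passes.
    static boolean waitForOk(String url, long timeoutMillis) throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            try {
                HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
                conn.setConnectTimeout(1000);
                conn.setReadTimeout(1000);
                if (conn.getResponseCode() == 200) {
                    return true;
                }
            } catch (Exception ignored) {
                // server not up yet; fall through and retry
            }
            Thread.sleep(250);
        }
        return false;
    }

    public static void main(String[] args) throws InterruptedException {
        // Nothing listens on this port, so this returns false once the timeout expires.
        System.out.println(waitForOk("http://localhost:59999/", 1000));
    }
}
```

Returning false instead of hanging lets the test fail fast with a clear message when bootRun dies on startup.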

Spring Config Not Picking Up Env Variable

I have an env variable $MONGODB_URI
echo $MONGODB_URI
mongodb://localhost:27017
My application.properties
mongoUri=${MONGODB_URI}
My Config.java
@Configuration
public class Config {

    @Value("${mongoUri}")
    private String mongoUri;

    ...
}
When I try to start up the app in IntelliJ IDEA, I get
Could not resolve placeholder 'MONGODB_URI' in value "${MONGODB_URI}"
The app starts up fine with
./gradlew bootRun
How can I properly configure IntelliJ to read from the environment? I'll need to swap out the db url depending on if it's prod, local, etc.
You need to run your application with the environment variable set in the Run/Debug Configuration (the Environment variables field).
Sample code to verify:
@RestController
public class StatusController {

    private final Environment environment;

    public StatusController(Environment environment) {
        this.environment = environment;
    }

    @GetMapping("/env")
    public String envValue() {
        return environment.getProperty("MONGODB_URI");
    }
}
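Alternatively, Spring's placeholder syntax accepts a default after a colon, so the app can still start when the variable is absent. A sketch, reusing the localhost URI from the question as the fallback:

```properties
mongoUri=${MONGODB_URI:mongodb://localhost:27017}
```

With this in application.properties, MONGODB_URI overrides the default when set (e.g. in prod), and local runs work without any configuration.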

Pass Docker arguments to Spring boot properties

In a project I've created locally to test passing Docker arguments to my Spring Boot application.properties, I have in application.properties: test.name=${name}
and in the application
@SpringBootApplication
public class RestServiceApplication {

    @Value("${test.name}")
    private String test;

    public static void main(String[] args) {
        SpringApplication.run(RestServiceApplication.class, args);
    }

    @Bean(name = "test")
    public String getTest() {
        return this.test;
    }
}
My Dockerfile:
FROM openjdk:8-jdk-alpine
ARG NAME
ENV TEST_NAME=${NAME}
RUN echo $TEST_NAME
ARG JAR_FILE=target/*.jar
RUN echo $JAR_FILE
ADD ${JAR_FILE} app.jar
ENTRYPOINT ["java","-jar","/app.jar"]
Building it with sudo docker build -t gs-rest-service --build-arg NAME=Alex . works fine locally.
Now on a different project my application.properties:
bulk.api.username=${BULK_USERNAME}
bulk.api.password=${BULK_PASSWORD}
and my Dockerfile:
FROM maven:3.6.0-jdk-8 AS build
ARG ARTIFACTORY_USER
ARG ARTIFACTORY_PASSWORD
WORKDIR /build
COPY . .
RUN mvn -s settings.xml clean package
ARG BULK_USERNAME
ARG BULK_PASSWORD
ENV BULK_API_USERNAME=${BULK_USERNAME}
ENV BULK_API_PASSWORD=${BULK_PASSWORD}
RUN echo $BULK_API_PASSWORD
FROM openjdk:8-jre-alpine
WORKDIR /opt/cd-graph-import
COPY --from=build /build/target/abc-svc.jar .
ENTRYPOINT ["java", "-jar", "abc-svc.jar"]
My spring boot class:
@SpringBootApplication
public class SpringBootApplication {

    @Value("${bulk.api.username}")
    private String bulkApiUsername;

    @Value("${bulk.api.password}")
    private String bulkApiPassword;

    public static void main(String[] args) {
        SpringApplication.run(SpringBootApplication.class, args);
    }

    @Bean
    public RestTemplate restTemplate() {
        return new RestTemplateBuilder().basicAuthentication(bulkApiUsername, bulkApiPassword).build();
    }

    @Bean(name = "bulkApiUrl")
    public String getBulkApiUrl() {
        return this.bulkApiUrl;
    }
}
When this runs in gitlab with:
docker build -t $SERVICE_IMAGE --build-arg ARTIFACTORY_USER=$ARTIFACTORY_USER --build-arg ARTIFACTORY_PASSWORD=$ARTIFACTORY_PASSWORD --build-arg BULK_USERNAME=$BULKAPI_USERNAME --build-arg BULK_PASSWORD=$BULKAPI_PASSWORD .
I see that $BULK_API_PASSWORD is set properly, but when running the application I get the error: Error creating bean with name 'someApplication': Injection of autowired dependencies failed; nested exception is java.lang.IllegalArgumentException: Could not resolve placeholder 'BULK_USERNAME' in value "${BULK_USERNAME}"
What am I missing?
According to the Spring Boot application-properties docs, you can set application properties through environment variables. The environment variable should have the same name as the property, but with '_' instead of '.' and upper-case letters. For example, to set test.name you set the TEST_NAME environment variable.
In the first example, you set the TEST_NAME env variable in your Dockerfile. When your application starts, Spring reads the property from the environment (Spring reads TEST_NAME, not the NAME build argument) and that value replaces the default "${name}". Note that Spring does not inject the NAME variable; it resolves the property test.name from the TEST_NAME env variable.
In the second case, the BULK_API_USERNAME and BULK_API_PASSWORD env variables are not set in the final image (your ENV lines are in the build stage, and the runtime stage starts from a fresh openjdk:8-jre-alpine base), so Spring cannot resolve ${BULK_USERNAME} and ${BULK_PASSWORD}. You should set BULK_API_USERNAME and BULK_API_PASSWORD in the environment of the running container to pass your values to the app.
You can also leave the property values empty in application.properties: bulk.api.username= and bulk.api.password=.
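The naming rule this answer relies on can be sketched as an illustrative helper (this is not Spring's actual binder, just the dots-to-underscores convention it documents):

```java
public class RelaxedBinding {
    // Map a Spring property name to its canonical environment-variable form:
    // dots and dashes become underscores, letters become upper case.
    static String toEnvName(String propertyName) {
        return propertyName.replace('.', '_').replace('-', '_').toUpperCase();
    }

    public static void main(String[] args) {
        System.out.println(toEnvName("bulk.api.username")); // BULK_API_USERNAME
        System.out.println(toEnvName("test.name"));         // TEST_NAME
    }
}
```

So a container started with -e BULK_API_USERNAME=... -e BULK_API_PASSWORD=... supplies exactly the names Spring looks for when binding bulk.api.username and bulk.api.password.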

Why won't Gradle pass runtime args to Groovy?

Gradle 2.14 and Groovy 2.4.7 here. I have the following Groovy:
@Slf4j
class Driver {
    static void main(String[] args) {
        log.info("Fizz prop value is: ${System.properties['fizz']}.")
    }
}
I'm using the Gradle Application plugin. When I run:
./gradlew run -Pfizz=buzz
I get the following console output:
[main] INFO com.me.myapp.Driver - Fizz prop value is: null.
Why does my app think the fizz property is null, when I am passing its value as buzz on the command-line?
You are passing it with -P, which sets a Gradle project property.
Try passing it with -D, which sets a JVM system property.
Alternatively, try it without System.properties, still passing -Pfizz=buzz:
log.info("Fizz prop value is: ${fizz}.")
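The distinction between the two flags can be seen with a minimal Java sketch (the "null" fallback string is just for display):

```java
public class SysPropCheck {
    // -Dfizz=buzz sets a JVM system property, visible here.
    // -Pfizz=buzz sets a Gradle project property, visible only to the build
    // script, so the application sees nothing unless the build forwards it.
    static String fizz() {
        return System.getProperty("fizz", "null");
    }

    public static void main(String[] args) {
        System.out.println("Fizz prop value is: " + fizz() + ".");
    }
}
```

If the value really must travel via -P, the build script has to forward it to the run task, e.g. with a (hypothetical) run { systemProperty 'fizz', findProperty('fizz') } block in build.gradle.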

Command 'hadoop jar' Doesn't Take -Dfile.encoding=UTF-8?

Consider the following main class for map-reduce job:
public class App extends Configured implements Tool {

    public static void main(String[] args) throws Exception {
        ToolRunner.run(new App(), args);
    }

    @Override
    public int run(String[] args) throws Exception {
        System.out.println(Charset.defaultCharset().toString());
        return 0;
    }
}
When run from an interactive shell, it outputs 'UTF-8'. When run from crontab, it outputs 'US-ASCII'.
Using 'java -Dfile.encoding=UTF-8 -jar xxx.jar' works fine in crontab. However, the 'hadoop jar' command doesn't take this parameter:
hadoop jar xxx.jar -Dfile.encoding=UTF-8
In crontab, it still outputs US-ASCII.
One solution is to export an LC_ALL env variable:
0 * * * * (export LC_ALL=en_US.UTF-8; hadoop jar xxx.jar)
Is there another way?
Update
Another env variable I find useful is HADOOP_OPTS:
0 * * * * (export HADOOP_OPTS="-Dfile.encoding=UTF-8"; hadoop jar xxx.jar)
Try setting the environment variable HADOOP_OPTS to contain arguments like this; they are really arguments to java. See the bin/hadoop script: it adds these to the java command.
We found that the problem was that the mapper Java processes didn't have -Dfile.encoding=UTF-8. We had to add that to "mapreduce.map.java.opts", and the same for "mapreduce.reduce.java.opts".
You can do it in the XML config files, as well as in Java, like:
config.set("mapreduce.map.java.opts","-Xmx1843M -Dfile.encoding=UTF-8");
See http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/ClusterSetup.html for config details.
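A quick diagnostic sketch for verifying which encoding a given JVM invocation actually picked up (run it under cron, under HADOOP_OPTS, etc. and compare):

```java
import java.nio.charset.Charset;

public class EncodingCheck {
    public static void main(String[] args) {
        // file.encoding is read once at JVM startup, which is why it must
        // arrive on the java command line (via HADOOP_OPTS or the
        // mapreduce.*.java.opts settings) rather than be changed at runtime.
        System.out.println(System.getProperty("file.encoding"));
        System.out.println(Charset.defaultCharset());
    }
}
```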