In a local project that I created to test passing Docker build arguments through to my Spring Boot application.properties, I have this in application.properties:
test.name=${name}
and in the application
@SpringBootApplication
public class RestServiceApplication {

    @Value("${test.name}")
    private String test;

    public static void main(String[] args) {
        SpringApplication.run(RestServiceApplication.class, args);
    }

    @Bean(name = "test")
    public String getTest() {
        return this.test;
    }
}
My Dockerfile:
FROM openjdk:8-jdk-alpine
ARG NAME
ENV TEST_NAME=${NAME}
RUN echo $TEST_NAME
ARG JAR_FILE=target/*.jar
RUN echo ${JAR_FILE}
ADD ${JAR_FILE} app.jar
ENTRYPOINT ["java","-jar","/app.jar"]
Building it with sudo docker build -t gs-rest-service --build-arg NAME=Alex . works fine locally.
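As a sanity check (assuming the image tag above), the variable can be inspected inside the built image by overriding the entrypoint:
sudo docker run --rm --entrypoint sh gs-rest-service -c 'echo $TEST_NAME'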
Now, in a different project, my application.properties:
bulk.api.username=${BULK_USERNAME}
bulk.api.password=${BULK_PASSWORD}
and my Dockerfile:
FROM maven:3.6.0-jdk-8 AS build
ARG ARTIFACTORY_USER
ARG ARTIFACTORY_PASSWORD
WORKDIR /build
COPY . .
RUN mvn -s settings.xml clean package
ARG BULK_USERNAME
ARG BULK_PASSWORD
ENV BULK_API_USERNAME=${BULK_USERNAME}
ENV BULK_API_PASSWORD=${BULK_PASSWORD}
RUN echo $BULK_API_PASSWORD
FROM openjdk:8-jre-alpine
WORKDIR /opt/cd-graph-import
COPY --from=build /build/target/abc-svc.jar .
ENTRYPOINT ["java", "-jar", "abc-svc.jar"]
My spring boot class:
@SpringBootApplication
public class SpringBootApplication {

    @Value("${bulk.api.username}")
    private String bulkApiUsername;

    @Value("${bulk.api.password}")
    private String bulkApiPassword;

    public static void main(String[] args) {
        SpringApplication.run(SpringBootApplication.class, args);
    }

    @Bean
    public RestTemplate restTemplate() {
        return new RestTemplateBuilder().basicAuthentication(bulkApiUsername, bulkApiPassword).build();
    }

    @Bean(name = "bulkApiUrl")
    public String getBulkApiUrl() {
        return this.bulkApiUrl;
    }
}
When this runs in GitLab with:
docker build -t $SERVICE_IMAGE --build-arg ARTIFACTORY_USER=$ARTIFACTORY_USER --build-arg ARTIFACTORY_PASSWORD=$ARTIFACTORY_PASSWORD --build-arg BULK_USERNAME=$BULKAPI_USERNAME --build-arg BULK_PASSWORD=$BULKAPI_PASSWORD .
I see that $BULK_API_PASSWORD is set properly, but when running the application I get the error:
Error creating bean with name 'someApplication': Injection of autowired dependencies failed; nested exception is java.lang.IllegalArgumentException: Could not resolve placeholder 'BULK_USERNAME' in value "${BULK_USERNAME}"
What am I missing?
According to the Spring Boot application properties docs, you can set application properties via environment variables. The environment variable should have the same name as the property, but when using environment variables it is recommended to use '_' instead of '.'. For example, to set test.name you set the TEST_NAME environment variable.
In the first example, you set the TEST_NAME environment variable in your Dockerfile. When your application starts, Spring reads the property from the environment (Spring picks up TEST_NAME, not the NAME variable) and passes it to the application, replacing your default value "${name}". Note that Spring does not inject the NAME variable; it replaces the test.name property with the value from the environment variable.
In the second case, you haven't set the BULK_API_USERNAME and BULK_API_PASSWORD environment variables in the final image (the ENV instructions sit in the Maven build stage, which is discarded), so Spring cannot replace the defaults and tries to resolve ${BULK_USERNAME} and ${BULK_PASSWORD} literally. You should set BULK_API_USERNAME and BULK_API_PASSWORD to pass your values to the app.
Also, leave the values of those properties empty in application.properties: 'bulk.api.username=' and 'bulk.api.password='.
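A minimal sketch of what that could look like, assuming the same multi-stage Dockerfile as in the question; the ENV lines have to live in the final stage, because environment variables set in the Maven build stage disappear with it (build args also have to be re-declared per stage):
FROM openjdk:8-jre-alpine
ARG BULK_USERNAME
ARG BULK_PASSWORD
ENV BULK_API_USERNAME=${BULK_USERNAME}
ENV BULK_API_PASSWORD=${BULK_PASSWORD}
WORKDIR /opt/cd-graph-import
COPY --from=build /build/target/abc-svc.jar .
ENTRYPOINT ["java", "-jar", "abc-svc.jar"]
Alternatively, skip baking the credentials into the image entirely and pass them when the container starts, for example docker run -e BULK_API_USERNAME=... -e BULK_API_PASSWORD=... $SERVICE_IMAGE, which also avoids leaving secrets in the image layers.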
Related
I configured flyway to use environment variables like so
flyway.url=${JDBC_DATABASE_URL}
flyway.locations=filesystem:db/migrations
Running JDBC_DATABASE_URL=... ./gradlew flyMigrate works as expected. Similarly, set -a && source dev.env && ./gradlew flyMigrate works as expected.
The challenge is using IntelliJ. I've set the environment variables with the command set -a && source dev.env && idea, but when I run the flyMigrate Gradle task it is not able to read the env variables.
However, the following code works as expected:
public class TestEnv {
    public static void main(String[] args) {
        System.out.println(System.getenv("JDBC_DATABASE_URL"));
    }
}
What am I missing?
My app works fine from the IDE. But when I dockerize the app, it can't find the application resource. How do I dockerize the app correctly so that I can access the resource folder? Here is my code:
Dockerfile
FROM eclipse-temurin:17-jre-alpine
RUN mkdir -p /app
COPY target/Toolkit-*.jar /app/toolkit.jar
WORKDIR /app/
ENTRYPOINT java -jar toolkit.jar
I am reading the property like this:
@Log
public class PropertyLoader {

    private static String RESOURCE_FILENAME = "application.properties";
    private static PropertyLoader instance;

    private Properties properties;

    private PropertyLoader() {
        // works fine from the IDE; in Docker, inputStream is null
        try (InputStream inputStream = ClassLoader.getSystemResourceAsStream(RESOURCE_FILENAME)) {
            properties = new Properties();
            properties.load(inputStream);
        } catch (IOException e) {
            log.log(Level.SEVERE, e.getMessage());
        }
    }
}
How do I read application resources in a Docker container?
You need to add application.properties to the image as well. Add this to your Dockerfile:
RUN mkdir -p /app/config
COPY <your file> /app/config/application.properties
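Note that ClassLoader.getSystemResourceAsStream only searches the classpath, and java -jar ignores any extra classpath entries, so with the loader from the question the copied file still has to be reachable. One option (a sketch, assuming the /app/config path from the snippet above) is to try the classpath first and fall back to the external file:
// requires: import java.nio.file.Files; import java.nio.file.Paths;
private PropertyLoader() {
    properties = new Properties();
    // 1) classpath lookup: works when application.properties is packaged inside the jar
    try (InputStream inputStream = ClassLoader.getSystemResourceAsStream(RESOURCE_FILENAME)) {
        if (inputStream != null) {
            properties.load(inputStream);
            return;
        }
    } catch (IOException e) {
        log.log(Level.SEVERE, e.getMessage());
    }
    // 2) fallback: the copy placed into the image by the COPY instruction above
    try (InputStream inputStream = Files.newInputStream(Paths.get("/app/config/application.properties"))) {
        properties.load(inputStream);
    } catch (IOException e) {
        log.log(Level.SEVERE, e.getMessage());
    }
}
Alternatively, putting application.properties under src/main/resources makes Maven package it inside the jar, and the original classpath lookup then works unchanged.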
I have two issues trying to connect to DynamoDB in AWS. It works locally:
@Configuration
class DynamoDbConfig {

    @Value("${amazon.access.key}")
    private String awsAccessKey;

    @Value("${amazon.access.secret.key}")
    private String awsSecretKey;

    @Value("${amazon.dynamodb.endpoint}")
    private String awsDynamoDBEndPoint;

    @Value("${amazon.dynamodb.region}")
    private String awsDynamoDBRegion;

    @Bean
    public AWSCredentials amazonAWSCredentials() {
        return new BasicAWSCredentials(awsAccessKey, awsSecretKey);
    }

    public AWSCredentialsProvider amazonAWSCredentialsProvider() {
        return new AWSStaticCredentialsProvider(amazonAWSCredentials());
    }

    @Bean
    public DynamoDB dynamoDB() {
        AmazonDynamoDB amazonDynamoDB = AmazonDynamoDBClientBuilder.standard()
                .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(awsDynamoDBEndPoint, awsDynamoDBRegion))
                .withCredentials(amazonAWSCredentialsProvider())
                .build();
        return new DynamoDB(amazonDynamoDB);
    }
}
application-local.properties
amazon.access.key=key1
amazon.access.secret.key=key2
amazon.dynamodb.endpoint=http://localhost:8000
amazon.dynamodb.region=us-east-1
application-prod.properties
amazon.access.key=${AWS_ACCESS_KEY_ID}
amazon.access.secret.key=${AWS_SECRET_ACCESS_KEY}
amazon.dynamodb.endpoint=dynamodb.us-east-1.amazonaws.com
amazon.dynamodb.region=${AWS_DEFAULT_REGION}
I already got credentials and my .aws/credentials looks good:
[default]
aws_access_key_id = MyKeyId
aws_secret_access_key = MySecretKey
aws_session_token = blablabla
disney_session_expiration = This is also ok
Issue 1) It looks like it always takes the application-local.properties profile; if I print awsAccessKey and awsSecretKey in the DynamoDbConfig class, I get key1 and key2. I tried these two commands:
mvn spring-boot:run -Dspring.profiles.active=prod
mvn spring-boot:run -Pprod
Issue 2) I renamed application-prod.properties to application.properties to make Spring pick up that config file, and I get this error message:
Could not resolve placeholder 'AWS_SECRET_ACCESS_KEY' in value "${AWS_SECRET_ACCESS_KEY}"
I guess the profile is not the issue; the values are not set/defined for the following keys:
${AWS_ACCESS_KEY_ID}
${AWS_SECRET_ACCESS_KEY}
${AWS_DEFAULT_REGION}
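If that is the case, exporting the variables in the shell before starting the app should get past the error (the values below are only placeholders mirroring the question):
export AWS_ACCESS_KEY_ID=MyKeyId
export AWS_SECRET_ACCESS_KEY=MySecretKey
export AWS_DEFAULT_REGION=us-east-1
mvn spring-boot:run
Spring's placeholder syntax also accepts a default, e.g. amazon.dynamodb.region=${AWS_DEFAULT_REGION:us-east-1}, which lets the app start when the variable is absent. As for the profile, with spring-boot-maven-plugin 2.x the profile is usually selected with mvn spring-boot:run -Dspring-boot.run.profiles=prod rather than -Dspring.profiles.active=prod, so that may be worth checking as well.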
I have an env variable $MONGODB_URI
echo $MONGODB_URI
mongodb://localhost:27017
My application.properties
mongoUri=${MONGODB_URI}
My Config.java
@Configuration
public class Config {

    @Value("${mongoUri}")
    private String mongoUri;
    ..
}
When I try to start up the app in IntelliJ IDEA, I get
Could not resolve placeholder 'MONGODB_URI' in value "${MONGODB_URI}"
The app starts up fine with
./gradlew bootRun
How can I properly configure IntelliJ to read from the environment? I'll need to swap out the db url depending on if it's prod, local, etc.
You need to run your application with the environment variable set in the IntelliJ run configuration (Run → Edit Configurations → Environment variables).
Sample code to verify:
@RestController
public class StatusController {

    private final Environment environment;

    public StatusController(Environment environment) {
        this.environment = environment;
    }

    @GetMapping("/env")
    public String envValue() {
        return environment.getProperty("MONGODB_URI");
    }
}
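For example, with an entry like this in the run configuration's Environment variables field (the URI is just the local value from the question):
MONGODB_URI=mongodb://localhost:27017
then, assuming the default server port, curl localhost:8080/env should echo the value back once the app is up.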
I have a Maven project that uses Spring, and I would like to pass Spring a filename whose contents hold various database information. The filename can be test.properties, prod.properties, or dev.properties. I have seen it posted that one can type
"mvn install -DfileTarget=test.properties"
and have the system property set when the project is built. However, I am getting the following error when I deploy the project on Tomcat 8:
Caused by: java.lang.IllegalArgumentException: Could not resolve placeholder 'fileTarget' in string value "file:${jamesTarget}"
My properties files (test, prod, dev) contain DB-related config values:
db.driver = jdbc:\\.....
db.url = url of database.
db.user = username
db.passwd = passwd
My Java code is as follows. Is there something I need to add to the pom.xml? It seems that when deploying on Tomcat the system property is not found; how would I set it while keeping the ability to install different files at build time?
Thanks in advance
@Configuration
@PropertySource({ "classpath:${fileTarget}" })
public class DbConfig {

    @Autowired
    private Environment env;

    @Bean
    public DataSource dataSource() {
System.out.println("driver : " + env.getProperty("db.driver"));
System.out.println("driver : " + env.getProperty("db.driver"));
System.out.println("user : " + env.getProperty("db.user"));
System.out.println("passwd : " + env.getProperty("db.passwd"));
}
...