Register a SCDF task with yaml file configuration - spring

I have a local implementation of an SCDF server, and the task that I want to register uses a YAML file to populate a map, like this:
application:
  mapValues:
    value1:
      foo: bar
      bar: foo
    value2:
      foo: bar
      bar: foo
If I execute this project from my IDE, the autowired object is correctly populated with the values defined in my YAML file. If I register the application and try to execute it from the SCDF dashboard, I get a NullPointerException when I try to use the autowired object, since the registered app does not populate the object with the values from the YAML file.
My question is: how do I avoid this? How do I tell a registered task to use the values from a YAML file? Is there a specific step in which I tell the application to use the YAML file, for example during mvn package or during the app registration with the SCDF shell? Is there an option to set the app's YAML file before executing the task from the SCDF dashboard?
Any help with this situation would be greatly appreciated.

The task being executed is a Spring Boot application. If that application includes an application.properties or application.yml (e.g. in src/main/resources), those properties will be available to the task.
For a sanity test you can add a very simple property in application.yml, such as
foo: bar
then in your main class something like this:
@Value("${foo:UNKNOWN}")
private String fooValue;

@PostConstruct
void init() {
    System.out.println("***** " + fooValue);
}
What do you see when you run the app directly in IDEA? And what about when you run the app from Dataflow UI?
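If the goal is to bind the map from the question's YAML, the usual approach is a @ConfigurationProperties class packaged with the task application. A minimal sketch (the class name is made up, and the prefix is assumed from the YAML above):

import java.util.Map;

import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

// Sketch only: binds application.mapValues.* from application.yml into a nested map.
@Component
@ConfigurationProperties(prefix = "application")
public class ApplicationProperties {

    // e.g. value1 -> {foo=bar, bar=foo}, value2 -> {foo=bar, bar=foo}
    private Map<String, Map<String, String>> mapValues;

    public Map<String, Map<String, String>> getMapValues() {
        return mapValues;
    }

    public void setMapValues(Map<String, Map<String, String>> mapValues) {
        this.mapValues = mapValues;
    }
}

As long as the application.yml sits in src/main/resources of the task jar that gets registered, the same binding that works in the IDE should also apply when the task is launched from Data Flow.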

Related

Spring Cloud Contract generates message verifier test from Groovy to YAML

Currently I'm using Spring Cloud Contract to test the publishing of events from one microservice to another. After the tests run successfully, I publish the stubs to be used by other microservices.
In the publisher microservice, I write a Groovy test with Contract.make and specify the input and outputMessage like below.
input {
    triggeredBy "triggerEventMethod()"
}
outputMessage {
    sentTo "somewhere"
    body """
        {
            "field": "sampleData"
        }
    """
    headers {
        header("contentType", applicationJson())
        header("eventName", "SampleEventName")
    }
}
In my BaseClass, I annotated the class with @ImportAutoConfiguration(NoOpContractVerifierAutoConfiguration::class) to configure the MessageVerifier, and declared my triggerEventMethod() to run some method which triggers the ApplicationContext to publish the SampleEventName. I have my own custom implementation class of MessageVerifierReceiver which listens for the SampleEventName and stores the message in a ThreadLocal, and my override of the receive method just tries to retrieve the message available in the ThreadLocal. The test generated by spring-cloud-contract then just verifies the content of the body and headers of the event.
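As a rough illustration of that ThreadLocal capture (this is not the asker's code; the class and event names are hypothetical, and it is shown in plain Java rather than Kotlin), something like the following can listen for the published event and hand it back to the test on the same thread:

import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;

// Hypothetical event type standing in for the asker's SampleEventName payload.
class SampleEvent {
    private final String field;

    SampleEvent(String field) {
        this.field = field;
    }

    String getField() {
        return field;
    }
}

// Sketch only: ApplicationContext event publishing is synchronous by default,
// so the test thread that triggers the publish can read the event back from a ThreadLocal.
@Component
public class ThreadLocalEventCapture {

    private static final ThreadLocal<SampleEvent> LAST_EVENT = new ThreadLocal<>();

    @EventListener
    public void onSampleEvent(SampleEvent event) {
        LAST_EVENT.set(event);
    }

    public static SampleEvent lastPublishedEvent() {
        return LAST_EVENT.get();
    }
}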
As far as it goes, this test has been working for more than two years, and after upgrading spring-cloud-contract from 2.2.3.RELEASE to 3.1.1 I'm experiencing errors while running the generated test.
My observation on the newly generated version of the test is that it generates a YAML version of my Groovy contract and places it relative to the generated test class in the generated-test-sources folder. When running the test, it tries to reference the .yml file and hits a FileNotFoundException for that .yml file. I tried manually removing the logic where it looks for the file, and my test case passes after that, but since it is a generated test it technically should not be edited this way. Sample referencing of the .yml file below:
ContractVerifierMessage response = contractVerifierMessaging.receive("somewhere",
contract(this, "sampleEvent.yml"));
Does anyone know a way in which I could exclude the generation of the .yml, or the referencing of the YAML contract, in my generated test? Or do you have better suggestions for how I could refactor my test to ensure that ApplicationContext publishing works?
The Groovy files are then generated into stubs for my other microservices to do contract testing against and to serve as mock data.

How to get gitlab tag of deployed service?

I'm building a Spring Boot service. Every one of my releases gets tagged with a specific version number.
Now I can choose one of these releases to deploy.
But is there a way to ask the deployed service for its version number? I would like to be able to send e.g. an API request to my deployed service (GET service/version) and receive the tag name (e.g. v2.0.0) as the response.
One idea was to trigger a new pipeline for each new tagged version, in which I would rebuild the service and add the tag name to a file in the root directory of my service. Then I could build an API to send the file to me when requested. Is there maybe an easier way that I'm overlooking?
I solved it like this:
Tell Spring Boot to look for a variable, e.g. RELEASE_VERSION:
-> Spring will not find it within its code and will automatically look for it in the deployment environment.
-> In my case I had a manifest-my-company.yml file where I created an environment variable RELEASE_VERSION: '((RELEASE_VERSION))'. But I'm not sure if that's specific to my company.
Create an API that returns the value of this variable:
@RestController
public class GetVersionController implements GetVersionApi {

    // "no version number available" is the default value
    @Value("${RELEASE_VERSION:no version number available}")
    private String releaseVersion;

    @Override
    public String getVersion() throws IOException {
        return releaseVersion;
    }
}
Now pass the commit tag value in as that variable when deploying your service. In my case I used a company script for deployment to Cloud Foundry and could simply add the variable in my gitlab-ci.yml specification:
deploy to prod:
  extends: .deploy_to_prod
  script:
    - deploy_to_cf.sh RELEASE_VERSION=${CI_COMMIT_TAG}
  ...

What is the advantage of using the @Value annotation in Spring Boot

I am new to Spring Boot and I am doing code cleanup for my old Spring Boot application.
The code below uses the @Value annotation to inject a field value from a properties file:
@Value("${abc.local.configs.filepath}")
private String LOCAL_ABC_CONFIGS_XML_FILEPATH;
My doubt is: instead of getting the value from a properties file, can we not directly hardcode the value in the same Java class variable?
Example: private String LOCAL_ABC_CONFIGS_XML_FILEPATH = "/abc/config/abc.txt";
It would be easier for me to modify the values in future as they will be in the same class.
What is the advantage of reading from a properties file? Does it make the code decoupled?
This technique is called externalising configuration. You are absolutely right that you could define your constants in the very same class files. But sometimes your configuration is volatile, or changes with respect to the environment being deployed to.
For example:
Scene 1:
I have variables for DB connection details which change with the environment. Remember, you will create a build out of your application and deploy it first to dev, then take the same build to staging and finally to production.
Having your configuration defined externally helps you pre-define it at environment level and deploy the same build everywhere.
Scene 2:
You have already generated a build and deployed it, and then found something was incorrect with the constants. Having those configurations externalised gives you the liberty to just override them at environment level, without rebuilding your application.
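As a small illustration of Scene 1 (the property names and class here are made up, not from the original answer), the same build can read its DB details from externalized properties, and each environment supplies its own values:

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

// Sketch only: db.url and db.username come from application.properties by default
// and can be overridden per environment without rebuilding, e.g. with
// --db.url=... on the command line or a DB_URL environment variable
// (Spring Boot's relaxed binding maps DB_URL to db.url).
@Component
public class DbSettings {

    @Value("${db.url}")
    private String url;

    @Value("${db.username}")
    private String username;

    public String getUrl() {
        return url;
    }

    public String getUsername() {
        return username;
    }
}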
To understand more about externalising techniques read: https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-external-config.html
Here @Value is used for reading values from a properties file (it could be any environment: dev, qa, prod), but writing @Value on many fields is not recommended. Instead of @Value we can use @ConfigurationProperties(prefix = "somevalue") and read the property values like this:
@Component
@ConfigurationProperties(prefix = "somevalue")
class Foo {
    private String name;
    private String address;
    // getters and setters are required for binding (omitted here)
}
application.properties:
somevalue.name=your name
somevalue.address=your address
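A brief usage sketch (assuming Foo is registered as a bean as above and has the usual getters and setters):

import org.springframework.stereotype.Service;

// Sketch only: constructor injection of the bound properties.
@Service
public class AddressPrinter {

    private final Foo foo;

    public AddressPrinter(Foo foo) {
        this.foo = foo;
    }

    public String describe() {
        return foo.getName() + " lives at " + foo.getAddress();
    }
}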

Spring Boot configuration behaviour with @ConfigurationProperties and command-line arguments

I seem to be having some funny behaviour with Spring Boot and the YAML property files I'm trying to load.
I have a Settings bean that is set up as follows:
@ConfigurationProperties(locations = "config.yml", prefix = "settings")
public class Settings {

    private String path;
    ...
}
I've explicitly told Spring to look in the config.yml file for property values to bind to the Settings bean. That file looks like this:
settings:
  path: /yaml_path
This works well; however, I don't seem to be able to override these values from the command line, i.e.
java -jar my.jar --settings.path=test
The value that is bound to the Settings bean is still /yaml_path, but I would have expected --settings.path=test to override the setting from the YAML.
Interestingly, I've noticed that if I comment out the path setting in the YAML file, the command-line argument value of test comes through.
Additionally, I've also noticed that if I change my config file from config.yml to application.yml and remove the 'locations' attribute from the @ConfigurationProperties annotation, this gives me the desired behaviour, but it means that I can't have multiple application.yml files on the classpath, which breaks my multi-module application that has configuration files throughout.
In an ideal world I would like to be able to have modules read configuration from YAML files that contain safe values for that module (i.e. module.yml) and be able to override these values from the command line if needed. Has anyone figured out how to get command-line arguments passed into the beans this way?
I have created a project on GitHub to showcase the issue:
https://github.com/vcetinick/spring-boot-yaml-test
Running the application logs which settings are applied, i.e.
java -jar spring-boot-yaml-test-0.0.1-SNAPSHOT.jar --config.path=/test
should override the settings; however, the default /var/tmp is displayed.
Additionally, when using the application.yml configuration,
java -jar spring-boot-yaml-test-0.0.1-SNAPSHOT.jar --app.path=/test
behaves as expected, where the command-line argument overrides the value, but only because that value is defined in the application.yml file.
It looks like the locations attribute is working as designed; however, this seems to be at odds with the standard configuration precedence set up by Spring Boot (https://github.com/spring-projects/spring-boot/issues/5111), where the command line is meant to override file-based settings. It also looks like this feature may be removed in a future release of Spring Boot anyway (https://github.com/spring-projects/spring-boot/issues/5129).
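One possible workaround, not part of the original answer: register the module's safe defaults as default properties, which sit at the very bottom of Spring Boot's property-source order, so both application.yml and command-line arguments override them. A minimal sketch:

import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;

@SpringBootApplication
public class MyApp {

    public static void main(String[] args) {
        new SpringApplicationBuilder(MyApp.class)
                // module-level safe default; --settings.path=test on the command line
                // still wins because command-line arguments have higher precedence
                // than default properties
                .properties("settings.path=/yaml_path")
                .run(args);
    }
}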

OSGi bundle config without managed service or factory

Neil Bartlett's article http://njbartlett.name/2010/07/19/factory-components-in-ds.html shows a way to set config for bundles without using a ManagedService or ManagedServiceFactory.
Searches for examples of actually setting the config for this method point either to Felix File Install or to examples using ManagedService.
In answer to the question "OSGi Declarative Services vs. ManagedService for configuring service?", Neil Bartlett states: "Note that DS never actually creates a ManagedService or ManagedServiceFactory for your component. It works by listening to Config Admin with a ConfigurationListener. However the internal details are unimportant... simply create configs with PID/factoryPID matching the component.name and it "just works""
I think the technique involves placing a pid entry in the config dictionary, but I have no idea how this would be used with Config Admin.
A guide or simple example of how to set the configuration using this method would be very helpful.
I know it has been some time since the question was asked, but I ran into the same problem when trying to create a ManagedServiceFactory-like component with Declarative Services, so I want to share my solution. Maybe others find it useful. My problem was this:
I have defined a component (annotated with @Component). For each configuration I add using Felix File Install, I want an instance of that component created with the given configuration and activated immediately.
First I tried messing with the properties factory and configurationPid of @Component, but all of that is not needed and even produces wrong results (the Felix annotation processor in the Maven plugin seems to have a bug when handling configurationPid).
The solution I came up with:
package com.example.my;

import java.util.Map;

import org.osgi.framework.BundleContext;
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.ConfigurationPolicy;

@Component(
    name = MyExampleComponent.FACTORY_PID,
    configurationPolicy = ConfigurationPolicy.REQUIRE,
    property = {"abc=", "exampleProp="}
)
public class MyExampleComponent {

    public static final String FACTORY_PID = "com.example.my.component";

    @Activate
    protected void activate(BundleContext context, Map<String, Object> map) {
        // ...
    }
}
Then I created a config file for felix file-install named com.example.my.component-test1.cfg:
abc = Hello World
exampleProp = 123
When deployed, this automatically creates a folder structure in the configuration folder, like com/example/my/component, containing the files:
factory.config, with contents:
    factory.pid="com.example.my.component"
    factory.pidList=[ \
      "com.example.my.component.525ca4fb-2d43-46f3-b912-8765f639c46f", \
      ]
525ca4fb-2d43-46f3-b912-8765f639c46f.config, with contents:
    abc="Hello World"
    exampleProp="123"
    felix.fileinstall.filename="file:/..._.cfg"
    service.factoryPid="com.example.my.component"
    service.pid="com.example.my.component.525ca4fb-2d43-46f3-b912-8765f639c46f"
The 525ca4fb-2d43-46f3-b912-8765f639c46f seems to be some randomly generated ID (possibly UUID).
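For completeness, the same factory configuration could also be created programmatically through the standard Config Admin API instead of a File Install .cfg file. This sketch is not part of the original answer, and the configurator class name is made up:

import java.io.IOException;
import java.util.Hashtable;

import org.osgi.service.cm.Configuration;
import org.osgi.service.cm.ConfigurationAdmin;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

// Sketch only: creates a factory configuration whose factory PID matches the
// component name of MyExampleComponent, so DS activates a new instance with
// these properties.
@Component(service = MyExampleConfigurator.class)
public class MyExampleConfigurator {

    @Reference
    private ConfigurationAdmin configAdmin;

    public void createInstance() throws IOException {
        // "?" lets the configuration be delivered to any bundle
        Configuration config = configAdmin.createFactoryConfiguration("com.example.my.component", "?");
        Hashtable<String, Object> props = new Hashtable<>();
        props.put("abc", "Hello World");
        props.put("exampleProp", "123");
        config.update(props);
    }
}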
