ClassPath resource cannot be accessed in Docker Spring

I have a UserService class which includes this code:
package com.example.demo.services;

import com.example.demo.entity.User;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.core.io.ClassPathResource;
import org.springframework.stereotype.Service;

import java.io.IOException;
import java.util.List;
import java.util.concurrent.atomic.AtomicReference;

@Service
public class UserService {

    public static AtomicReference<UserService> INSTANCE = new AtomicReference<UserService>();
    public static List<User> userList;

    public UserService() throws IOException {
        final UserService previous = INSTANCE.getAndSet(this);
        userList = new ObjectMapper().readValue(
                new ClassPathResource("db/user.json").getFile(),
                new ObjectMapper().getTypeFactory().constructCollectionType(List.class, User.class));
        if (previous != null)
            throw new IllegalStateException("Second singleton " + this + " created after " + previous);
    }

    public static UserService getInstance() {
        return INSTANCE.get();
    }
}
In this service I am loading a JSON file into the variable userList; the JSON file is stored in src/main/resources/db/user.json. This works fine in the IDE. But when I create a JAR file, either via IntelliJ or manually, db/user.json ends up in BOOT-INF/classes/db/user.json (inspected via the command jar tf backend.jar).
The Docker image could not start because it could not find the file. So how would I change this so that it works both in a normal debug run and in the Docker image?
To be noted, userList holds the JSON file converted into a List.
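For what it's worth, the usual culprit here is ClassPathResource.getFile(): once the resource is packaged inside the JAR it is no longer a real file on the filesystem, so getFile() throws. Reading the resource through an InputStream avoids that. A minimal sketch, using the same resource path and target type as above (add import java.io.InputStream):
// Stream-based loading works both from the IDE and from inside the packaged JAR,
// because it never needs the resource to exist as a file on disk.
try (InputStream is = new ClassPathResource("db/user.json").getInputStream()) {
    ObjectMapper mapper = new ObjectMapper();
    userList = mapper.readValue(
            is,
            mapper.getTypeFactory().constructCollectionType(List.class, User.class));
}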

Related

How to fetch file from azure blob using spring boot

I want to fetch files from Azure Blob Storage. The following code does it fine:
package com.<your-resource-group>.<your-artifact-name>;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.core.io.Resource;
import org.springframework.core.io.WritableResource;
import org.springframework.util.StreamUtils;
import org.springframework.web.bind.annotation.*;

import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.Charset;

@RestController
@RequestMapping("blob")
public class BlobController {

    @Value("azure-blob://<your-container-name>/<your-blob-name>")
    private Resource blobFile;

    @GetMapping("/readBlobFile")
    public String readBlobFile() throws IOException {
        return StreamUtils.copyToString(
                this.blobFile.getInputStream(),
                Charset.defaultCharset());
    }

    @PostMapping("/writeBlobFile")
    public String writeBlobFile(@RequestBody String data) throws IOException {
        try (OutputStream os = ((WritableResource) this.blobFile).getOutputStream()) {
            os.write(data.getBytes());
        }
        return "file was updated";
    }
}
My question:
The @Value annotation provides a value to the Resource that is static (i.e. I cannot put a variable containing my path as a string inside @Value).
How can I implement this?
In application.properties, try storing the path:
#application.properties
blob.path=
We can use the @Value("${...property's name}") annotation to access the above property in the Java class as follows:
import org.springframework.beans.factory.annotation.Value;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ValueController {

    @Value("${blob.path}")
    private String path;

    @GetMapping("")
    ..
}
Here, try putting the complete blob URI path in application.properties and using it in the @Value annotation as a variable with a Map data type:
@Value("${blob.path}")
private Map<String, String> blobPath;
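If the goal is a path that can change per environment, another option worth sketching is to resolve the resource at runtime with a ResourceLoader instead of injecting it directly. The controller below is only an illustration (the class name is made up); it assumes the Spring Cloud Azure storage starter is on the classpath so the azure-blob:// scheme is resolvable, and that blob.path holds the full azure-blob://<container>/<blob> URI:
import java.io.IOException;
import java.nio.charset.Charset;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.core.io.Resource;
import org.springframework.core.io.ResourceLoader;
import org.springframework.util.StreamUtils;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class DynamicBlobController {

    private final ResourceLoader resourceLoader;

    // Full blob URI taken from application.properties, so it is no longer hard-coded.
    @Value("${blob.path}")
    private String blobPath;

    public DynamicBlobController(ResourceLoader resourceLoader) {
        this.resourceLoader = resourceLoader;
    }

    @GetMapping("/readBlobFile")
    public String readBlobFile() throws IOException {
        // Resolve the resource lazily; the location string can come from any property or variable.
        Resource blobFile = resourceLoader.getResource(blobPath);
        return StreamUtils.copyToString(blobFile.getInputStream(), Charset.defaultCharset());
    }
}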
See also: "How to read external properties based on value of local variable in Spring Boot?" (Stack Overflow) and "Spring @Value annotation tricks" (DEV Community). Also see @RequestMapping.
Other references:
Spring Azure Blob Storage | DevGlan
"How to read the csv files from Azure Blob Storage in Java with folder structure as 'dir1/dir2/dir3/..'?" (Stack Overflow)

How to post data as csv file to rest endpoint in spring boot using WebClient

I'm trying to migrate data from an in-house database to a software product. The software has a REST API for this purpose that expects a CSV file.
A working curl call for this API endpoint looks like this:
curl -isk POST -H "customHeaderName:customHeaderValue" -H "Authorization: bearer $TOKEN" -F "data=@accounts.csv" <apiBaseUrl>/gate/account/import/group-accounts
My plan is to post the data directly to the REST endpoint from a Spring Boot application, without creating a physical CSV file first.
My implementation looks like this, with "csvString" being a CSV-formatted String (e.g.: "acc_id,acc_name,acc_desc\r\n1,john.doe,this is john\r\n2,peter.parker,this is peter"):
(I removed this code and added the current version below.)
When I call postAccountsAndGroups(csvString); I get a 415 response indicating that my request body is not a proper CSV file.
EDIT:
It seems like the API endpoint requires a multipart form. Therefore I came up with something like this:
import static org.springframework.web.util.UriComponentsBuilder.fromUriString;

import my.package.common.configuration.WebClientConfig;
import java.net.URI;
import java.nio.charset.StandardCharsets;
import lombok.extern.slf4j.Slf4j;
import org.hibernate.service.spi.ServiceException;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.core.io.ByteArrayResource;
import org.springframework.core.io.Resource;
import org.springframework.http.HttpStatus;
import org.springframework.http.MediaType;
import org.springframework.http.client.MultipartBodyBuilder;
import org.springframework.stereotype.Service;
import org.springframework.web.reactive.function.BodyInserters;
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Mono;

@Service
@Slf4j
public class MyApiImpl implements MyApi {

    private final WebClient client;
    private final String apiBaseUrl;

    public MyApiImpl(
            @Qualifier(WebClientConfig.MY_API_CLIENT_CONFIG) WebClient client,
            @Value("${external.api.myapi.baseUrl}") String apiBaseUrl) {
        this.client = client;
        this.apiBaseUrl = apiBaseUrl;
    }

    @Override
    public Mono<HttpStatus> postAccountsAndGroups(String csvString) {
        MultipartBodyBuilder builder = new MultipartBodyBuilder();
        Resource byteArrayResource = new ByteArrayResource(csvString.getBytes(StandardCharsets.UTF_8));
        builder.part("data", byteArrayResource);
        return client.post()
                .uri(createAccountsUri())
                .header("customHeaderName", "customHeaderValue")
                .contentType(MediaType.MULTIPART_FORM_DATA)
                .body(BodyInserters.fromMultipartData(builder.build()))
                .exchangeToMono(response -> {
                    if (response.statusCode().equals(HttpStatus.OK)) {
                        return response.bodyToMono(HttpStatus.class).thenReturn(response.statusCode());
                    } else {
                        throw new ServiceException("Error uploading file");
                    }
                });
    }

    private URI createAccountsUri() {
        return fromUriString(apiBaseUrl).path("/gate/account/import/group-accounts").build().toUri();
    }
}
Now I get 400 Bad Request as a response though.
I still haven't found a way to implement my preferred solution. However, I came up with this workaround, which relies on persisting the CSV file:
In my case I chose "/tmp/account.csv" as the file path, since my application runs in a Docker container with a Linux OS. On a Windows machine you could use something like "C:/myapp/account.csv". The file path is injected via the application.properties file using the custom value "migration.files.accounts" so it can be configured later.
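For illustration, the corresponding property entry could look like this (the /tmp path is just the Linux example mentioned above):
#application.properties
migration.files.accounts=/tmp/account.csv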
import static org.springframework.web.util.UriComponentsBuilder.fromUriString;

import my.package.common.configuration.WebClientConfig;
import java.io.File;
import java.io.PrintWriter;
import java.net.URI;
import lombok.extern.slf4j.Slf4j;
import org.hibernate.service.spi.ServiceException;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.core.io.FileSystemResource;
import org.springframework.http.HttpStatus;
import org.springframework.http.MediaType;
import org.springframework.http.client.MultipartBodyBuilder;
import org.springframework.stereotype.Service;
import org.springframework.web.reactive.function.BodyInserters;
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Mono;

@Service
@Slf4j
public class PrimedexApiImpl implements PrimedexApi {

    private final WebClient client;
    private final String apiBaseUrl;
    private final FileSystemResource accountsFile;
    private final String accountsFilePath;

    public PrimedexApiImpl(
            @Qualifier(WebClientConfig.MY_API_CLIENT_CONFIG) WebClient client,
            @Value("${external.api.api.baseUrl}") String apiBaseUrl,
            @Value("${migration.files.accounts}") String accountsFilePath) {
        this.client = client;
        this.apiBaseUrl = apiBaseUrl;
        this.accountsFilePath = accountsFilePath;
        this.accountsFile = new FileSystemResource(accountsFilePath);
    }

    @Override
    public Mono<HttpStatus> postAccountsAndGroups(String csvString) {
        File csvOutputFile = new File(accountsFilePath);
        if (csvOutputFile.delete()) {
            log.info("An old version of '{}' was deleted.", accountsFilePath);
        }
        try (PrintWriter pw = new PrintWriter(csvOutputFile)) {
            pw.print(csvString);
        } catch (Exception e) {
            log.error(e.getMessage(), e);
        }
        MultipartBodyBuilder builder = new MultipartBodyBuilder();
        builder.part("data", accountsFile);
        return client.post()
                .uri(createAccountsUri())
                .header("customHeaderName", "customHeaderValue")
                .contentType(MediaType.MULTIPART_FORM_DATA)
                .body(BodyInserters.fromMultipartData(builder.build()))
                .exchangeToMono(response -> {
                    if (response.statusCode().equals(HttpStatus.OK)) {
                        return response.releaseBody().thenReturn(response.statusCode());
                    } else {
                        throw new ServiceException("Error uploading file");
                    }
                });
    }

    private URI createAccountsUri() {
        return fromUriString(apiBaseUrl).path("/gate/account/import/group-accounts").build().toUri();
    }
}
I used spring-boot-starter-parent version 2.6.3 for this project.
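As an aside, one way the in-memory variant might still be made to work is to give the multipart part an explicit filename: endpoints that expect a file upload often reject parts without one, and a plain ByteArrayResource reports no filename. This is only a sketch of the builder setup from the first attempt above; the filename "accounts.csv" and the text/csv content type are assumptions, not something the API is documented to require here:
MultipartBodyBuilder builder = new MultipartBodyBuilder();
builder.part("data", new ByteArrayResource(csvString.getBytes(StandardCharsets.UTF_8)))
        .filename("accounts.csv")                       // hypothetical filename; servers often require one
        .contentType(MediaType.parseMediaType("text/csv"));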

Running a quarkus main (command line like) from an AWS lambda handler method

I have a quarkus-camel batch application that needs to run under a Lambda in AWS. This is working fine with pure Java and Spring Boot.
I need to be able to start the Quarkus Application from the AWS lambda handler method.
Running in batch works fine, but under lambda I get the following error:
Caused by: io.quarkus.bootstrap.BootstrapException: Failed to determine the Maven artifact associated with the application /var/task
This is the main Java class. I need to know what to do in the handleRequest method to start the Quarkus (Camel) application.
package com.example;

import io.quarkus.runtime.annotations.QuarkusMain;
import io.quarkus.runtime.Quarkus;
import io.quarkus.runtime.QuarkusApplication;
import io.quarkus.arc.Arc;
import org.apache.camel.quarkus.core.CamelRuntime;
import javax.inject.Inject;
import org.apache.camel.CamelContext;
import org.apache.camel.ProducerTemplate;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;

@QuarkusMain
public class Main {

    private static final Logger logger = LoggerFactory.getLogger(Main.class);

    Gson gson = new GsonBuilder().setPrettyPrinting().create();

    public static void main(String... args) {
        Quarkus.run(CamelApp.class, args);
    }

    public static class CamelApp implements QuarkusApplication {

        @Inject
        ProducerTemplate camelProducer;

        @Inject
        CamelContext camelContext;

        @Override
        public int run(String... args) throws Exception {
            System.out.println("Hello Camel");
            CamelRuntime runtime = Arc.container().instance(CamelRuntime.class).get();
            runtime.start(args);
            camelProducer.sendBody("direct:lambda", "how about this?");
            return runtime.waitForExit();
        }
    }

    public Object handleRequest(final Object input, final Context context) {
        logger.info("input: {}", gson.toJson(input));
        logger.info("context: {}", gson.toJson(context));
        Quarkus.run(CamelApp.class);
        // CamelRuntime runtime = Arc.container().instance(CamelRuntime.class).get();
        // runtime.start(new String[] {"A","B","C"});
        // camelProducer.sendBody("direct:lambda", "how about this?");
        // runtime.waitForExit();
        return input;
    }
}
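For reference, if the quarkus-amazon-lambda extension is an option, the usual pattern is the reverse of the code above: handleRequest does not call Quarkus.run(); instead the handler is written as a CDI bean and Quarkus bootstraps the container (and the Camel context) before invoking it. A rough sketch under that assumption; the bean name and the plain Object payload types are placeholders:
import javax.inject.Inject;
import javax.inject.Named;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import org.apache.camel.ProducerTemplate;

// Selected via quarkus.lambda.handler=camelBatchHandler in application.properties
// when more than one handler bean is present.
@Named("camelBatchHandler")
public class CamelBatchHandler implements RequestHandler<Object, Object> {

    @Inject
    ProducerTemplate camelProducer;

    @Override
    public Object handleRequest(Object input, Context context) {
        // The Quarkus/Camel runtime is already started by the time this runs,
        // so the injected ProducerTemplate can be used directly.
        camelProducer.sendBody("direct:lambda", input);
        return input;
    }
}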

Feign Client couldn't find custom registered eureka service

I will tell a little bit about what I am trying to achieve. I have a Spring Boot application which is a Eureka client and registers itself as a data-service service. After this application's startup (ApplicationReadyEvent.class) I also register another, custom-created Eureka client, and the registration seems to be successful: I can see the newly registered service (workflow-service) when accessing http://localhost:8761. The reason I decided to do this inside the data-service application is that I don't need it outside this context and only need it in the DEV environment. Later it will be replaced by the real workflow-service developed by another team.
The problem is that when I try to access this service through a Feign client, I receive an exception:
com.netflix.client.ClientException: Load balancer does not have available server for client: workflow-service
Here is my custom service registration code:
package XXX;

import com.netflix.appinfo.ApplicationInfoManager;
import com.netflix.appinfo.HealthCheckHandler;
import com.netflix.appinfo.InstanceInfo;
import com.netflix.discovery.DiscoveryClient;
import org.mockserver.integration.ClientAndServer;
import org.mockserver.model.HttpRequest;
import org.mockserver.model.HttpResponse;
import org.springframework.beans.factory.ObjectProvider;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.autoconfigure.condition.ConditionalOnExpression;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.cloud.commons.util.InetUtils;
import org.springframework.cloud.commons.util.InetUtilsProperties;
import org.springframework.cloud.netflix.eureka.*;
import org.springframework.cloud.netflix.eureka.serviceregistry.EurekaRegistration;
import org.springframework.cloud.netflix.eureka.serviceregistry.EurekaServiceRegistry;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.event.ContextClosedEvent;
import org.springframework.context.event.EventListener;
import java.net.SocketException;

@Configuration
@ConditionalOnExpression("${workflow.engine.mock.enabled:false}")
public class MockWorkflowEngineConfiguration {

    @Value("${workflow.engine.mock.application.name}") private String workflowEngineApplicationName;
    @Value("${workflow.engine.mock.application.port}") private Integer workflowEnginePort;

    @Autowired private EurekaInstanceConfigBean originalInstanceConfig;
    @Autowired private EurekaClientConfigBean originalClientConfig;
    @Autowired private ApplicationInfoManager applicationInfoManager;
    @Autowired private ApplicationContext applicationContext;
    @Autowired private ApplicationEventPublisher applicationEventPublisher;
    @Autowired private ObjectProvider<HealthCheckHandler> healthCheckHandler;
    @Autowired private EurekaServiceRegistry eurekaServiceRegistry;

    private EurekaRegistration workflowEngineEurekaRegistration;
    private DiscoveryClient workflowEngineDiscoveryClient;
    private ClientAndServer workflowEngineMockClient;

    @EventListener(ApplicationReadyEvent.class)
    public void initializeMockWorkflowEngine() throws SocketException {
        workflowEngineDiscoveryClient = new CloudEurekaClient(
                createWorkflowEngineAppInfoManager(),
                duplicateEurekaClientConfig(),
                applicationEventPublisher);
        workflowEngineEurekaRegistration = EurekaRegistration.builder((CloudEurekaInstanceConfig) workflowEngineDiscoveryClient.getApplicationInfoManager().getEurekaInstanceConfig())
                .with(workflowEngineDiscoveryClient)
                .with(workflowEngineDiscoveryClient.getApplicationInfoManager())
                .with(healthCheckHandler).build();
        eurekaServiceRegistry.register(workflowEngineEurekaRegistration);
        workflowEngineMockClient = new ClientAndServer(workflowEnginePort);
        workflowEngineMockClient.when(
                HttpRequest.request()
                        .withMethod("GET")
                        .withPath("/job")
        )
        .respond(
                HttpResponse.response()
                        .withStatusCode(200)
                        .withBody("{ id: '1', name: 'default'}")
        );
    }

    @EventListener(ContextClosedEvent.class)
    public void shutdownMockWorkflowEngine() {
        workflowEngineDiscoveryClient.shutdown();
        eurekaServiceRegistry.deregister(workflowEngineEurekaRegistration);
        workflowEngineMockClient.stop(true);
    }

    private ApplicationInfoManager createWorkflowEngineAppInfoManager() throws SocketException {
        EurekaInstanceConfigBean newInstanceConfig =
                new EurekaInstanceConfigBean(new InetUtils(new InetUtilsProperties()));
        newInstanceConfig.setEnvironment(applicationContext.getEnvironment());
        newInstanceConfig.setAppname(workflowEngineApplicationName);
        newInstanceConfig.setInstanceId(applicationInfoManager.getInfo().getHostName() + ":" + workflowEngineApplicationName + ":" + workflowEnginePort);
        newInstanceConfig.setInitialStatus(InstanceInfo.InstanceStatus.UP);
        newInstanceConfig.setNonSecurePortEnabled(originalInstanceConfig.isNonSecurePortEnabled());
        newInstanceConfig.setNonSecurePort(workflowEnginePort);
        newInstanceConfig.setHostname(applicationInfoManager.getInfo().getHostName());
        newInstanceConfig.setSecurePortEnabled(originalInstanceConfig.isSecurePortEnabled());
        newInstanceConfig.setSecurePort(originalInstanceConfig.getSecurePort());
        newInstanceConfig.setDataCenterInfo(originalInstanceConfig.getDataCenterInfo());
        newInstanceConfig.setHealthCheckUrl(originalInstanceConfig.getHealthCheckUrl());
        newInstanceConfig.setSecureHealthCheckUrl(originalInstanceConfig.getSecureHealthCheckUrl());
        newInstanceConfig.setHomePageUrl(originalInstanceConfig.getHomePageUrl());
        newInstanceConfig.setStatusPageUrl(originalInstanceConfig.getStatusPageUrl());
        newInstanceConfig.setStatusPageUrlPath(originalInstanceConfig.getStatusPageUrlPath());
        newInstanceConfig.setIpAddress(originalInstanceConfig.getIpAddress());
        newInstanceConfig.setPreferIpAddress(originalInstanceConfig.isPreferIpAddress());
        ApplicationInfoManager manager =
                new ApplicationInfoManager(newInstanceConfig, (ApplicationInfoManager.OptionalArgs) null);
        return manager;
    }

    private EurekaClientConfigBean duplicateEurekaClientConfig() {
        EurekaClientConfigBean newConfig = new EurekaClientConfigBean();
        newConfig.setFetchRegistry(false);
        newConfig.setEurekaServerPort(originalClientConfig.getEurekaServerPort());
        newConfig.setAllowRedirects(originalClientConfig.isAllowRedirects());
        newConfig.setAvailabilityZones(originalClientConfig.getAvailabilityZones());
        newConfig.setBackupRegistryImpl(originalClientConfig.getBackupRegistryImpl());
        newConfig.setServiceUrl(originalClientConfig.getServiceUrl());
        return newConfig;
    }
}
And here is my feign client code:
@FeignClient(name = "workflow-service", configuration = FeignClientConfiguration.class)
public interface WorkflowService {

    @RequestMapping(value = "/job", method = RequestMethod.GET, consumes = MediaType.APPLICATION_JSON_VALUE, produces = MediaType.APPLICATION_JSON_VALUE)
    ResponseEntity<List<WorkflowJobDTO>> listJobs();
}
Here is the Feign client usage through which I am trying to access the other service:
@GetMapping(path = "/workflow-jobs", produces = "application/json")
public ResponseEntity<List<WorkflowJobDTO>> getAllJobs() {
    return workflowService.listJobs();
}
This has been fixed by simply setting the virtual host name:
newInstanceConfig.setVirtualHostname(workflowEngineApplicationName);
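For clarity, in the configuration above this call belongs in createWorkflowEngineAppInfoManager(), alongside the other instance-config setters, so that the mock registration advertises the virtual host (VIP) address that the Feign client resolves by service name:
newInstanceConfig.setAppname(workflowEngineApplicationName);
newInstanceConfig.setVirtualHostname(workflowEngineApplicationName);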

Spring Boot & Hibernate Validation's ConstraintMappingContributor

The Hibernate Validator documentation describes how to create ConstraintMappingContributors here.
It states:
You then need to specify the fully-qualified class name of the contributor implementation in META-INF/validation.xml, using the property key hibernate.validator.constraint_mapping_contributors. You can specify several contributors by separating them with a comma.
Given I have many of these, what would be the most appropriate way to auto-discover them (i.e. via @Component) and add them dynamically at runtime to the constraint mapping configuration during Spring Boot startup?
For example, if a developer creates a new ConstraintMappingContributor, it should be picked up and added automatically when Spring Boot starts, requiring no other file changes.
This is what I came up with; it seems to be working for me.
package...

import org.hibernate.validator.spi.cfg.ConstraintMappingContributor;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.validation.beanvalidation.LocalValidatorFactoryBean;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;

@Configuration
public class ValidationConfiguration {

    private final List<ConstraintMappingContributor> contributors;

    public ValidationConfiguration(Optional<List<ConstraintMappingContributor>> contributors) {
        this.contributors = contributors.orElseGet(ArrayList::new);
    }

    @Bean
    public LocalValidatorFactoryBean validatorFactory() {
        return new ValidatorFactoryBean(this.contributors);
    }
}
package...

import org.hibernate.validator.HibernateValidatorConfiguration;
import org.hibernate.validator.internal.cfg.context.DefaultConstraintMapping;
import org.hibernate.validator.spi.cfg.ConstraintMappingContributor;
import org.springframework.validation.beanvalidation.LocalValidatorFactoryBean;
import javax.validation.Configuration;
import java.util.List;

public class ValidatorFactoryBean extends LocalValidatorFactoryBean {

    private final List<ConstraintMappingContributor> contributors;

    ValidatorFactoryBean(List<ConstraintMappingContributor> contributors) {
        this.contributors = contributors;
    }

    @Override
    protected void postProcessConfiguration(Configuration<?> cfg) {
        if (cfg instanceof HibernateValidatorConfiguration) {
            HibernateValidatorConfiguration configuration = (HibernateValidatorConfiguration) cfg;
            this.contributors.forEach(contributor -> contributor.createConstraintMappings(() -> {
                DefaultConstraintMapping mapping = new DefaultConstraintMapping();
                configuration.addMapping(mapping);
                return mapping;
            }));
        }
    }
}
I invoke it like this...
if (SpringValidatorAdapter.class.isInstance(this.validatorFactory)) {
    SpringValidatorAdapter.class.cast(this.validatorFactory).validate(entity, errors);
}
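To make the auto-discovery concrete, a contributor then only needs to be a Spring bean. The example below is hypothetical (the User type, its "email" field, and the NotNullDef constraint are illustrative names, and it assumes a Hibernate Validator version whose programmatic API exposes field()):
import org.hibernate.validator.cfg.defs.NotNullDef;
import org.hibernate.validator.spi.cfg.ConstraintMappingContributor;
import org.springframework.stereotype.Component;

// Picked up by ValidationConfiguration above because it is a @Component
// implementing ConstraintMappingContributor; no validation.xml entry needed.
@Component
public class UserConstraintMappingContributor implements ConstraintMappingContributor {

    @Override
    public void createConstraintMappings(ConstraintMappingBuilder builder) {
        builder.addConstraintMapping()
                .type(User.class)
                .field("email")
                .constraint(new NotNullDef());
    }
}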
