How to Read a File from a PersistentVolumeClaim in Spring Boot

I have "file.txt" in my OpenShift PersistentVolumeClaim.
My StorageClass is NFS.
Is it possible to access and read "file.txt" from my Spring Boot code?
(This Spring Boot code will be deployed on OpenShift, and I'll mount the PVC into the DeploymentConfig.)
If yes, how can I do that? (I'm confused about how to retrieve the file from the persistent volume inside the code.)
Any help will be appreciated. Thanks.

What @larsks said in the comment is correct:
your application doesn't know or care about the fact that you're using a persistent volume
However, I tried:
Resource resource = new ClassPathResource("/pvc/mount/path/file.txt");
InputStream inputStream = resource.getInputStream();
and:
Resource resource = resourceLoader.getResource("/pvc/mount/path/file.txt");
InputStream inputStream = resource.getInputStream();
Neither works: both resolve the path against the classpath, while a PVC mount is an ordinary filesystem path (with resourceLoader you would need the "file:" prefix).
Below is what I used in the end:
Path file = Paths.get("/pvc/mount/path/file.txt");
InputStream stream = Files.newInputStream(file);
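For completeness, a fuller sketch of the working java.nio approach, with an explicit readability check so a missing mount fails with a clear message (the mount path /pvc/mount/path is the example path from the question; adjust it to match your volumeMount):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class PvcFileReader {

    // Read a file from the path where the PVC is mounted inside the container.
    // The container sees the mount as a plain directory, so plain java.nio works.
    static String readMountedFile(String mountPath) throws IOException {
        Path file = Paths.get(mountPath);
        if (!Files.isReadable(file)) {
            throw new IOException("File not found or not readable: " + file);
        }
        return new String(Files.readAllBytes(file));
    }

    public static void main(String[] args) throws IOException {
        // Path from the question; only read it if the volume is actually mounted.
        Path file = Paths.get("/pvc/mount/path/file.txt");
        if (Files.isReadable(file)) {
            System.out.println(readMountedFile(file.toString()));
        }
    }
}
```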

Related

Files not found in JAR file

I am trying to find a folder containing PDFs. Everything works when the backend and frontend are not packaged as a JAR file.
My question is: how do you locate files if they are in a JAR? All I am receiving is a FileNotFoundException.
I tried adding the file to my resources folder, but still nothing.
What am I missing?
This can be accomplished in a number of ways; here are a couple:
// get the resource as a URL
URL resource = getClass().getClassLoader().getResource(
        "static/test/TrainingDocuments/SuperUser/2/PartnerInformation/PartnerInformation.pdf");
// get the resource as a ClassPathResource
ClassPathResource classPathResource = new ClassPathResource(
        "static/test/TrainingDocuments/SuperUser/2/PartnerInformation/PartnerInformation.pdf");
// get the resource directly as a stream
InputStream resourceAsStream = getClass().getClassLoader()
        .getResourceAsStream("static/test/TrainingDocuments/SuperUser/2/PartnerInformation/PartnerInformation.pdf");
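Note that once the app runs from a JAR, none of these handles can be converted to a java.io.File (which is what typically causes the FileNotFoundException); the bytes have to be read through the stream. A sketch of reading a classpath resource fully into memory, using only the JDK:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class JarResourceReader {

    // Read a classpath resource fully into a byte array. Works both from
    // exploded classes and from inside a JAR, because it never attempts
    // to resolve the resource to a java.io.File.
    static byte[] readResource(String path) throws IOException {
        try (InputStream in = JarResourceReader.class.getClassLoader()
                .getResourceAsStream(path)) {
            if (in == null) {
                throw new IOException("Resource not found on classpath: " + path);
            }
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
            return out.toByteArray();
        }
    }
}
```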

How to copy/access a file in Kubernetes

We are working on a Spring Boot microservice which has a file in the folder file/Service.wsdl.
We access the file using:
@Value("file:file/Service.wsdl")
String WSDL_LOCATION;
In Kubernetes, we are getting "file/directory not found". How do we solve this?
You can try @Value("${file:file/Service.wsdl}")
String WSDL_LOCATION;
Alternatively, you can create a ConfigMap from this file and use the path it is mounted at.
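Besides the ConfigMap route, if the WSDL is packaged under src/main/resources (so it travels inside the JAR and the image), it can be injected as a classpath resource instead of a filesystem path; a sketch, with illustrative field names:

```java
// Sketch: assumes file/Service.wsdl sits under src/main/resources,
// so it is on the classpath inside the container.
@Value("classpath:file/Service.wsdl")
private Resource wsdlResource;

// Later, read it through the stream (this also works from inside a JAR):
// InputStream in = wsdlResource.getInputStream();
```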

Spring Kafka not able to read truststore file from classpath

I am building a Kafka consumer app which needs SASL_SSL config. Somehow Apache Kafka is not recognizing a truststore file located on the classpath, and it looks like there is an open enhancement request for this in Kafka (KAFKA-7685).
In the meantime, what would be the best way to solve this problem? The same app needs to be deployed on PCF too, so the solution should work both during local Windows-based development and on PCF (Linux).
Any solution would be highly appreciated.
Here is the code which copies the file to the Java temp dir:
String tempDirPath = System.getProperty("java.io.tmpdir");
System.out.println("Temp dir : " + tempDirPath);
File truststoreConf = ResourceUtils.getFile("classpath:Truststore.jks");
// use the (parent, child) constructor so a path separator is always inserted
File truststoreFile = new File(tempDirPath, truststoreConf.getName());
FileUtils.copyFile(truststoreConf, truststoreFile);
System.setProperty("ssl.truststore.location", truststoreFile.getAbsolutePath());
You could use a ClassPathResource and FileCopyUtils to copy it from the jar to a file in a temporary directory in main() before creating the SpringApplication.
The root cause of this issue was that Maven resource filtering was enabled. During filtering, Maven corrupts binary files. So if you have it enabled, disable it (or exclude the .jks file from filtering).
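Putting the copy-to-temp suggestion together as a sketch using only the JDK (no Spring FileCopyUtils or ResourceUtils, so it also works when the app runs from a JAR on PCF); Truststore.jks is the resource name from the question:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class TruststoreExtractor {

    // Copy a stream to a temp file and return its path. The Kafka client only
    // accepts a filesystem path for ssl.truststore.location, so a resource
    // packed inside the JAR must be materialized as a real file first.
    static Path copyToTemp(InputStream in, String suffix) throws IOException {
        Path target = Files.createTempFile("truststore", suffix);
        Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        return target;
    }

    // Resolve the truststore from the classpath; works in exploded form,
    // inside a JAR, and on PCF, since it never resolves to a java.io.File.
    static Path extractTruststore(String resourceName) throws IOException {
        try (InputStream in = TruststoreExtractor.class.getClassLoader()
                .getResourceAsStream(resourceName)) {
            if (in == null) {
                throw new IOException("Resource not found on classpath: " + resourceName);
            }
            return copyToTemp(in, ".jks");
        }
    }

    public static void main(String[] args) {
        try {
            // Extract before Kafka is configured, e.g. in main() before
            // SpringApplication.run(...), as the answer above suggests.
            Path truststore = extractTruststore("Truststore.jks");
            System.setProperty("ssl.truststore.location",
                    truststore.toAbsolutePath().toString());
        } catch (IOException e) {
            System.err.println("Truststore not extracted: " + e.getMessage());
        }
    }
}
```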

Updating Tomcat WAR file on Windows

I have inherited a Maven/Tomcat (8.5) web application that is locally hosted on different machines. I am trying to get the application to update more reliably on Windows from a WAR file that is either stored locally (on a USB drive) or downloaded from AWS. On the Linux version, the software is able to upload the new WAR file via curl, using the manager-script role, by doing the following:
String url = "http://admin:secret@localhost:8080/manager/text/deploy?path=/foo&update=true";
CommandRunner.executeCommand(new String[] { "curl", "--upload-file", war, url });
For Windows, the current method tries to copy the WAR file into the /webapps directory and has Tomcat auto-deploy the new WAR after restarting either Tomcat or the host machine. The problem is that the copied WAR file ends up being 0 KB, so there is nothing to deploy. This appears to happen outside of my install function, because the file size after FileUtils.copyFile() shows /webapps/foo.war at the correct size.
I have tried to implement a PUT request for the manager-script role, based on the Tomcat manager docs and another post:
File warfile = new File(request.getWar());
String warpath = warfile.getAbsolutePath().replace("\\", "/");
// closes other threads
App.terminate();
String url = "http://admin:secret@localhost:8080/manager/text/deploy?path=/foo&war=file:" + warpath + "&update=true";
HttpClient client = HttpClientBuilder.create().build();
HttpPut put;
try {
    put = new HttpPut(url);
    HttpResponse response = client.execute(put);
    BufferedReader rd = new BufferedReader(
            new InputStreamReader(response.getEntity().getContent()));
    StringBuffer result = new StringBuffer();
    String line = "";
    while ((line = rd.readLine()) != null) {
        result.append(line);
    }
    System.err.println(result.toString());
} catch (Exception e) {
    LOGGER.info("war install failed");
    throw new ServiceException("Unable to upload war file.", e);
}
However, the process still ends up with a 0 KB WAR file in /webapps. The only error in my log is when Tomcat tries to deploy the empty WAR. I have tried relaxing file read/write permissions in my Tomcat installation, and it doesn't seem to help. I also don't know whether I can guarantee that all Windows installations will have access to curl if I try that implementation. Has anyone run into similar issues? Thank you for the help!
Will, you are opening the file for sure, but you are not writing anything to it; the result string is written to stdout. You need to replace System.err.println(result.toString()); with:
fr = new FileWriter(file);
br = new BufferedWriter(fr);
br.write(result.toString());
And don't forget to close your resources at the end (br.close(); fr.close();). :)
PS: check whether there is some information being filtered out (headers or stuff like that).
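One likely difference from the Linux path: curl --upload-file sends the WAR bytes as the body of the PUT request, while the HttpPut above is sent with no body, and the war=file: parameter only works when the file is already on the server's filesystem. A sketch of replicating curl's behavior with plain HttpURLConnection (the manager URL and admin/secret credentials are the placeholders from the question; note HttpURLConnection ignores user-info embedded in the URL, so Basic auth goes in a header):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class WarUploader {

    // PUT the WAR bytes as the request body, the way `curl --upload-file` does.
    // managerUrl is e.g. http://localhost:8080/manager/text/deploy?path=/foo&update=true
    // (no war=file: parameter; the body carries the archive).
    static int uploadWar(String managerUrl, Path war, String user, String password)
            throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(managerUrl).openConnection();
        conn.setRequestMethod("PUT");
        conn.setDoOutput(true);
        String token = Base64.getEncoder()
                .encodeToString((user + ":" + password).getBytes());
        conn.setRequestProperty("Authorization", "Basic " + token);
        conn.setFixedLengthStreamingMode(Files.size(war));
        try (OutputStream out = conn.getOutputStream()) {
            Files.copy(war, out);  // stream the WAR file as the PUT body
        }
        int status = conn.getResponseCode();
        try (InputStream in = conn.getInputStream()) {
            // drain the manager's text response,
            // e.g. "OK - Deployed application at context path /foo"
            while (in.read() != -1) {
            }
        }
        conn.disconnect();
        return status;
    }
}
```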

How to make Terraform archive_file resource pick up changes to source files?

Using TF 0.7.2 on a Win 10 machine.
I'm trying to set up an edit/upload cycle for development of my lambda functions in AWS, using the new "archive_file" resource introduced in TF 0.7.1
My configuration looks like this:
resource "archive_file" "cloudwatch-sumo-lambda-archive" {
  source_file = "${var.lambda_src_dir}/cloudwatch/cloudwatchSumologic.js"
  output_path = "${var.lambda_gen_dir}/cloudwatchSumologic.zip"
  type        = "zip"
}

resource "aws_lambda_function" "cloudwatch-sumo-lambda" {
  function_name    = "cloudwatch-sumo-lambda"
  description      = "managed by source project"
  filename         = "${archive_file.cloudwatch-sumo-lambda-archive.output_path}"
  source_code_hash = "${archive_file.cloudwatch-sumo-lambda-archive.output_sha}"
  handler          = "cloudwatchSumologic.handler"
  ...
}
This works the first time I run it - TF creates the lambda zip file, uploads it and creates the lambda in AWS.
The problem comes with updating the lambda.
If I edit the cloudwatchSumologic.js file in the above example, TF doesn't appear to know that the source file has changed - it doesn't add the new file to the zip and doesn't upload the new lambda code to AWS.
Am I doing something wrong in my configuration, or is the archive_file resource not meant to be used in this way?
You could be seeing a bug. I'm on 0.7.7, and the issue now is that the SHA changes even when you don't make changes. HashiCorp will be converting this resource to a data source in 0.7.8:
https://github.com/hashicorp/terraform/pull/8492
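For reference, once that change lands, the data-source form of the same configuration looks roughly like this (a sketch; output_base64sha256 is the attribute the archive provider documents for feeding source_code_hash, so verify the names against your Terraform version):

```hcl
data "archive_file" "cloudwatch-sumo-lambda-archive" {
  type        = "zip"
  source_file = "${var.lambda_src_dir}/cloudwatch/cloudwatchSumologic.js"
  output_path = "${var.lambda_gen_dir}/cloudwatchSumologic.zip"
}

resource "aws_lambda_function" "cloudwatch-sumo-lambda" {
  function_name    = "cloudwatch-sumo-lambda"
  description      = "managed by source project"
  filename         = "${data.archive_file.cloudwatch-sumo-lambda-archive.output_path}"
  source_code_hash = "${data.archive_file.cloudwatch-sumo-lambda-archive.output_base64sha256}"
  handler          = "cloudwatchSumologic.handler"
  # ...
}
```

Because a data source is read on every plan, the zip is re-evaluated each run, which is what makes edits to the source file propagate to the Lambda.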
