I am using some methods of the Files class (such as delete and copy) to upload and delete files. Below is the code that performs these operations.
public String uploadFile(MultipartFile file) {
    try {
        String fileName = file.getOriginalFilename();
        // Copy the file to the target location (replacing an existing file with the same name)
        Path targetLocation = Paths.get("uploadPath" + File.separator + StringUtils.cleanPath(fileName));
        Files.copy(file.getInputStream(), targetLocation, StandardCopyOption.REPLACE_EXISTING);
        return fileName;
    } catch (IOException ex) {
        throw new FileStorageException("Not able to upload", ex);
    }
}
But I am not able to write JUnit tests for this code because I cannot mock the Files class. For mocking final classes we can use PowerMock, which supports mocking static and final methods, but even with PowerMock the mocking does not work here. I am using Spring Framework 5.2.1.RELEASE. Is there any change in JUnit with this version that affects mocking final classes or methods? Or can anyone help me write unit tests for this code (versions I am using: Spring Framework 5.2.1 and JUnit 4.12)?
Mocking static and final classes is indeed possible only with tools like PowerMock or PowerMockito, and it is not related to JUnit or the Spring Framework.
I think you should not mock the Files.copy operation.
Instead, consider the following strategy:
Define an interface for working with files, a kind of DAO but for the file system:
public interface FileSystemDAO {
    void copy(InputStream is, Path target, StandardCopyOption... options) throws IOException;
}

public class FileSystemDAOImpl implements FileSystemDAO {
    @Override
    public void copy(InputStream is, Path target, StandardCopyOption... options) throws IOException {
        Files.copy(is, target, options);
    }
}
Now use dependency injection in all the places that work with files (since you are using Spring, as you've said, define FileSystemDAOImpl as a bean).
class MySampleUploadService {

    private final FileSystemDAO fileSystemDao;

    public MySampleUploadService(FileSystemDAO dao) {
        this.fileSystemDao = dao;
    }

    public String uploadFile(MultipartFile file) {
        try {
            String fileName = file.getOriginalFilename();
            // Copy the file to the target location (replacing an existing file with the same name)
            Path targetLocation = Paths.get("uploadPath" + File.separator +
                    StringUtils.cleanPath(fileName));
            fileSystemDao.copy(file.getInputStream(), targetLocation, StandardCopyOption.REPLACE_EXISTING);
            return fileName;
        } catch (IOException ex) {
            throw new FileStorageException("Not able to upload", ex);
        }
    }
}
Now with this approach you can easily test the upload service by mocking the FileSystemDAO interface.
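For example, here is a minimal sketch of such a test, assuming JUnit 4 and Mockito 2.x are on the classpath (the test and file names are just illustrative):

import static org.junit.Assert.assertEquals;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

import org.junit.Test;
import org.springframework.web.multipart.MultipartFile;

public class MySampleUploadServiceTest {

    @Test
    public void uploadFileDelegatesCopyToFileSystemDao() throws Exception {
        FileSystemDAO fileSystemDao = mock(FileSystemDAO.class);
        MultipartFile file = mock(MultipartFile.class);
        InputStream content = new ByteArrayInputStream("hello".getBytes());
        when(file.getOriginalFilename()).thenReturn("test.txt");
        when(file.getInputStream()).thenReturn(content);

        MySampleUploadService service = new MySampleUploadService(fileSystemDao);
        String result = service.uploadFile(file);

        assertEquals("test.txt", result);
        // The copy must have been delegated to the DAO with the REPLACE_EXISTING option
        verify(fileSystemDao).copy(eq(content), any(Path.class), eq(StandardCopyOption.REPLACE_EXISTING));
    }
}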
I want to upload a multipart file to AWS S3, so I have to convert it first.
But the new File(...) constructor needs a local location to create the file.
I am able to do this locally, but running this code on every machine seems like an issue.
Please find both scenarios below.
Working
private File convertMultiPartToFile(MultipartFile multipartFile) throws IOException {
    File convFile = new File("C:\\Users\\" + multipartFile.getOriginalFilename());
    multipartFile.transferTo(convFile);
    return convFile;
}
Not working
private File convertMultiPartToFile(MultipartFile multipartFile) throws IOException {
    File convFile = new File(multipartFile.getOriginalFilename());
    multipartFile.transferTo(convFile);
    return convFile;
}
Error received :
java.io.FileNotFoundException: newbusiness.jpg (Access is denied)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at java.io.FileOutputStream.<init>(FileOutputStream.java:162)
You could use Spring Content S3. This will hide the implementation details so you don't need to worry about them.
There are Spring Boot starter alternatives, but as you are not using Spring Boot, add the following dependency to your pom.xml:
pom.xml
<dependency>
    <groupId>com.github.paulcwarren</groupId>
    <artifactId>spring-content-s3</artifactId>
    <version>0.0.11</version>
</dependency>
Add the following configuration that creates a SimpleStorageResourceLoader bean:
@Configuration
@EnableS3Stores
public class S3Config {

    @Autowired
    private Environment env;

    public Region region() {
        return Region.getRegion(Regions.fromName(env.getProperty("AWS_REGION")));
    }

    @Bean
    public BasicAWSCredentials basicAWSCredentials() {
        return new BasicAWSCredentials(env.getProperty("AWS_ACCESS_KEY_ID"), env.getProperty("AWS_SECRET_KEY"));
    }

    @Bean
    public AmazonS3 client(AWSCredentials awsCredentials) {
        AmazonS3Client amazonS3Client = new AmazonS3Client(awsCredentials);
        amazonS3Client.setRegion(region());
        return amazonS3Client;
    }

    @Bean
    public SimpleStorageResourceLoader simpleStorageResourceLoader(AmazonS3 client) {
        return new SimpleStorageResourceLoader(client);
    }
}
Create a "Store":
S3Store.java
public interface S3Store extends Store<String> {
}
Autowire this store into where you need to upload resources:
@Autowired
private S3Store store;

WritableResource r = (WritableResource) store.getResource(getId());
InputStream is = // plug your input stream in here
OutputStream os = r.getOutputStream();
IOUtils.copy(is, os);
is.close();
os.close();
When your application starts it will see the dependency on spring-content-s3 and your S3Store interface and inject an implementation for you; therefore, you don't need to worry about implementing this yourself.
If you are writing some sort of web application or microservice and you need a REST API, then you can also add this dependency:
<dependency>
    <groupId>com.github.paulcwarren</groupId>
    <artifactId>spring-content-rest</artifactId>
    <version>0.0.11</version>
</dependency>
Update your S3Config.java as follows:
@Configuration
@EnableS3Stores
@Import(RestConfiguration.class)
public class S3Config {
...
Update your store as follows:
S3Store.java
@StoreRestResource(path="s3docs")
public interface S3Store extends Store<String> {
}
Now when your application starts it will see your Store interface and also inject an @Controller implementation that will forward REST requests onto your store. This replaces the autowiring code above, obviously.
Then:
curl -X POST /s3docs/example-doc
with a multipart/form-data request will store the image in s3.
curl /s3docs/example-doc
will fetch it again and so on. This controller supports full CRUD and video streaming by the way.
If you want to associate this "content" with a JPA entity or something like that, then you can have your S3Store extend AssociateStore or ContentStore, and you will have additional methods available that provide for associations.
There are a couple of getting started guides here. The S3 reference guide is here. And there is a tutorial video here. The coding bit starts about halfway through.
HTH
Since it needs a temporary location to place files, the code below worked after deploying the WAR on AWS.
private File convertMultiPartToFile(MultipartFile multipartFile) throws IOException {
    File convFile = new File(System.getProperty("java.io.tmpdir") + System.getProperty("file.separator") +
            multipartFile.getOriginalFilename());
    multipartFile.transferTo(convFile);
    return convFile;
}
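If you also want to avoid name clashes between concurrent uploads, a variation on the same idea (not part of the original answer, just a sketch assuming java.nio.file.Files and Path are imported) is to create a fresh temporary directory per call:

private File convertMultiPartToFile(MultipartFile multipartFile) throws IOException {
    // A per-call temp directory keeps files with the same original name from overwriting each other
    Path tempDir = Files.createTempDirectory("uploads");
    File convFile = tempDir.resolve(multipartFile.getOriginalFilename()).toFile();
    multipartFile.transferTo(convFile);
    return convFile;
}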
You have a problem with relative paths.
You can do this:
public class UploadStackoverflow {

    private String location = "upload-dir";
    private Path rootLocation;

    public File convertFile(MultipartFile file) throws IOException {
        rootLocation = Paths.get(location);
        Files.createDirectories(rootLocation);

        String filename = StringUtils.cleanPath(file.getOriginalFilename());
        InputStream inputStream = file.getInputStream();
        Files.copy(inputStream, this.rootLocation.resolve(filename),
                StandardCopyOption.REPLACE_EXISTING);
        return new File(this.rootLocation.resolve(filename).toAbsolutePath().toString());
    }
}
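A quick way to exercise this, assuming spring-test's MockMultipartFile is on the classpath (the file name and contents here are made up):

MockMultipartFile multipartFile = new MockMultipartFile(
        "file", "photo.jpg", "image/jpeg", "fake image bytes".getBytes());

File converted = new UploadStackoverflow().convertFile(multipartFile);
// The file ends up under <working directory>/upload-dir/photo.jpg
System.out.println(converted.getAbsolutePath());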
I am upgrading Spring 2.5 to 4.2.
The issue is with one bean which has a property of type org.springframework.core.io.ClassPathResource. The resource value is defined in XML as p:location="classpath:/<the resource path>".
This worked perfectly and the bean property was populated with the resource, but in 4.2 the value is not getting set.
So I debugged the code and found that in Spring 2.5 the class org.springframework.beans.BeanWrapperImpl was manipulating the value and removing the classpath: prefix from the actual value.
However, the same is not true in 4.2: BeanWrapperImpl isn't modifying the value, which results in Spring not finding the resource.
Has anyone faced a similar situation? What solution did you apply?
Thanks,
Hanumant
EDIT 1: code sample
spring config file
<bean class="com.test.sample.TestBean" id="testBean"
      p:schemaLocation="classpath:/com/test/sample/Excalibur_combined.xsd" />
TestBean.java
public class TestBean {

    private ClassPathResource schemaLocation;

    public ClassPathResource getSchemaLocation() {
        return schemaLocation;
    }

    public void setSchemaLocation(ClassPathResource schemaLocation) {
        this.schemaLocation = schemaLocation;
    }
}
App.java
public class App {
    public static void main(String[] args) {
        ApplicationContext ap = new ClassPathXmlApplicationContext("classpath:/com/test/sample/spring-config.xml");
        TestBean tb = (TestBean) ap.getBean("testBean");
        try {
            URL url = tb.getSchemaLocation().getURL();
            System.out.println(url);
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
Error Message
INFO: Loading XML bean definitions from class path resource [com/test/sample/spring-config.xml]
java.io.FileNotFoundException: class path resource [classpath:/com/test/sample/Excalibur_combined.xsd] cannot be resolved to URL because it does not exist
    at org.springframework.core.io.ClassPathResource.getURL(ClassPathResource.java:187)
    at com.test.sample.App.main(App.java:20)
However, if I remove the classpath: prefix from the bean definition, it works.
So is classpath: necessary in the bean definition XML file? And why was it working fine in Spring 2.5?
The main issue is that you aren't programming to interfaces. Instead of the concrete org.springframework.core.io.ClassPathResource, use org.springframework.core.io.Resource. When you do, the org.springframework.core.io.ResourceEditor will kick in and convert the String into a Resource instance. The location you are providing, classpath:/<the resource path>, will be passed to the ResourceLoader, which will get the resource or throw an error if it doesn't exist.
If, however, you use the concrete type ClassPathResource directly, this mechanism doesn't kick in and the location is set to exactly what you provide: classpath:/<the resource path>. However, this is not a valid location for the URL class, and it eventually fails with the message you see.
It worked in earlier versions due to a hack/workaround/patch in BeanWrapperImpl that stripped the prefix.
Basically it now fails because you were doing things you shouldn't have been doing in the first place.
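Concretely, for the TestBean shown above it is enough to declare the property against the interface; a minimal sketch:

import org.springframework.core.io.Resource;

public class TestBean {

    // Spring's ResourceEditor converts "classpath:/com/test/sample/Excalibur_combined.xsd"
    // into a Resource automatically when the property is declared against the interface.
    private Resource schemaLocation;

    public Resource getSchemaLocation() {
        return schemaLocation;
    }

    public void setSchemaLocation(Resource schemaLocation) {
        this.schemaLocation = schemaLocation;
    }
}

The XML bean definition stays exactly as it is.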
While using Jasypt in Spring, the logger level in the loadProperties method of the PropertiesLoaderSupport class is set to INFO:
protected void loadProperties(Properties props) throws IOException {
    if (this.locations != null) {
        for (Resource location : this.locations) {
            if (logger.isInfoEnabled()) {
                logger.info("Loading properties file from " + location);
            }
            // ...
The above check returns true while using Jasypt, whereas it returns false with plain Spring, which is why I get a lot of unwanted log messages.
Can somebody suggest how I can make it return false through configuration?
I am using log4j for my application as well.
Thanks
I was able to find the answer after a lot of searching. To resolve this issue, we have to set the logger properties for Jasypt in the logging configuration used by the application. For example, in log4j.properties:
log4j.logger.org.jasypt=ERROR
That will resolve the issue.
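If you are using a log4j.xml configuration instead of log4j.properties, the equivalent entry should look roughly like this (a sketch; adjust to your own configuration file):

<logger name="org.jasypt">
    <level value="ERROR" />
</logger>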
All
I created a jar file with the following MANIFEST.MF inside:
Manifest-Version: 1.0
Ant-Version: Apache Ant 1.8.3
Created-By: 1.6.0_25-b06 (Sun Microsystems Inc.)
Main-Class: my.Main
Class-Path: . lib/spring-core-3.2.0.M2.jar lib/spring-beans-3.2.0.M2.jar
In its root there is a file called my.config which is referenced in my spring-context.xml like this:
<bean id="..." class="...">
    <property name="resource" value="classpath:my.config" />
</bean>
If I run the jar, everything looks fine except the loading of that specific file:
Caused by: java.io.FileNotFoundException: class path resource [my.config] cannot be resolved to absolute file path because it does not reside in the file system: jar:file:/D:/work/my.jar!/my.config
at org.springframework.util.ResourceUtils.getFile(ResourceUtils.java:205)
at org.springframework.core.io.AbstractFileResolvingResource.getFile(AbstractFileResolvingResource.java:52)
at eu.stepman.server.configuration.BeanConfigurationFactoryBean.getObject(BeanConfigurationFactoryBean.java:32)
at eu.stepman.server.configuration.BeanConfigurationFactoryBean.getObject(BeanConfigurationFactoryBean.java:1)
at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:142)
... 22 more
classes are loaded from inside the jar
spring and other dependencies are loaded from separate jars
the spring context is loaded (new ClassPathXmlApplicationContext("spring-context/applicationContext.xml"))
my.properties is loaded into a PropertyPlaceholderConfigurer ("classpath:my.properties")
if I put my .config file outside the jar, on the file system, and change the resource URL to 'file:', everything seems to be fine...
Any tips?
If your spring-context.xml and my.config files are in different jars, then you may need to use classpath*:my.config.
More info here
Also, make sure you are using resource.getInputStream() not resource.getFile() when loading from inside a jar file.
When packaged in a Spring jar, I use new ClassPathResource(filename).getFile(), which throws the exception:
cannot be resolved to absolute file path because it does not reside in the file system: jar
But using new ClassPathResource(filename).getInputStream() solves this problem. The reason is that the configuration file in the jar does not exist in the operating system's file tree, so you must use getInputStream().
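A minimal sketch of reading such a resource via the stream (the file name and charset are examples; StreamUtils comes from org.springframework.util):

ClassPathResource resource = new ClassPathResource("my.config");
try (InputStream in = resource.getInputStream()) {
    // Works whether my.config is on the file system or packaged inside a jar
    String contents = StreamUtils.copyToString(in, StandardCharsets.UTF_8);
    // ... use contents
}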
I know this question has already been answered. However, for those using Spring Boot, this link helped me: https://smarterco.de/java-load-file-classpath-spring-boot/
The resourceLoader.getResource("classpath:file.txt").getFile() call was causing this problem, and sbk's comment:
That's it. A java.io.File represents a file on the file system, in a directory structure. The Jar is a java.io.File. But anything within that file is beyond the reach of java.io.File. As far as java is concerned, until it is uncompressed, a class in jar file is no different than a word in a word document.
helped me understand why to use getInputStream() instead. It works for me now!
Thanks!
The error message is correct (if not very helpful): the file we're trying to load is not a file on the filesystem, but a chunk of bytes in a ZIP inside a ZIP.
Through experimentation (Java 11, Spring Boot 2.3.x), I found this to work without changing any config or even a wildcard:
var resource = ResourceUtils.getURL("classpath:some/resource/in/a/dependency");
new BufferedReader(
new InputStreamReader(resource.openStream())
).lines().forEach(System.out::println);
I had a similar problem when using Tomcat 6.x, and none of the advice I found was helping.
In the end I deleted Tomcat's work folder and the problem was gone.
I know it sounds illogical, but I am noting it for documentation purposes...
I was having an issue recursively loading resources in my Spring app, and found that the issue was that I should be using resource.getInputStream(). Here's an example showing how to recursively read all JSON files in config/myfiles.
Example.java
private String myFilesResourceUrl = "config/myfiles/**/";
private String myFilesResourceExtension = "json";

ResourceLoader rl = new ResourceLoader();

// Recursively get resources that match.
// Big note: if you decide to iterate over these,
// use resource.getInputStream() to load the contents,
// or use the readResource method of the ResourceLoader class below.
Resource[] resources = rl.getResourcesInResourceFolder(myFilesResourceUrl, myFilesResourceExtension);

// Recursively get resources and their contents that match.
// This loads all the files into memory, so maybe use the same approach
// as this method, if need be.
Map<Resource, String> contents = rl.getResourceContentsInResourceFolder(myFilesResourceUrl, myFilesResourceExtension);
ResourceLoader.java
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.Charset;
import java.util.HashMap;
import java.util.Map;

import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;
import org.springframework.core.io.support.ResourcePatternResolver;
import org.springframework.util.StreamUtils;

public class ResourceLoader {

    public Resource[] getResourcesInResourceFolder(String folder, String extension) {
        ResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();
        try {
            String resourceUrl = folder + "/*." + extension;
            Resource[] resources = resolver.getResources(resourceUrl);
            return resources;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public String readResource(Resource resource) throws IOException {
        try (InputStream stream = resource.getInputStream()) {
            return StreamUtils.copyToString(stream, Charset.defaultCharset());
        }
    }

    public Map<Resource, String> getResourceContentsInResourceFolder(
            String folder, String extension) {
        Resource[] resources = getResourcesInResourceFolder(folder, extension);

        HashMap<Resource, String> result = new HashMap<>();
        for (var resource : resources) {
            try {
                String contents = readResource(resource);
                result.put(resource, contents);
            } catch (IOException e) {
                throw new RuntimeException("Could not load resource=" + resource + ", e=" + e);
            }
        }
        return result;
    }
}
For kotlin users, I solved it like this:
val url = ResourceUtils.getURL("classpath:$fileName")
val response = url.openStream().bufferedReader().readText()
The answer by @sbk is the way we should do it in a Spring Boot environment (apart from @Value("${classpath*:}")), in my opinion. But in my scenario it was not working when executed from a standalone jar; maybe I did something wrong.
But this can be another way of doing it:
InputStream is = this.getClass().getClassLoader().getResourceAsStream(<relative path of the resource from the resource directory>);
I was having a more complex issue because I have more than one file with the same name: one is in the main Spring Boot jar and the others are in jars inside the main fat jar.
My solution was to get all the resources with the same name and then pick the one I needed by filtering on the package name.
To get all the files:
ResourceLoader resourceLoader = new FileSystemResourceLoader();
final Enumeration<URL> systemResources = resourceLoader.getClassLoader().getResources(fileNameWithoutExt + FILE_EXT);
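Then, to pick the right one, you can iterate over the returned URLs and keep the one whose path contains the jar or package you expect (the "my-module" string below is just a placeholder):

URL selected = null;
while (systemResources.hasMoreElements()) {
    URL candidate = systemResources.nextElement();
    // Keep the resource that lives in the jar/package we are interested in
    if (candidate.toString().contains("my-module")) {
        selected = candidate;
        break;
    }
}
if (selected != null) {
    try (InputStream in = selected.openStream()) {
        // read the file contents here
    }
}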
In Spring Boot 1.5.22.RELEASE with jar packaging, this worked for me:
InputStream resource = new ClassPathResource("example.pdf").getInputStream();
"example.pdf" is in src/main/resources.
And then to read it as byte[]
FileCopyUtils.copyToByteArray(resource);
I had the same issue and ended up using the much more convenient Guava Resources:
Resources.getResource("my.file")
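And if you need the contents rather than just the URL, Guava can read it in one line too (the charset here is an assumption):

String contents = Resources.toString(Resources.getResource("my.file"), StandardCharsets.UTF_8);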
While this is a very old thread, I also faced the same issue while adding FCM to a Spring Boot application.
In development the file opened without errors, but when I deployed the application to AWS Elastic Beanstalk, a FileNotFoundException was thrown and FCM was not working.
So here's my solution to get it working in both the development environment and a jar deployment in production.
I have a Component class FCMService which has a method as follows:
@PostConstruct
public void initialize() {
    log.info("Starting FCM Service");
    InputStream inputStream;
    try {
        // ClassPathResource takes a path relative to the classpath root (no "classpath:" prefix)
        ClassPathResource resource = new ClassPathResource("fcm/my_project_firebase_config.json");
        URL url = null;
        try {
            url = resource.getURL();
        } catch (IOException e) {
        }
        if (url != null) {
            inputStream = url.openStream();
        } else {
            File file = ResourceUtils.getFile("classpath:fcm/my_project_firebase_config.json");
            inputStream = new FileInputStream(file);
        }
        FirebaseOptions options = FirebaseOptions.builder().setCredentials(GoogleCredentials.fromStream(inputStream))
                .build();
        FirebaseApp.initializeApp(options);
        log.info("FCM Service started");
    } catch (IOException e) {
        log.error("Error starting FCM Service");
        e.printStackTrace();
    }
}
Hope this helps someone looking for a quick fix with implementing FCM.
Can be handled like:
var serviceAccount = ClassLoader.getSystemResourceAsStream(FB_CONFIG_FILE_NAME);
FirebaseOptions options = new FirebaseOptions.Builder()
.setCredentials(GoogleCredentials.fromStream(serviceAccount))
.build();
Where FB_CONFIG_FILE_NAME is the name of the file in your 'resources' folder.