I am using some methods of the Files class (such as copy and delete) to upload and delete files. Below is the code that performs these operations.
public String uploadFile(MultipartFile file) {
try {
String fileName = file.getOriginalFilename();
// Copy file to the target location (Replacing existing file with the same name)
Path targetLocation = Paths.get("uploadPath" + File.separator + StringUtils.cleanPath(fileName));
Files.copy(file.getInputStream(), targetLocation, StandardCopyOption.REPLACE_EXISTING);
return fileName;
} catch (IOException ex) {
throw new FileStorageException("Not able to upload", ex);
}
}
But I am not able to write JUnit tests for this code because I cannot mock the Files class. Static and final methods can be mocked with PowerMock, but even with PowerMock the mocking does not work here. I am using Spring Framework 5.2.1.RELEASE; is there any change in JUnit with this version that affects mocking final classes or methods? Or can anyone help me write the unit tests for this code (I am using Spring Framework 5.2.1 and JUnit 4.12)?
Mocking static and final methods is indeed possible only with tools like PowerMock or PowerMockito; it is not related to JUnit or Spring.
I think you should not mock the Files.copy operation.
Instead, consider the following strategy:
Define an interface for working with files, a kind of DAO but for the file system:
public interface FileSystemDAO {
    void copy(InputStream is, Path target, StandardCopyOption... options) throws IOException;
}
public class FileSystemDAOImpl implements FileSystemDAO {
    @Override
    public void copy(InputStream is, Path target, StandardCopyOption... options) throws IOException {
        Files.copy(is, target, options);
    }
}
Now use dependency injection in all the places that work with files (since you're using Spring, as you've said, define FileSystemDAOImpl as a bean).
class MySampleUploadService {
private final FileSystemDAO fileSystemDao;
public MySampleUploadService(FileSystemDAO dao) {
this.fileSystemDao = dao;
}
public String uploadFile(MultipartFile file) {
try {
String fileName = file.getOriginalFilename();
// Copy file to the target location (Replacing existing file with the same name)
Path targetLocation = Paths.get("uploadPath" + File.separator +
StringUtils.cleanPath(fileName));
fileSystemDao.copy(file.getInputStream(), targetLocation, StandardCopyOption.REPLACE_EXISTING);
return fileName;
} catch (IOException ex) {
throw new FileStorageException("Not able to upload", ex);
}
}
}
Now with this approach you can easily test the upload service by mocking the FileSystemDAO interface.
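For example, a minimal JUnit 4 + Mockito sketch of such a test (MockMultipartFile comes from spring-test, which you already have with Spring 5.2.1; the class and method names are illustrative, and FileStorageException is the exception type from the question):

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.any;
import static org.mockito.Mockito.doThrow;
import static org.mockito.Mockito.eq;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import org.junit.Test;
import org.springframework.mock.web.MockMultipartFile;

public class MySampleUploadServiceTest {

    private final FileSystemDAO dao = mock(FileSystemDAO.class);
    private final MySampleUploadService service = new MySampleUploadService(dao);

    @Test
    public void uploadDelegatesToFileSystemDao() throws IOException {
        MockMultipartFile file =
                new MockMultipartFile("file", "test.txt", "text/plain", "hello".getBytes());

        String fileName = service.uploadFile(file);

        // The original file name is returned and the copy is delegated to the DAO.
        assertEquals("test.txt", fileName);
        verify(dao).copy(any(InputStream.class), any(Path.class),
                eq(StandardCopyOption.REPLACE_EXISTING));
    }

    @Test(expected = FileStorageException.class)
    public void uploadWrapsIOExceptionInFileStorageException() throws IOException {
        // Simulate a disk failure in the DAO and expect the service to wrap it.
        doThrow(new IOException("disk full"))
                .when(dao).copy(any(InputStream.class), any(Path.class),
                        eq(StandardCopyOption.REPLACE_EXISTING));

        service.uploadFile(new MockMultipartFile("file", "test.txt", "text/plain", "hi".getBytes()));
    }
}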
Is there a way using ResourceLoader to get a list of "sub resources" in a directory in the jar?
For example, given sources
src/main/resources/mydir/myfile1.txt
src/main/resources/mydir/myfile2.txt
and using
@Autowired
private ResourceLoader resourceLoader;
I can get to the directory
Resource dir = resourceLoader.getResource("classpath:mydir")
dir.exists() // true
but not the files within the dir. If I could get the file, I could call dir.getFile().listFiles(), but
dir.getFile() // explodes with FileNotFoundException
But I can't find a way to get the "child" resources.
You can use a ResourcePatternResolver to get all the resources that match a particular pattern. For example:
Resource[] resources = resourcePatternResolver.getResources("/mydir/*.txt")
You can have a ResourcePatternResolver injected in the same way as ResourceLoader.
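For illustration, a minimal sketch of injecting and using it (the MyDirLister class name is made up; the classpath*: prefix also picks up matches that live inside jars on the classpath):

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.ResourcePatternResolver;
import org.springframework.stereotype.Component;

@Component
public class MyDirLister {

    private final ResourcePatternResolver resourcePatternResolver;

    public MyDirLister(ResourcePatternResolver resourcePatternResolver) {
        this.resourcePatternResolver = resourcePatternResolver;
    }

    // Lists the names of all .txt resources under mydir, whether they sit in
    // target/classes during development or inside the packaged jar.
    public List<String> listTextResources() throws IOException {
        Resource[] resources = resourcePatternResolver.getResources("classpath*:mydir/*.txt");
        List<String> names = new ArrayList<>();
        for (Resource resource : resources) {
            names.add(resource.getFilename());
        }
        return names;
    }
}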
Based on Bohemian's comment and another answer, I used the following to get input streams of all the YAMLs under a directory and its sub-directories in resources (note that the path passed doesn't begin with /):
private static Stream<InputStream> getInputStreamsFromClasspath(
String path,
PathMatchingResourcePatternResolver resolver
) {
try {
return Arrays.stream(resolver.getResources("/" + path + "/**/*.yaml"))
.filter(Resource::exists)
.map(resource -> {
try {
return resource.getInputStream();
} catch (IOException e) {
return null;
}
})
.filter(Objects::nonNull);
} catch (IOException e) {
logger.error("Failed to get definitions from directory {}", path, e);
return Stream.of();
}
}
I'm developing a Gradle plugin and I need to read from and write to the project's gradle.properties file. I have tried this:
@TaskAction
public void myAction() {
Properties properties = new Properties();
try {
properties.load(new FileInputStream(getProject().file("gradle.properties")));
} catch (IOException e) {
e.printStackTrace();
}
}
But I get FileNotFoundException. How can I get the file?
I found the error. The Gradle API documentation says that the .file(Object path) method in the Project class:
Resolves a file path relative to the project directory of this
project.
My gradle.properties file was in the root project, not in the module where the plugin is applied.
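For reference, a hedged sketch of resolving the file against the root project instead (the task class name and error handling are illustrative):

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;
import org.gradle.api.DefaultTask;
import org.gradle.api.GradleException;
import org.gradle.api.tasks.TaskAction;

public class ReadRootPropertiesTask extends DefaultTask {

    @TaskAction
    public void myAction() {
        // file() resolves relative to the project the task belongs to, so point
        // at the root project explicitly when gradle.properties lives there.
        File propertiesFile = getProject().getRootProject().file("gradle.properties");
        Properties properties = new Properties();
        try (InputStream in = new FileInputStream(propertiesFile)) {
            properties.load(in);
        } catch (IOException e) {
            throw new GradleException("Could not read " + propertiesFile, e);
        }
    }
}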
I have a property file config.properties in which I store a path that a class loads in order to read files.
The properties are loaded like this:
public class PropertyConfig {
private static final Properties properties = new Properties();
static {
try {
ClassLoader loader = Thread.currentThread().getContextClassLoader();
properties.load(loader.getResourceAsStream("config.properties"));
} catch (IOException e) {
throw new ExceptionInInitializerError(e);
}
}
public static String getSetting(String key) {
return properties.getProperty(key);
}
}
and the call in the relevant class is like this:
private static File savedGamesFolder = new File(PropertyConfig.getSetting("folder_for_saved_games"));
For testing purposes I want to be able to change the path to a test directory, or change the whole property file, in a JUnit test case. How can I achieve this?
I'm using Maven if that helps.
Assuming you have your config.properties in
src/main/resources/config.properties
Note: you should nevertheless have your properties files somewhere in src/main/resources
Place your test configuration in
src/test/resources/config.properties
That's it. No need to change your code.
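For example, assuming src/test/resources/config.properties contains folder_for_saved_games=target/test-saves (an illustrative value), a test like this picks up the test file automatically, because Maven puts test resources ahead of main resources on the test classpath:

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class PropertyConfigTest {

    @Test
    public void loadsTestConfiguration() {
        // Resolved from src/test/resources/config.properties, not src/main/resources.
        assertEquals("target/test-saves", PropertyConfig.getSetting("folder_for_saved_games"));
    }
}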
I created a jar file with the following MANIFEST.MF inside:
Manifest-Version: 1.0
Ant-Version: Apache Ant 1.8.3
Created-By: 1.6.0_25-b06 (Sun Microsystems Inc.)
Main-Class: my.Main
Class-Path: . lib/spring-core-3.2.0.M2.jar lib/spring-beans-3.2.0.M2.jar
In its root there is a file called my.config which is referenced in my spring-context.xml like this:
<bean id="..." class="...">
<property name="resource" value="classpath:my.config" />
</bean>
If I run the jar, everything looks fine except the loading of that specific file:
Caused by: java.io.FileNotFoundException: class path resource [my.config] cannot be resolved to absolute file path because it does not reside in the file system: jar:file:/D:/work/my.jar!/my.config
at org.springframework.util.ResourceUtils.getFile(ResourceUtils.java:205)
at org.springframework.core.io.AbstractFileResolvingResource.getFile(AbstractFileResolvingResource.java:52)
at eu.stepman.server.configuration.BeanConfigurationFactoryBean.getObject(BeanConfigurationFactoryBean.java:32)
at eu.stepman.server.configuration.BeanConfigurationFactoryBean.getObject(BeanConfigurationFactoryBean.java:1)
at org.springframework.beans.factory.support.FactoryBeanRegistrySupport.doGetObjectFromFactoryBean(FactoryBeanRegistrySupport.java:142)
... 22 more
Classes are loaded from inside the jar.
Spring and other dependencies are loaded from separate jars.
The Spring context is loaded (new ClassPathXmlApplicationContext("spring-context/applicationContext.xml")).
my.properties is loaded into a PropertyPlaceholderConfigurer ("classpath:my.properties").
If I put my .config file outside the jar on the file system and change the resource URL to 'file:', everything seems to be fine...
Any tips?
If your spring-context.xml and my.config files are in different jars, you will need to use classpath*:my.config.
Also, make sure you are using resource.getInputStream() not resource.getFile() when loading from inside a jar file.
When running from the packaged jar, I used new ClassPathResource(filename).getFile(), which throws this exception:
cannot be resolved to absolute file path because it does not reside in the file system: jar
But using new ClassPathResource(filename).getInputStream() will solve this problem. The reason is that the configuration file inside the jar does not exist in the operating system's file tree, so you must use getInputStream().
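For illustration, a minimal sketch of reading such a resource via getInputStream() (the my.config name is taken from the question above; StreamUtils is Spring's utility class):

import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import org.springframework.core.io.ClassPathResource;
import org.springframework.util.StreamUtils;

public class ConfigReader {

    public String readMyConfig() throws IOException {
        // getInputStream() works whether my.config is an exploded file on disk
        // or an entry inside the running jar.
        try (InputStream in = new ClassPathResource("my.config").getInputStream()) {
            return StreamUtils.copyToString(in, StandardCharsets.UTF_8);
        }
    }
}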
I know this question has already been answered. However, for those using Spring Boot, this link helped me: https://smarterco.de/java-load-file-classpath-spring-boot/
The resourceLoader.getResource("classpath:file.txt").getFile() call was causing this problem, and sbk's comment:
That's it. A java.io.File represents a file on the file system, in a
directory structure. The Jar is a java.io.File. But anything within
that file is beyond the reach of java.io.File. As far as java is
concerned, until it is uncompressed, a class in jar file is no
different than a word in a word document.
helped me understand why to use getInputStream() instead. It works for me now!
Thanks!
The error message is correct (if not very helpful): the file we're trying to load is not a file on the filesystem, but a chunk of bytes in a ZIP inside a ZIP.
Through experimentation (Java 11, Spring Boot 2.3.x), I found this to work without changing any config or even a wildcard:
var resource = ResourceUtils.getURL("classpath:some/resource/in/a/dependency");
new BufferedReader(
new InputStreamReader(resource.openStream())
).lines().forEach(System.out::println);
I had a similar problem when using Tomcat 6.x and none of the advice I found helped.
In the end I deleted Tomcat's work folder and the problem was gone.
I know it is illogical, but I am recording it here for documentation purposes...
I was having an issue recursively loading resources in my Spring app, and found that the issue was that I should be using resource.getInputStream(). Here's an example showing how to recursively read in all files under config/myfiles that are JSON files.
Example.java
private String myFilesResourceUrl = "config/myfiles/**/";
private String myFilesResourceExtension = "json";
ResourceLoader rl = new ResourceLoader();
// Recursively get resources that match.
// Big note: if you decide to iterate over these,
// use resource.getInputStream() to load the contents,
// or use the readResource method of the ResourceLoader class below.
Resource[] resources = rl.getResourcesInResourceFolder(myFilesResourceUrl, myFilesResourceExtension);
// Recursively get resource and their contents that match.
// This loads all the files into memory, so maybe use the same approach
// as this method, if need be.
Map<Resource,String> contents = rl.getResourceContentsInResourceFolder(myFilesResourceUrl, myFilesResourceExtension);
ResourceLoader.java
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.Charset;
import java.util.HashMap;
import java.util.Map;
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;
import org.springframework.core.io.support.ResourcePatternResolver;
import org.springframework.util.StreamUtils;
public class ResourceLoader {
public Resource[] getResourcesInResourceFolder(String folder, String extension) {
ResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();
try {
String resourceUrl = folder + "/*." + extension;
Resource[] resources = resolver.getResources(resourceUrl);
return resources;
} catch (IOException e) {
throw new RuntimeException(e);
}
}
public String readResource(Resource resource) throws IOException {
try (InputStream stream = resource.getInputStream()) {
return StreamUtils.copyToString(stream, Charset.defaultCharset());
}
}
public Map<Resource, String> getResourceContentsInResourceFolder(
String folder, String extension) {
Resource[] resources = getResourcesInResourceFolder(folder, extension);
HashMap<Resource, String> result = new HashMap<>();
for (var resource : resources) {
try {
String contents = readResource(resource);
result.put(resource, contents);
} catch (IOException e) {
throw new RuntimeException("Could not load resource=" + resource + ", e=" + e);
}
}
return result;
}
}
For kotlin users, I solved it like this:
val url = ResourceUtils.getURL("classpath:$fileName")
val response = url.openStream().bufferedReader().readText()
The answer by @sbk is the way we should do it in a Spring Boot environment (apart from @Value("${classpath*:}")), in my opinion. But in my scenario it was not working when executed from a standalone jar; maybe I did something wrong.
But this can be another way of doing this,
InputStream is = this.getClass().getClassLoader().getResourceAsStream(<relative path of the resource from resource directory>);
I was having a more complex issue because I have more than one file with the same name: one is in the main Spring Boot jar and the others are in jars inside the main fat jar.
My solution was to get all the resources with the same name and then pick the one I needed by filtering on the package name.
To get all the files:
ResourceLoader resourceLoader = new FileSystemResourceLoader();
final Enumeration<URL> systemResources = resourceLoader.getClassLoader().getResources(fileNameWithoutExt + FILE_EXT);
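Continuing the snippet above, a sketch along these lines can then pick the right copy by filtering the returned URLs (the com/example/ fragment is just a placeholder for the package or jar you are interested in):

URL selected = null;
while (systemResources.hasMoreElements()) {
    URL candidate = systemResources.nextElement();
    // Each URL includes the containing jar and path,
    // e.g. jar:file:/.../some-lib.jar!/com/example/myfile.ext,
    // so a substring check is enough to disambiguate.
    if (candidate.toString().contains("com/example/")) {
        selected = candidate;
        break;
    }
}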
In Spring Boot 1.5.22.RELEASE with jar packaging, this worked for me:
InputStream resource = new ClassPathResource("example.pdf").getInputStream();
"example.pdf" is in src/main/resources.
And then to read it as byte[]
FileCopyUtils.copyToByteArray(resource);
I had the same issue and ended up using the much more convenient Guava Resources class:
Resources.getResource("my.file")
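To go one step further and read the contents, Guava's com.google.common.io.Resources class also has helpers for that (the charset here is just an example):

URL url = Resources.getResource("my.file");
String asText = Resources.toString(url, StandardCharsets.UTF_8);
byte[] asBytes = Resources.toByteArray(url);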
While this is a very old thread, I also faced the same issue while adding FCM to a Spring Boot application.
In development the file opened without errors, but when I deployed the application to AWS Elastic Beanstalk, a FileNotFoundException was thrown and FCM was not working.
So here's my solution to get it working both in the development environment and in a production jar deployment.
I have a component class FCMService with a method as follows:
@PostConstruct
public void initialize() {
log.info("Starting FCM Service");
InputStream inputStream;
try {
ClassPathResource resource = new ClassPathResource("fcm/my_project_firebase_config.json");
URL url = null;
try {
url = resource.getURL();
} catch (IOException e) {
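// Ignored on purpose: if the resource cannot be resolved to a URL, fall back to the file lookup below.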
}
if (url != null) {
inputStream = url.openStream();
} else {
File file = ResourceUtils.getFile("classpath:fcm/my_project_firebase_config.json");
inputStream = new FileInputStream(file);
}
FirebaseOptions options = FirebaseOptions.builder().setCredentials(GoogleCredentials.fromStream(inputStream))
.build();
FirebaseApp.initializeApp(options);
log.info("FCM Service started");
} catch (IOException e) {
log.error("Error starting FCM Service");
e.printStackTrace();
}
}
Hope this helps someone looking for a quick fix with implementing FCM.
This can be handled like:
var serviceAccount = ClassLoader.getSystemResourceAsStream(FB_CONFIG_FILE_NAME);
FirebaseOptions options = new FirebaseOptions.Builder()
.setCredentials(GoogleCredentials.fromStream(serviceAccount))
.build();
Where FB_CONFIG_FILE_NAME is the name of the file in your 'resources' folder.