So I have a bunch of icons in a folder structure under project resources
- src/main/resources/META-INF/resources/
  - icons/
    - icon-set-alpha/
      - a1.svg
      - a2.svg
      - ...
    - icon-set-beta/
      - b1.svg
      - b2.svg
      - ...
    - ...
I now want to write a task that, when copying those resources, takes whatever is there, bundles it (not as a zip), and copies only the bundled versions into the build directory.
I.e. the build result should look like:
- build/resources/main/META-INF/resources/
  - icons/
    - icon-set-alpha.svg
    - icon-set-beta.svg
I can of course write a task that, after copying the folders to the build directory, goes through all of them, creates the bundled versions there, and deletes the folders.
Is there a way to do it on-the-fly, though?
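For reference, the after-the-fact approach described above might look roughly like this (a sketch only; bundleSvgs is a hypothetical helper standing in for whatever merging logic is used, e.g. building an SVG sprite, and is not a real API):

processResources {
    // don't copy the raw per-icon files at all
    exclude 'META-INF/resources/icons/**'
    doLast { t ->
        def srcIcons = project.file('src/main/resources/META-INF/resources/icons')
        def outIcons = new File(t.destinationDir, 'META-INF/resources/icons')
        outIcons.mkdirs()
        // write one bundled SVG per icon-set directory into the build output
        srcIcons.eachDir { iconSet ->
            bundleSvgs(iconSet, new File(outIcons, "${iconSet.name}.svg"))
        }
    }
}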
Related
I have a CMake project using Makefiles on Windows, with a folder structure that looks like this (the build takes place in build):
project
|- build
|   |- ...
|- otherfolder
    |- stuff
    |- more stuff
As a build step (pre- or post doesn't matter), I want to make a copy of project into build (excluding the build folder), like so:
project
|- build
|   |- ...
|   |- project
|       |- otherfolder
|           |- stuff
|           |- more stuff
|- otherfolder
    |- stuff
    |- more stuff
Other options might be acceptable as well, e.g. copying to a temporary directory outside the project root before moving it into place, but CMake seemingly has no builtin support for generating temporary directories.
Things I've tried: xcopy has support for excluding certain files and directories, but refuses to copy even if I explicitly exclude the build folder. cmake -E copy_directory does not (from what I'm able to find) support excluding certain directories.
CMake's file(COPY ... PATTERN build EXCLUDE) copies successfully, but it runs at CMake configure time, and I haven't been able to find a way to make it run at build time.
I might resort to using Python and shutil, but it would be nice if it could be done without additional dependencies, so I'd prefer a batch file solution.
There are several ways to do selective directory copying.
You can use cmake -P to execute a CMake script at any time. E.g.:
copy_to_build.cmake:
file(COPY . DESTINATION build PATTERN build EXCLUDE)
CMakeLists.txt:
add_custom_command(... COMMAND cmake -P copy_to_build.cmake)
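For instance, wired into a custom target (a sketch; the target name and WORKING_DIRECTORY are illustrative, and ${CMAKE_COMMAND} avoids relying on cmake being on the PATH):

add_custom_target(copy_to_build
    COMMAND ${CMAKE_COMMAND} -P ${CMAKE_CURRENT_SOURCE_DIR}/copy_to_build.cmake
    # run from the project root so the relative paths in the script resolve correctly
    WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
    COMMENT "Copying the source tree into build/")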
You can also prepare a list of subdirectories (and files) at the configuration stage, and then copy every element of that list using xcopy. This approach relies on the fact that everything outside the build directory does not change. Here the iteration is done at configuration time (by CMake). I am not sure whether a "for" loop works under COMMAND of add_custom_command; if it does, you could use it to iterate over the entries in the shell instead.
CMakeLists.txt:
# List of entries in the source directory.
file(GLOB entries RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} ${CMAKE_CURRENT_SOURCE_DIR}/*)
# List of commands to pass to `add_custom_command` as is.
# The `COMMAND` keyword is included in the list.
set(copy_commands)
foreach(entry ${entries})
  if(NOT entry STREQUAL "build") # don't copy the build directory into itself
    list(APPEND copy_commands COMMAND xcopy /s /i
        ${CMAKE_CURRENT_SOURCE_DIR}/${entry} ${CMAKE_CURRENT_SOURCE_DIR}/build/${entry})
  endif()
endforeach()
add_custom_command(... ${copy_commands})
I have a solution containing multiple projects. I can use project variables within Build Events, for example $(TargetDir) to access paths and files of the current project.
Solution 'MyApp'
  MyApp
  MyApp.Core
  MyApp.Setup
I want to add a Pre-Build event in MyApp.Setup. Its $(TargetDir) would return D:\MyApp\MyApp.Setup\bin\Debug, but I want to gather all files from MyApp's output directory and put them into the setup directory.
So is it possible to access variables from other projects within build events?
Something like that:
copy "$(MyApp.TargetDir)\*.*" "$(ProjectDir)\externals"
I suggest you build all projects of the solution into one common folder, with project-specific subfolders inside it. Right now your projects' TargetDir variables have the values:
MyApp - D:\MyApp\MyApp\bin\Debug
MyApp.Core - D:\MyApp\MyApp.Core\bin\Debug
MyApp.Setup - D:\MyApp\MyApp.Setup\bin\Debug
But you can set up a common target directory, so that the TargetDir values become:
MyApp - D:\MyApp\Debug\MyApp\bin
MyApp.Core - D:\MyApp\Debug\MyApp.Core\bin
MyApp.Setup - D:\MyApp\Debug\MyApp.Setup\bin
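One way to get such a layout (a sketch; the property group and path are illustrative and depend on your actual solution layout) is to override OutputPath in each project file, e.g. in MyApp.csproj:

<!-- MyApp.csproj (illustrative): redirect the build output to the common folder -->
<PropertyGroup Condition=" '$(Configuration)' == 'Debug' ">
  <OutputPath>..\Debug\MyApp\bin\</OutputPath>
</PropertyGroup>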
And in the Pre-Build event you can refer to it via a relative path:
copy "$(TargetDir)..\..\MyApp\bin\*.*" "$(ProjectDir)externals"
I'm working on a Gradle multi-project build (Java), where some of the sub-projects are applications (they have a main class and are executable) and others are libraries (e.g., they end up being packaged as a jar that other sub-projects declare a dependency on). It's working fine.
I'd like to be able to package (tar) the entire project for production, according to some structure that will make it easy for my users to deploy and use later on.
Currently, the distTar task creates a build/distributions/project-name.tar for each application project, and a build/libs/project-name.jar is produced for each non-application project, under each project's build directory. That's a step in the right direction, but I'd like to consolidate the contents into one thing I can distribute.
As an example, right now after running distTar:
myapp/
  README
  docs/
  services/
    service1/
      build/
        libs/service1.jar
        <other build dirs I don't want to distribute>
    service2/
      build/
        distributions/service2.tar
        <other build dirs I don't want to distribute>
    service3/
      build/
        distributions/service3.tar
        <other build dirs I don't want to distribute>
and the contents of service2.tar are:
service2/lib/service2.jar
service2/lib/some-service2-dependency.jar
service2/bin/service2 (start script)
service2/config.yml
(and similarly for service3.tar).
I'd like my final result to be a single myapp.tar(.gz) that includes a similar directory structure, but only with the production files:
README
docs/
services/
  service1/
    lib/service1.jar
  service2/
    lib/service2.jar
    lib/some-service-dependency.jar
    bin/service2 (start script)
    config.yml
  service3/
    lib/service3.jar
    lib/some-service-dependency.jar
    bin/service3 (start script)
    config.yml
I'm not sure what the best way to achieve this is. Do I create a parent-level task that depends on distTar and copies files around, untars things, etc.? (I tried that unsuccessfully.)
Many thanks!
UPDATE:
I started doing something along these lines:
distributions {
    main {
        contents {
            into('scripts') {
                from {'scripts/sbin'}
            }
            into('service1') {
                from {tarTree(tarFileForProject("service1"))}
            }
            into('service2') {
                from {tarTree(tarFileForProject("service2"))}
            }
            into ...
        }
    }
}
distTar.dependsOn([subprojects.clean, subprojects.distTar])
(where tarFileForProject is a simple function that returns the path to the build/distributions/*.tar file of the given subproject name).
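A sketch of what such a helper might look like, assuming the services/<name>/build/distributions layout from the tree above:

def tarFileForProject(String name) {
    // e.g. services/service2/build/distributions/service2.tar
    return file("services/$name/build/distributions/${name}.tar")
}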
It seems to work, but also seems ugly. I wonder if there's a cleaner way to do this.
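One small cleanup, sketched here and reusing the tarFileForProject helper, would be to drive the repeated into blocks from a list of service names instead of spelling each one out:

distributions {
    main {
        contents {
            into('scripts') {
                from 'scripts/sbin'
            }
            // one into(...) block per service, generated from the list
            ['service1', 'service2', 'service3'].each { name ->
                into(name) {
                    from { tarTree(tarFileForProject(name)) }
                }
            }
        }
    }
}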
My directory structure is as follows:
src/
- main/
  - java/
    - com/
      - resources/ // folder where I keep my properties files
      - class1.java
      - class2.java
When I try to build the JAR using Maven, the resources folder gets skipped and I am left with only this structure:
src/
- main/
  - java/
    - com/
      - class1.class
      - class2.class
I know that Maven uses a conventional directory structure for resources, src/main/resources. But I want my resources folder to end up at the same level where my class files are generated.
Can anyone please help me in this regard.
Thanks in advance.
The key is what you said - you want your resources to be at the same level where the class files are being generated. That doesn't mean they have to start in the same folder. They must simply have the same package structure. You may do this following the normal Maven file layout conventions as follows.
- src
  - main
    - java
      - com
        - mycompany
          - Class1.java
          - Class2.java
    - resources
      - com
        - mycompany
          - someResources.properties
          - anotherResource.jpg
Note that the directory/package structure under /src/main/java and /src/main/resources is the same. When the maven-compiler-plugin and maven-resources-plugin have finished running, the result will be
- target
  - classes
    - com
      - mycompany
        - Class1.class
        - Class2.class
        - someResources.properties
        - anotherResource.jpg
which is what you want.
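If you really do want to keep the .properties files next to the Java sources instead (as in the original tree), one option, sketched here rather than recommended, is to declare src/main/java as an additional resource root in the pom.xml and restrict it to non-Java files:

<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
    </resource>
    <resource>
      <!-- also pick up resource files that live next to the sources -->
      <directory>src/main/java</directory>
      <includes>
        <include>**/*.properties</include>
      </includes>
    </resource>
  </resources>
</build>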
I'm having trouble with a directory dependency in a parallel build in SCons.
Consider two projects with a single SConstruct in the following (simplified) hierarchy:
- SConstruct
- project1
  - src
- project2
  - src
- build
  - project1
  - project2
- dist
  - project1
  - project2
Each of project1 and project2 is supposed to be built under the relevant build directory (using variant dir), and several targets need to be installed under the relevant dist directory.
Project 2 depends on Project 1's dist. I've stated this dependency explicitly using the Depends() statement, like so:
Depends('project2', 'dist/project1')
When I use a non-parallel build, there's no problem: Project 1 is fully built, targets are installed in the dist directory, and only then is Project 2 built. However, when I use multiple jobs (4), Project 2 is built at the same time as the Install() builder is running for the files that need to go into Project 1's dist directory.
So, my questions are:
Does the Depends(project2, dist/project1) statement refer to the creation of the dist/project1 directory, or to the completion of building all of that directory's children?
How should I solve this issue?
Thank you very much,
BugoK.
Instead of specifying the actual directories as strings in the Depends() call, try specifying the actual targets as returned by the SCons builders for project1 and project2. Every SCons builder (or at least most of them) returns the affected targets as objects, and it's better to use those objects instead of the file/directory name: if you don't use the exact same file/directory path, it won't be considered the same target.
Here is an example, fill in content accordingly:
project2Target = Install(...)
# I'm not sure how you're building project1, so replace this builder accordingly
project1Target = Proj1DistBuilder(...)
Depends(project2Target, project1Target)
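A slightly fuller sketch of the same idea (the SConscript paths, and the assumption that each SConscript Return()s the nodes it builds, are illustrative rather than taken from the actual build):

# SConstruct
env = Environment()

# Each SConscript is assumed to Return() the nodes it builds.
project1_nodes = SConscript('project1/src/SConscript', variant_dir='build/project1', duplicate=0)
project2_nodes = SConscript('project2/src/SConscript', variant_dir='build/project2', duplicate=0)

# Install() returns the installed file nodes; depending on those nodes (rather
# than on the directory name) makes project2 wait until the files are actually
# in place, even with -j4.
project1_dist = env.Install('dist/project1', project1_nodes)
env.Depends(project2_nodes, project1_dist)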