I added API request/response files in the target folder (following the existing project structure) and didn't realize that running maven clean was going to delete them. Is there a way to recover files deleted from the target folder after a maven clean? I spent a long time working on them and don't want to have to remake everything. For reference, this was done in IntelliJ.
No, not unless you have some sort of disk backup running on your computer.
I am trying to maintain my changes to config files in resource.bundle directories that come from remote CocoaPods repositories.
While working on the implementation I am able to make changes locally, but I do not own the external repository. I would like to be able to reference the code owners' tags when pulling the pods from their repos into my project, while maintaining my configuration changes.
It has been suggested to me to create a script phase in my build process that would copy files from an "assets" folder within the project to the finished pod directory after the remote pull and build.
This sounds feasible, but I am not sure where to start in this process or what the script would look like.
Essentially, I would have a
root/assetsfolder/resource.bundle
that would need to be copied to
Pods/ExternalPodName/Core/resource.bundle
Any help would be appreciated.
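For what it's worth, a minimal sketch of the kind of script phase that was suggested, assuming the paths above and that the standard Xcode/CocoaPods build settings ${SRCROOT} and ${PODS_ROOT} are available in the build phase ("ExternalPodName" is a placeholder):

# Run Script build phase: after the pods have been pulled and built,
# overwrite the pod's bundle with the locally maintained copy.
# Paths and the pod name are assumptions based on the layout above.
rm -rf "${PODS_ROOT}/ExternalPodName/Core/resource.bundle"
cp -R "${SRCROOT}/assetsfolder/resource.bundle" "${PODS_ROOT}/ExternalPodName/Core/resource.bundle"

Adding this as the last Run Script build phase of the app target means it runs on every build, so the copy is reapplied even after the pods are re-fetched.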
I want to store some additional files in the JAR that gets created. Those files are in a directory that is a subdirectory of a repository which is pulled in via a git submodule.
I want to copy that submodule to my src resources directory before compiling, but I also want to make sure that any old files at that location are removed first.
How can that be achieved best with Maven plugins? I did not find any option to remove any destination files with the copy-resources goal of the maven-resources-plugin and I could not get the maven-clean-plugin to run right before the copy-resources either. So how does one accomplish such a trivial task with Maven?
UPDATE: as mentioned above, the reason why I want to do this is because what is copied should become part of what gets added to the resulting jar (and could potentially be part of what gets compiled). So I need to copy these files into the src directory and NOT the target directory. What should get copied before each build is the input to the build, not an additional output.
There is one flaw in your approach, and it probably explains most of the obstructions you encountered.
During a build, the only directory in which you may write is target. Copying files to src or changing them is strictly discouraged.
The target folder is erased by clean, so there is no need to tidy up yourself or to manage stale files.
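For example, a minimal sketch of how the submodule files could be copied into target (and hence into the JAR) with the maven-resources-plugin; the submodule path here is an assumption:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-resources-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-submodule-resources</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>copy-resources</goal>
      </goals>
      <configuration>
        <!-- lands in target/classes, so it is packaged into the JAR -->
        <outputDirectory>${project.build.outputDirectory}</outputDirectory>
        <resources>
          <resource>
            <!-- hypothetical path to the git submodule's subdirectory -->
            <directory>${project.basedir}/my-submodule/files</directory>
          </resource>
        </resources>
      </configuration>
    </execution>
  </executions>
</plugin>

Because the copy lands under target, clean removes it automatically, which solves the stale-files problem without any extra clean step.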
I have a TeamCity installation running on a Windows server.
I have a build process that builds a .NET application. As part of the build process it runs a gulp task and installs various node modules to build all the Sass and JS files.
Within my TeamCity solution I have my clean-up rules for "Everything" set to anything older than the 4th successful build.
However, TeamCity does not appear to be cleaning everything up, because my artifacts folder is massive (and caused the server to run out of disk space).
I believe the issue is one of the gulp tasks and all the node_modules being installed, because of how deeply node nests everything it installs.
On Windows machines, in most instances you cannot just delete a node_modules folder, as you get the "path longer than 256 characters" warning, and you have to use the robocopy trick of mirroring an empty folder over it to delete the node_modules folder, as sketched below.
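For reference, the robocopy trick mentioned above is roughly the following (folder names are placeholders):

rem mirror an empty folder over node_modules, which deletes its contents
rem even when the paths are longer than the 256-character limit
mkdir empty
robocopy empty node_modules /MIR > nul
rmdir node_modules
rmdir empty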
What have others done to resolve this problem of artifacts not being cleaned up correctly?
I was thinking about adding another step that would delete the node_modules folder from the artifact folder after the step that processes the gulp tasks, but I cannot see one in the list of TeamCity parameters that points to the artifact folder for the current build.
If the builds should be deleted by the server clean-up but you still see them in the TeamCity UI and they are not pinned, the most common reason is other builds having snapshot or artifact dependencies on the builds in question. Check the build's Dependencies tab and the related TeamCity settings.
I am trying to use caching in GitLab Runner, which builds a Maven Java project. Currently GitLab Runner only allows caching specific paths defined in the cache: clause of the GitLab YAML file. When Maven builds a project, it generates everything inside the target/ folder, which consists of untracked files in git, so I can simply use the untracked: true option to cache everything under the target/ folder. The purpose of caching is to skip compiling files that have already been compiled by Maven under the target/ folder.
However, this cache amounts to about 6 GB, which is completely unreasonable given the time required to create and restore such a giant cache. It caches all the jar and war artifacts built while compiling the multi-module Maven project, yet Maven only needs the .class files to check for changes requiring re-compilation.
So if there were some way to cache only *.class files and make them available in subsequent builds, Maven could check the .class files and skip re-compiling unchanged files, and the cache size would also be pretty small. Currently gitlab-runner only allows specifying absolute paths for caching. It does not support regex patterns for paths such as \.class$ (which would have been very useful).
Is there any way I can cache only specific file types using the GitLab Runner YAML settings?
So based on cascaval's comment, I was able to figure out a solution.
At the end of the Maven build I run a command to clean all the build artifacts created by Maven that are not used for checking the stale status of .java resources. Here is what I wrote:
cd ./projects/directory
# remove everything under target/ except the classes and maven-* folders
find . | grep --perl-regexp --regexp='\/target\/(?!classes|maven)' | xargs rm --recursive --force
This keeps all the .class files in the target/classes folder (including the folder structure) as well as the files in the maven-status folder, which are probably used by Maven to check file status for recompilation.
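Put together, the relevant part of the .gitlab-ci.yml might look like the sketch below (the job name and Maven goal are assumptions):

build:
  script:
    - mvn compile
    # prune everything under target/ except classes and maven-* folders
    - cd ./projects/directory
    - find . | grep --perl-regexp --regexp='\/target\/(?!classes|maven)' | xargs rm --recursive --force
  cache:
    untracked: true

Since the pruning runs before the cache is saved, untracked: true now only picks up the slimmed-down target/ contents.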
I need to make some changes to Qt 4.7.1, so I need to add it to my subversion server to track my changes. However, once it's added, the configure script fails. I'm guessing it's choking on the .svn files.
I'm using Windows. Is there any way to add Qt to subversion, delete all the .svn folders, configure and build it, recreate the .svn folders, and then submit my changes?
Or is there any other workaround? The error I get is 'Couldn't update default mkspec'.
Here is what I would have done:
1. Install Qt in some folder.
2. Make sure that auto-props and global-ignores are set up properly.
3. Rename the whole folder.
4. Create an empty repository.
5. Create an empty folder having the same name as the original one.
6. Import the empty folder into the repository.
7. Remove the folder.
8. Check out the folder.
9. Copy the contents of the backup to the working copy.
10. Carefully add everything you want to be source controlled, probably using the -N or --depth options.
11. Put everything else into appropriate svn:ignore properties.
12. Commit.
13. Compare the working copy and the backup.
14. If there are any differences, wipe both the working directory and the repository, then repeat from step 2, correcting the mistakes.
It may seem a bit of overkill, but importing such a large project into an SVN repository isn't a trivial task.
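In terms of concrete commands on Windows, the steps above come down to something like this sketch (the repository path and folder names are placeholders):

rem rename the original installation out of the way (steps 1-3)
ren Qt-4.7.1 Qt-4.7.1.bak
rem create the repository and import an empty folder (steps 4-7)
svnadmin create C:\repos\qt
mkdir Qt-4.7.1
svn import Qt-4.7.1 file:///C:/repos/qt/Qt-4.7.1 -m "Empty skeleton"
rmdir Qt-4.7.1
rem check out a working copy and restore the backed-up contents (steps 8-9)
svn checkout file:///C:/repos/qt/Qt-4.7.1 Qt-4.7.1
xcopy /E /H Qt-4.7.1.bak Qt-4.7.1
rem add only what should be version controlled, then commit (steps 10-12)
svn add --depth=empty Qt-4.7.1\src
svn commit -m "Import Qt 4.7.1 sources" Qt-4.7.1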
The problem is that if .svn folders exist in /mkspecs/default and /mkspecs/win32-msvc2008, then configure fails to run with the error 'Couldn't update default mkspec'.
If I move the .svn folders out, run configure, and then put them back, I can build.