I just started using CCNet, and in the process of getting my build projects set up I racked up a lot of build history from trial and error. I really don't want to keep that old stuff around, but I can't seem to see where/how to get rid of it. I'm sure this is a silly question, and I apologize if I'm overlooking something that should be obvious. I did RTM and Google for about a half hour, and poked around my CCNet installation, but it's not jumping out at me. I deleted the state files for the projects (don't know if that has anything to do with it), but the old builds are still there if I drill into a project's stats from the dashboard. Any suggestions? Thanks.
Answered: I had explicitly set the artifacts directory to a location that was not under the CCNet server directory and consequently never looked in it again... went looking and, disco, there were the build histories.
Don't forget you can use Artifact Cleanup Publisher to keep your build history from growing to the size of Mars over time.
Assuming you have a project called "Dev" and you've installed CCNet into the default location, you'll have a folder called:
c:\Program Files\CruiseControl.NET\server\Dev
and a Dev.state file in:
c:\Program Files\CruiseControl.NET\server
Just delete both the folder and the state file.
What you're looking for are the "artifacts" folders. Check your ccnet.config file for the <artifactDirectory> tag.
Stop your service, delete the artifact directory folder, and restart your service.
The logs are stored under the artifacts directory, as artifacts\MyProjectName\Build\log*.xml.
The state file stores things like the date, time and other info of the last build.
Best to stop the service, then delete the .state file in ProgFiles\CC.net\server and also delete the artifacts\MyProjectName\Build\log*.xml files.
As mentioned above, use the Artifact Cleanup Publisher to keep the number of artifacts to a sensible level.
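For reference, a minimal sketch of what this looks like in ccnet.config (the project name, the artifact path, and the cleanup attribute names/values here are illustrative; check the CCNet documentation for the exact options of the Artifact Cleanup Publisher):

```xml
<cruisecontrol>
  <project name="MyProjectName">
    <!-- Build logs and history end up under this directory -->
    <artifactDirectory>C:\CCNet\Artifacts\MyProjectName</artifactDirectory>
    <publishers>
      <xmllogger />
      <!-- Assumed attributes; intent is to keep only the last 50 builds -->
      <artifactcleanup cleanUpMethod="KeepLastXBuilds" cleanUpValue="50" />
    </publishers>
  </project>
</cruisecontrol>
```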
If you have a lot of projects and need to do a retrospective cleanup, you could use the following PowerShell script to remove old log files:
$limit = (Get-Date).AddDays(-60)
# Each matching project folder under D:\Builds is assumed to have a Logs subfolder
Get-ChildItem -Path D:\Builds -Filter MatchMyProjects.* | ForEach-Object {
    $logsPath = Join-Path $_.FullName "Logs"
    Write-Host "Removing logs from folder $logsPath"
    Get-ChildItem -Path $logsPath -Force -Filter *.xml |
        Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } |
        Remove-Item -Force
}
Thanks to this answer: Delete files older than 15 days using PowerShell
Could someone help me: How to get latest version of multiple files in TFS?
I'm starting to automate the publishing of a project that changes often, and this is the first problem.
So let's say I have a list of 10 changed files in TFS that are not in the same changeset, and I need some tool/add-in/script that can get the latest version of all those 10 files in one click/enter.
Any way to do that, or any suggestion on where to start searching? Thanks for any help!
You can call the tf.exe command like this:
tf get file1.cs
tf get file2.cs
...
And then do that for every file. If they are all in the same folder you can just specify the folder and all files will be refreshed. You can also specify their common ancestor and add the /recursive option:
tf get ancestorFolder /recursive
Here is the full reference for the tf get command.
Assuming that you know which files you need, the command below can help.
tf get "tfs path to file/folder" /force /recursive
Use /force when you need to overwrite existing files, and /recursive when you want to get all files within a folder.
You can run this inside a DOS FOR loop, something like:
FOR /F ["options"] %%variable IN (<file name containing list of files>) DO tf get "%%variable" /force /recursive
You would need to study the FOR /F ["options"] yourself, as it's difficult to explain everything here.
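As a concrete sketch, assuming a plain-text file called filelist.txt with one TFS server path per line (the file name and the "usebackq delims=" options are one reasonable choice, not the only one):

```batch
REM Read each line of filelist.txt and get the latest version of that path
FOR /F "usebackq delims=" %%f IN ("filelist.txt") DO tf get "%%f" /force /recursive
```

Note that %%f is the batch-file form; at an interactive command prompt you would use %f instead.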
Alternatively, you can use the same logic inside PowerShell. There you would need to load the TFS assemblies and run the required commands, but it gives you more flexibility and control over what you want to do.
See also: Get Latest Version of Folder from TFS, using PowerShell; How to get latest version of a specific project in TFS using PowerShell.
I have an Eclipse setup with m2eclipse and subversive. I have imported a maven2 project from svn. But I get the error message that a whole bunch of artifacts are missing (for instance: Missing artifact org.springframework:spring-test:jar:3.0.1.RELEASE:test).
If I look in my repository I see the jar files there but they have an extra extension .lastUpdated. Why is maven appending .lastUpdated to the jars? And more importantly: how can I fix this?
There is no mention of the type lastUpdated in my POMs.
These files indicate to Maven that it attempted to obtain the archive by download but was unsuccessful. To save bandwidth it will not attempt the download again until a certain time period, encoded in the file, has elapsed. The command-line switch -U forces Maven to perform the update before the retry period expires. This may be necessary if you attempted to build while disconnected from the network.
The method of removing the files works with most versions of maven, but since the files are internal mementos to maven, I would not recommend this method. There is no guarantee that this information is not referenced or held elsewhere and such manipulation can damage the system.
As rperez said, I usually delete all those .lastUpdated files. On Linux I have created a little script to keep it simple:
find . -name '*.lastUpdated' -exec rm -fv {} +
Just create a script file with the previous content and run it from your local Maven repository, which is usually ~/.m2/repository.
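If you want to see exactly what the find command touches before pointing it at your real repository, here is a throwaway sandbox run (the /tmp path and artifact names are made up for illustration, not anything Maven creates for you):

```shell
# Build a throwaway repository layout with one stale marker file
mkdir -p /tmp/m2demo/org/example/lib/1.0
touch /tmp/m2demo/org/example/lib/1.0/lib-1.0.jar
touch /tmp/m2demo/org/example/lib/1.0/lib-1.0.jar.lastUpdated

# Preview what would be removed, then remove it
find /tmp/m2demo -name '*.lastUpdated' -print
find /tmp/m2demo -name '*.lastUpdated' -exec rm -fv {} +
```

The real jar is left alone; only the .lastUpdated markers are deleted.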
I installed Maven2 and ran mvn compile from the command line. This seems to have resolved the problem.
You might have a problem retrieving some of the artifacts from the repository. For example, the Spring Framework has its own repository. This extension is appended when the artifact cannot be fully downloaded. Add the Spring Framework repository to your POM or settings.xml, delete the folder that contains the broken jars, and start again.
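For reference, a repositories block in the POM looks roughly like this (the id and URL below are illustrative assumptions; check Spring's own documentation for the current repository address):

```xml
<repositories>
  <repository>
    <id>spring-releases</id>
    <url>https://repo.spring.io/release</url>
  </repository>
</repositories>
```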
If you hit this problem and you're using Nexus, it might be the case that you have a routing rule defined which is incorrect. I hit this myself: the files it was downloading were correctly named, at the proper URLs it was looking at, but they all had the .lastUpdated extension and an error message as contents.
Open your terminal, navigate to your Eclipse's project directory and run:
mvn install
If mvn install doesn't update your dependencies, then call it with a switch to force update:
mvn install -U
This is a much safer approach than tampering with Maven's internal files by deleting the .lastUpdated entries.
Use this command inside the .m2/repository dir to rename all files:
find . -iname '*.lastUpdated' | while read -r file; do
    renamed=${file%.lastUpdated}
    echo "renaming: $file to $renamed"
    mv -- "$file" "$renamed"
done
This is useful to avoid downloading all the sources again.
This did not work for me... the .jar is lost. :(
What I do when I encounter this issue:
Make sure you have the latest version of the maven-source-plugin:
https://maven.apache.org/plugins/maven-source-plugin/usage.html
$ mvn source:jar install
Now if a *.lastUpdated file exists in your local ~/.m2/repository/your-lib/0.0.1/ directory, just remove it and then run the command above again.
This is a side-effect of a failed download from the repository. To get the actual content you want into your repository, check for correct paths to the repository/repositories in your POM file, and resolve certificate/security issues, if any. It is almost invariably one or the other of these issues.
There is no need to delete the .lastUpdated entries, and doing so won't solve your problem.
I am running jenkins on windows under a domain user with administration rights. Part of my build script delete previous artefacts. Jenkins will successfully build once, after that it fails with
You do not have sufficient access rights to perform this operation.
On the delete artefacts line which looks like this:
Remove-Item -Path $pathArtifacts -Recurse
Jenkins created these artifacts so why can't it delete them? The job config is very simple and just runs a build.bat after svn checkout. The files are local and just on the C drive
-Force was the answer. Simple:
Remove-Item -Path $pathArtifacts -Recurse -Force
Thanks Christopher
I have a number of daily and in-response-to-svn-change builds that all derive from the same configuration template. I can set it so I can choose what branch to look at, what build steps to carry out and of course what triggers the build.
However I would like the daily builds to do a completely clean checkout, whereas for the svn-dependent ones (which obviously happen throughout the day) I am happy for them to simply do an update.
Simply un-setting the option in the template does not let me set them in each of the derived builds. Is there a build parameter that I can use to switch on clean builds for those builds that require it?
In each Project's "Version Control Settings", look under "Checkout setting". There are options to specify the checkout dir and also a checkbox to clean files before a build.
There is an additional build feature (Swabra) you can add that does a clean checkout if required. I have enabled this for our nightly builds but have not yet investigated the consequences.
See here for more detail on Swabra
Seems to me that you need one template for daily and one for nightly builds.
I so disliked the accepted answer that I put a conditional scorch into my build script.
Then you can either set a default in the template and override it in the build that needs the scorching, or define it on an opt-out basis.
Get-ChildItem . -Include obj,bin -Recurse -Force | Remove-Item -Recurse -Force
Remove-Item Artifacts -Recurse -Force
Why answer with such an unsightly hack? I'm hoping someone will chime in with a real solution and [then downvote this or preferably] comment.
There's a way to declare variables in the template and substitute them with configuration parameters in each build project. More details: http://blogs.jetbrains.com/teamcity/2010/10/14/overriding-template-settings/
I use Teamcity to build different packages and want to save those Packages as Artifacts. My Artifact Path in TeamCity is the following:
%system.teamcity.build.workingDir%\**\Release\**\*.wsp => Solution
Now TeamCity correctly collects all WSP files in any Release directory after building, but they are saved including the whole subdirectory tree.
I only want the .wsp file directly under "Solution", without the directory tree.
From the TeamCity docs:
wildcard — to publish files matching an Ant-like wildcard pattern ("*" and "**" wildcards are only supported). The wildcard should represent a path relative to the build checkout directory. The files will be published preserving the structure of the directories matched by the wildcard (directories matched by "static" text will not be created). That is, TeamCity will create directories starting from the first occurrence of the wildcard in the pattern.
http://confluence.jetbrains.net/display/TCD65/Configuring+General+Settings#ConfiguringGeneralSettings-artifactPaths
In your build script (or an additional final build step) you will have to copy the necessary files to a single folder and publish that folder as the artifacts.
Instead of copying as @manojlds suggests, you might be able to achieve something by modifying the OutputPath in your .csproj file, or feeding in an OutDir property override when building a .sln (if you are). Be warned that neither of these approaches is perfect - for example, TeamBuild (the CI server in the Visual Studio ALM tooling) redirects everything into one directory, which can cause a complete mess and only works for the simplest cases.
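If you go the OutDir route, the override is passed on the command line, roughly like this (the solution name and output path are placeholders; note that OutDir traditionally requires the trailing backslash):

```batch
REM Redirect all project outputs into one flat folder
msbuild MySolution.sln /p:Configuration=Release /p:OutDir=C:\Build\Out\
```

You can then point the artifact path at C:\Build\Out\*.wsp without picking up any directory tree.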
I had this issue where I wanted to gather various install files from subdirectories. Adding a PowerShell runner as a build step is quite powerful and solves this nicely...
get-childitem -Recurse -Include *.wsp | Move-Item -destination .
This will move them to the root prior to TeamCity looking at the artifacts, where the basic artifact paths like *.wsp can pick it up for the final output.