We have a VSTS build setup. Currently we have a single repository hosting multiple services, with a build definition per service, each triggered only when that service is touched, via a trigger path pattern.
Now the issue is that, because every build definition shares the single repository, the Get Sources step downloads the whole repo for each build, and we also do a clean.
I have been searching for something like the trigger pattern that would let us download only part of the repository, to reduce build/download time.
A workaround might be to skip the clean each time, or to split into multiple repositories. At the moment we would like to avoid the latter.
Let me hear if anyone knows of a good solution.
There is no way to specify the files to be downloaded in the Get Sources step; VSTS will always download the whole repo.
The workaround is to set the Clean option to false, so that only modified and newly added files are updated in the Get Sources step.
When a private agent build starts in VSTS, it gets assigned a directory, e.g. C:\vstsagent\_work\1\s
Is there a way to set this to a different path? On other CI servers, like Jenkins, I can define a custom workspace for a job. I'm dealing with a huge monorepo and have dozens of build definitions around the same repository. It makes sense (to me anyway) to share a single directory on the build agent computer.
The benefit to me is that my builds can use pre-built components from upstream repositories, if they have already been built.
Thanks for any help
VSTS build always creates a working directory per build definition. This leaves you two options:
Create a single build definition and use conditions on steps to skip certain steps, so that only what is needed runs. This lets you keep the standard steps, and may require a PowerShell script to figure out which steps to run and which to skip. Set variables from PowerShell using the special logging commands, as sketched below.
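A minimal sketch of such a script, assuming a hypothetical ServiceA folder and a one-commit diff range; the ##vso logging command is the VSTS mechanism for setting variables from a script:

    # Determine whether files under ServiceA/ changed (folder name is illustrative).
    $changed = git diff --name-only HEAD~1 HEAD
    $affected = [bool]($changed | Where-Object { $_ -like 'ServiceA/*' })

    # VSTS logging command: exposes the value to later steps, which can then
    # use a custom condition such as eq(variables['BuildServiceA'], 'True').
    Write-Host "##vso[task.setvariable variable=BuildServiceA]$affected"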
Disable the Get Sources step and add a step that fetches sources manually. You'll need to clean the working directory and check out the right commit, essentially replicating the actions of the Get Sources step by hand. It may require some fiddling to get the behavior right for normal builds, pull request builds, etc., but that way you take full control over the location where sources are checked out.
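A minimal sketch of such a manual fetch step, assuming a fixed shared path and using the standard BUILD_REPOSITORY_URI and BUILD_SOURCEVERSION build variables (authentication handling omitted):

    $workspace = 'C:\shared\monorepo'   # assumed path shared by all definitions

    if (-not (Test-Path (Join-Path $workspace '.git'))) {
        git clone $env:BUILD_REPOSITORY_URI $workspace   # one-time initial clone
    }
    git -C $workspace fetch origin
    git -C $workspace checkout --force $env:BUILD_SOURCEVERSION
    # git -C $workspace clean -fdx   # omit to keep pre-built outputs in place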
I'd also recommend investigating the Visual Studio 2017 project format, which uses the new <PackageReference> element in the project files to fetch packages. The new system supports configuring a version range, which can always fetch the latest available version of a package. It's a better long-term solution.
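For illustration, a floating package reference in a 2017-format project file might look like this (the package name is hypothetical):

    <!-- "1.*" resolves to the latest available 1.x version at restore time -->
    <ItemGroup>
      <PackageReference Include="MyCompany.SharedComponents" Version="1.*" />
    </ItemGroup>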
No, this isn't available in the VSTS build system.
You can change the agent's working directory (C:\vstsagent\_work) by re-configuring the agent and specifying another work folder, but it won't use the same source folder for different build definitions; the subfolders will still be numbered 1, 2, 3, and so on.
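For example, re-configuring a 2.x agent with a different work folder could look roughly like this, run from the agent's root folder (the drive path is an example; check config.cmd --help for the exact options of your agent version):

    .\config.cmd remove
    .\config.cmd --work D:\agent_work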
I have a project in TeamCity with a build configuration for the master release branch, which is compiled every time a new version of our product is released.
In order to pinpoint when errors were introduced, I need a long retention time for some artifacts of this build configuration. Since some other artifacts are rather big (full CD installation packages), my server's hard drive fills up quickly if I simply increase the cleanup interval of this configuration.
Is it possible to configure two different cleanup intervals somehow? I would love a long retention time for the really important artifacts, while throwing the big ones away early.
I currently use TeamCity 9.0.3
Let's say, for example, that my project has two artifacts:
smallupdatepack.zip (32 MB)
reallybigupdatecd.iso (700 MB)
I would like to configure TeamCity so that the .iso is kept for, e.g., the last 10 builds, but the .zip is kept for the last 150 builds.
What I do not want is a solution where all the .zip files are kept forever while only the .iso files are deleted on an interval, which is all that seemed possible to me using the build configuration's artifact-pattern settings alone.
You can specify custom cleanup rules for projects/targets on the Build History Clean-up page.
In your case, you can have an aggressive cleanup for all builds and a lenient cleanup for the project/target of the master build.
If you edit any of the settings, you can set an individual retention period for artifacts, and you can set up artifact cleanup per target. However, for the same target you cannot set up different cleanup rules for multiple artifacts.
The answer by Biswajit_86 looks like the only thing available for setting special cleanup rules. The configuration-specific settings should override the project settings and give you what you need, but maybe it doesn't work that way. Try it out and see; if not, file a bug/suggestion with JetBrains.
The only other thing I could think of was to create a separate build configuration that only publishes the artifacts that you want to keep longer than your default rule. Give it a snapshot dependency on the configuration that creates the files and check the box to run on the same build agent. That way it doesn't need to rebuild them and can just publish what was already created. Set up a build trigger so that this new configuration runs whenever the other one finishes. Then set the clean up rules for this configuration to the longer retention setting.
Our company currently uses TFS for source control and build server. Most of our projects are written in C/C++, but we also have some .NET projects and wouldn't want to be limited if we need to use other languages in the future.
We'd like to use Git for our source control and we're trying to understand what would be the best choice for a build server. We have started looking into TeamCity, but there are some issues we're having trouble with which will probably be relevant regardless of our choice of build server:
Build dependencies - We'd like to be able to control the build dependencies for each <project, branch>. For example, have <MyProj, feature_branch> depend on <InfraProj1, feature_branch> and <InfraProj2, master>.
From what we’ve seen, to do that we might need to use Gradle or something similar to build our projects instead of plain MSBuild. Is this correct? Are there simpler ways of achieving this?
Local builds - Obviously we'd like to be able to build projects locally as well. This becomes somewhat of a problem when project dependencies are introduced, as we need a way to reference these resources or copy them locally for the build to succeed. How is this usually solved?
I'd appreciate any input, but a sample setup which covers these issues will also be a great help.
IMHO both issues you mention really fall into the configuration management category and are thus, as you say, unrelated to the choice of build server.
A workspace for a project build (doesn't matter if centralized or local) should really contain all necessary resources for the build.
How can you achieve that? Have a project "metadata" git repo with a "content" file listing all your project components and their dependencies (each living in its own git/other repo) and their exact versions, effectively tying them together coherently. (You may find it useful to store other metadata in this file down the road as well, such as component-specific SCM info if you use a mix of SCMs across the workspace.)
A workspace pull wrapper script would first pull this metadata git repo, parse the content file, and then pull all the other project components and their dependencies according to the content file's info, as sketched below. Any build in such a workspace would have all the parts it needs.
When the time comes to modify either the code of a project component or the version of one of its dependencies, you also need to update the content file in the metadata git repo to reflect the change and commit it. This is how your project makes progress coherently, as a whole.
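A minimal sketch of such a pull wrapper, assuming a hypothetical JSON content file that lists each component's repo URL and pinned commit (all names and URLs are illustrative):

    param([string]$Workspace = 'C:\workspace')

    # The metadata repo holds content.json, e.g.:
    #   { "components": [ { "name": "InfraProj1",
    #                       "repo": "https://example/git/InfraProj1",
    #                       "version": "a1b2c3d" } ] }
    git clone https://example/git/project-metadata (Join-Path $Workspace 'metadata')
    $content = Get-Content (Join-Path $Workspace 'metadata\content.json') -Raw |
        ConvertFrom-Json

    foreach ($c in $content.components) {
        $dest = Join-Path $Workspace $c.name
        if (-not (Test-Path $dest)) { git clone $c.repo $dest }
        git -C $dest fetch origin
        git -C $dest checkout --force $c.version   # pin to the recorded version
    }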
Of course, actually managing dependencies is another matter. Tons of opinions out there, some even conflicting.
I have several projects which use code from a large set of component libraries. These libraries are under source control.
The libraries repository contains all the libraries used by all my projects and contains multiple versions of multiple libraries. Each library/version pair lives in its own folder. Each of my projects identifies the specific library/version pairs it needs through the folder paths of the references in its project file.
For example, $(LibraryPath)\SomeLibrary\v1.1.5, as in the project-file snippet below.
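For illustration, such a reference might look like this in the project file (assembly and file names are assumed):

    <!-- Pins this project to one library/version folder in the libraries repo -->
    <Reference Include="SomeLibrary">
      <HintPath>$(LibraryPath)\SomeLibrary\v1.1.5\SomeLibrary.dll</HintPath>
    </Reference>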
Please note that the libraries repository is only ever added to. No changes are made to stuff already in the repository. Ever.
I have of course been able to configure my build plan to pull the libraries repository into a libraries subfolder of the working directory. So far so good. However, with Bamboo's automatic branch management feature, this setup means that the libraries repository is cloned for each and every branch of every project.
Not funny. No, really, not funny...
What I would like to do is:
pull the libraries repository in each build plan
but pull it to a fixed location that is the same for all build plans
it doesn't have to be an absolute path
but it does need to be outside the working directory of the current build plan to avoid unnecessary duplication
Unfortunately the Checkout Directory of the Source Code Checkout configuration task in a Bamboo build plan doesn't allow me to specify either an absolute path or a relative one that goes "up" one or more levels from the working dir. The hint text explicitly states "(Optional) Specify an alternative sub-directory to which the code will be checked out." And indeed, specifying something like ..\Library gets punished with the message "Checkout to parent directory is forbidden".
I have seen information on the "artifact sharing" feature of Bamboo. This will probably work, but it seems like overkill for what I want to achieve.
What would be the easiest and least complicated way to achieve my goal using Atlassian's Bamboo Continuous Integration?
Out-of-the-box alternatives are welcome, but please don't direct me to any products that require intimate CLI use and/or whose documentation assumes (extensive) knowledge of 'nix and/or Java setup. I am on Windows and spoiled rotten by powerful (G)UI's.
I have the same problem - with a repository weighing in at around 2GB.
I'd like to simply "git checkout myBranch" and "git clean -fxd" instead of cloning every time (which should save a lot of time and disk space). However I also like Bamboo's automatic trigger with new branches showing up.
Like the OP, I'd love to be able to put "..\SharedDirectory" in the Checkout Directory for the "Source Code Checkout" task, but it won't let me go above the \JOB_KEY\ folder.
One possible solution is to replace the "Source Code Checkout" task with the two git commands above. That way I can specify exactly when/where/how to do the checkout. There may be problems with the initial checkout in this case, but once that is solved, all subsequent branches would use the same shared folder, with no more pulling down 2 GB every time. A sketch of such a script task follows.
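A minimal sketch of such a script task, assuming a hypothetical shared path and repo URL; bamboo_planRepository_branchName is Bamboo's branch variable as exposed to scripts, but verify the exact name for your setup:

    $shared = 'C:\bamboo\SharedDirectory'

    if (-not (Test-Path (Join-Path $shared '.git'))) {
        git clone https://example/git/bigrepo $shared   # one-time initial clone
    }
    git -C $shared fetch origin
    git -C $shared checkout --force $env:bamboo_planRepository_branchName
    git -C $shared clean -fxd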
I have recently been charged with building out our "software infrastructure" and so I am putting together a continuous integration server.
After a build completes, would it be considered bad form for the CI system to check some of the artifacts it creates into a tag, so that they can be fetched easily later (or, if the build breaks, the problem can be recreated more easily)?
For the record we use SVN and BuildMaster (free edition) here.
This is more of a best practices question rather than a how-to question. (It is pretty easy to do with BuildMaster)
Seth
If you believe this approach would be beneficial to you, go ahead and do it. As long as you maintain a clear trace of what source code was used to build each artifact, you'll be fine.
You should keep this artifact repository separated from the source code repository.
It is, however, a little odd to use a source code repository for this: these are typically used for things that will change, which your artifacts most definitely should not.
Source code repositories are also often used in a context where you want to check out "everything", for example the entire trunk. With artifacts you are typically looking for a specific version, and checking out all of them would only be done if exporting them to some other medium.
There are several artifact repositories specialized for this, for example Artifactory or Apache Archiva, but a properly backed-up file server with thought-through access settings might be a simple and good-enough solution.
I would say it's a smell to check in binaries as a tag. Your build artifacts should be associated with a particular build number in your build system, and that build should be associated with a particular check-in. You should be able to recreate the exact source code from that information. If what you're looking for is a one-stop function to open the precise source-code revision that generated a broken build, I'd suggest investing some time in building a PowerShell module that does that for you.
Something with a signature like:
OpenBuild -projectName "some project name" -buildNumber "some build number"
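A hypothetical sketch of what such a function might do; Get-RecordedRevision stands in for however your build system records the SVN revision per build (BuildMaster keeps that association), and the URL and paths are illustrative:

    function OpenBuild {
        param(
            [string]$ProjectName,
            [string]$BuildNumber
        )
        # Hypothetical helper: look up the SVN revision recorded for this build.
        $revision = Get-RecordedRevision -Project $ProjectName -Build $BuildNumber

        # Check out the exact source that produced the build.
        svn checkout -r $revision "https://svn.example.com/$ProjectName/trunk" `
            "C:\builds\$ProjectName\build-$BuildNumber"
    }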