Team Services Release Management backup before deployment

I'm wondering if there is a way in Team Services for a release definition to have a step that backs up the application directory before it deploys the updated code.
I currently run a manual process before we deploy an application update: I take the application directory and back it up to a compressed file whose name contains the date of the deployment, which we do to satisfy the backup requirements of our Change Control Board. I would like to automate this process so that it's done the same way each time, even if I'm not the one handling the deployment. I know you can run command line tasks, and I could write a command line application that takes certain parameters, but I would like to know if Team Services already has a task in place that can take care of this.

There are a Copy Files and a Zip Files task which you can drop into your process. You can even make the tasks conditional. There isn't a "backup" task per se.
There is also no "rollback" task; the normal thing to do would be to schedule a new release that installs the previous version, or to fix the issue and trigger a new release with the fixed contents.
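Since the question already mentions command line tasks, the backup step can be a short script. A minimal PowerShell sketch, assuming hypothetical paths and running as a PowerShell task (or PowerShell on Target Machines if the app directory lives on a remote server) before the deployment steps:

```powershell
# Back up the current application directory to a date-stamped zip
# before the new version is deployed. All paths are illustrative.
param(
    [string]$AppDir     = 'C:\inetpub\wwwroot\MyApp',  # directory to back up (hypothetical)
    [string]$BackupRoot = 'D:\Backups\MyApp'           # where the zips go (hypothetical)
)

if (-not (Test-Path $BackupRoot)) {
    New-Item -ItemType Directory -Path $BackupRoot | Out-Null
}

# Date of the deployment goes into the file name, per the CCB requirement.
$stamp   = Get-Date -Format 'yyyy-MM-dd_HHmmss'
$zipPath = Join-Path $BackupRoot "MyApp_$stamp.zip"

# Compress-Archive requires PowerShell 5.0 or later.
Compress-Archive -Path (Join-Path $AppDir '*') -DestinationPath $zipPath

Write-Host "Backed up $AppDir to $zipPath"
```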

Related

CI for hotswapping non-compiled files into VSO and Octopus pipeline

Context:
Currently we manually get a git dev branch built into a package in VSO, and once that package is built it deploys through Octopus and takes down the site for a good amount of time while the built packages are loaded onto each server the site sits on... But, honestly, a lot of the bug fixes end up being in js files that could easily just be hot swapped in, and it's annoying to have to wait for poor ops and support to do all of this just for a tiny change.
The pipedream:
I would like to set up CI that allows hot swapping of js, css, cshtml, and html files through VSO and Octopus... This would allow small changes in JavaScript files to be pushed out fast without full deployments...
...and also have another option that allows me to say, "look, I've changed some C# files, so I need a built package to be pushed up into VSO and Octopus." A manual rebuild, if you will.
Question:
I'm missing the vocabulary to search for this in Google and wondered if you could help me on the path to setting this up.
You can just include the modified files (keeping the folder structure) in the package, then push this package to the Octopus server and deploy with it; the necessary files will be replaced.
My workflow for an IIS deployment:
Create a new Deployment Target in Listening Tentacle mode
Add the Deploy to IIS process template to the Octopus project
Enable the Custom installation directory feature in the Configure Features window of the process
Specify the Custom Install Directory path in the process
Build tasks:
Add a Copy Files task to copy the necessary files (e.g. js)
Add a Package Application task to package these files (keeping the folder structure)
Add a Push Packages to Octopus task
Add a Create Octopus Release task
On the other hand, you can include all published files in the package (tested with a File System publish), then deploy with that package; Octopus can compare files and replace just the modified ones (the {webapp}.dll will be replaced too; I checked the Created and Modified times on the target server).
Regarding getting the changed files during the build, you can call the Get commit with changed items REST API, then create/modify a build variable (e.g. depAll) through a logging command (e.g. in PowerShell: Write-Host "##vso[task.setvariable variable=depAll]Yes") based on the changed-files result, then use this variable in the task conditions (Control Options of each task) to determine which tasks need to run. See Specify conditions for running a task. A sketch follows.
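Purely as an illustration of that flow, here is a hedged PowerShell build step; the account, project, and repository names are placeholders, and the personal access token is assumed to arrive in a secret variable named PAT:

```powershell
# Inspect the changed items of the commit being built and decide whether
# a full package deployment is needed. All names below are placeholders.
$account = 'myaccount'                # hypothetical VSTS account
$project = 'MyProject'                # hypothetical team project
$repo    = 'MyRepo'                   # hypothetical git repository
$commit  = $env:BUILD_SOURCEVERSION   # commit SHA supplied by the build
$pat     = $env:PAT                   # personal access token (secret variable)

$auth    = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
$headers = @{ Authorization = "Basic $auth" }
$url     = "https://$account.visualstudio.com/DefaultCollection/$project" +
           "/_apis/git/repositories/$repo/commits/$commit/changes?api-version=1.0"

$changes = (Invoke-RestMethod -Uri $url -Headers $headers).changes

# If anything other than js/css/cshtml/html changed, ask for a full deployment.
$needsFull = $changes | Where-Object { $_.item.path -notmatch '\.(js|css|cshtml|html)$' }
$value = if ($needsFull) { 'Yes' } else { 'No' }

# Logging command: sets the build variable 'depAll' for the tasks that follow.
Write-Host "##vso[task.setvariable variable=depAll]$value"
```

Later tasks can then use a custom condition such as and(succeeded(), eq(variables['depAll'], 'Yes')) in their Control Options.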

Specific cleanup interval for artifacts in TeamCity

I have a project in TeamCity that has a build configuration for the master release branch. This is compiled every time a new version of our product is released.
In order to be able to pinpoint when errors were introduced, I need a long retention time for some artifacts of this build configuration. As some other artifacts are rather big (full CD installation packages), my server's hard drive fills up quickly if I simply raise the cleanup interval of the whole configuration.
Is it possible to configure two different cleanup intervals somehow? I would love to have a long retention time for the really important artifacts, while throwing the big ones away early.
I currently use TeamCity 9.0.3
Let's say, for example, that my project has two artifacts:
smallupdatepack.zip (32 MB)
reallybigupdatecd.iso (700 MB)
I would like to configure TeamCity in a way that keeps the .iso for, e.g., the last 10 builds, but the .zip for the last 150 builds.
What I do not want is a solution where all the .zip files are kept forever while only the .iso files are deleted at an interval, which is all that seemed possible to me using the build configuration's artifact patterns alone.
You can specify custom cleanup rules for projects/targets on the Build History Clean-up page.
In your case, you can have an aggressive cleanup for all builds and a lenient cleanup for the project/target of the master build.
If you edit any of the settings, you can set an individual retention period for artifacts. You can set up artifact cleanup per target. However, for the same target you cannot set up different cleanup rules for multiple artifacts.
The answer by #Biswajit_86 looks like the only thing available for setting special clean-up rules. I looked at it, and it seems like the configuration-specific settings should override the project settings and give you what you need, but maybe it doesn't work that way. Try it out and see if it works. If not, file a bug/suggestion with JetBrains.
The only other thing I could think of was to create a separate build configuration that only publishes the artifacts that you want to keep longer than your default rule. Give it a snapshot dependency on the configuration that creates the files and check the box to run on the same build agent. That way it doesn't need to rebuild them and can just publish what was already created. Set up a build trigger so that this new configuration runs whenever the other one finishes. Then set the clean up rules for this configuration to the longer retention setting.

How to Gerrit Trigger builds only for a certain part of the project

I'm using Jenkins integrated with git and Gerrit Trigger.
I manage a huge repo that handles a large number of commits, so I'm not interested in running a complete build of the whole project on every push.
The idea here is to build only the specific project .pom where the code changes are.
For example:
-"common" is a subdir with its own .pom
-"persistence" is a subdir with its own .pom. "persistence" depends on common dir.
So if there is a change on persistence dir i have to build only the persistence .pom file because it will build common as a dependencie. On the other hand if someone change code on common, the jenkins do not have the necessity of building persistence. So it will save me time and hw resources.
So the question here is: how could I do this? Does Gerrit Trigger have any support for this?
Update: In selecting the changed files, does the file RegEx option refer to the file path or the file name?
Also, is there a way, using the file RegEx field, to express something like "build if there are changes in common and not in persistence"?
Have you tried using Dynamic Trigger Configuration? You define a filter for which changes should start the job, so you create jobs for each build and then define a proper Gerrit trigger with a filter.
Or, if you want to use only one job for every build, write a script that uses git to determine which files were changed and then selects the right project file, as sketched below.
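A minimal sketch of such a script in PowerShell, assuming the module layout from the question and that mvn is on the PATH (the Maven goals are illustrative):

```powershell
# List the files touched by the commit under test and build only the
# module that owns them. Module names follow the question's layout.
$changed = git diff-tree --no-commit-id --name-only -r HEAD

if ($changed -match '^persistence/') {
    # persistence pulls in common as a dependency, so this covers both
    mvn -f persistence/pom.xml clean install
}
elseif ($changed -match '^common/') {
    mvn -f common/pom.xml clean install
}
```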
Update: try playing with the new build parameter GERRIT_TOPIC.
So push your changes with a proper topic name, e.g.
git push origin HEAD:refs/for/master/topic_name
where topic_name can be common or persistence.
The topic name can be filled in automatically with a pre-push hook in git.
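Git runs pre-push hooks as shell scripts, so the hook itself isn't shown here; as a rough PowerShell stand-in for the same idea, a helper that derives the topic from the changed files and pushes with the refspec shown above (subdirectory names from the question):

```powershell
# Derive a Gerrit topic from the files changed on this branch, then push
# for review. Using 'origin/master' as the base is an assumption.
$changed = git diff --name-only origin/master...HEAD

$topic = if ($changed -match '^persistence/') { 'persistence' } else { 'common' }

git push origin "HEAD:refs/for/master/$topic"
```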

How does one version control the configuration of a TeamCity project?

In my CruiseControl instances, I have version controlled the ccnet.config file.
When I want to update CruiseControl, I run an "update config" job which fetches the config from version control.
In this manner, the very build process of a release is configuration managed.
I am wondering how to achieve these goals effectively under TeamCity.
I try to keep whatever CI I am using as light as possible and put as much of the running of the build as possible into an MSBuild or NAnt script, including running tests, code coverage, etc. (see the sketch after the list below).
The benefits of this are:
The build file is version controlled.
You can run the script in any environment.
Easier to move between CI environments.
Everyone becomes responsible for the build.
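With that split, the CI step itself shrinks to a single call into the version-controlled script. A minimal PowerShell sketch, where Build.proj is a hypothetical MSBuild script that defines the compile/test/coverage targets:

```powershell
# The CI server only invokes the version-controlled build script; what
# "Build" does (compile, run tests, coverage) lives in Build.proj itself.
# msbuild is assumed to be on the PATH.
& msbuild .\Build.proj /t:Build /p:Configuration=Release
if ($LASTEXITCODE -ne 0) { throw "Build failed" }
```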
This was introduced in TeamCity 9. It is also answered in another post:
Version control (e.g. in TFS) build configuration for TeamCity - is it possible?
I've been wanting a way to source control TeamCity config for a long time. I ended up writing a Windows Service which monitors the configuration directory and commits changes to git.
The project is on GitHub: https://github.com/grenade/teamcity-config-monitor
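Purely to illustrate the idea (the real project handles batching, filtering, and error cases), a stripped-down PowerShell sketch of the watch-and-commit loop; the config path is the typical default and an assumption here:

```powershell
# Watch the TeamCity configuration directory and commit each change to git.
$configDir = 'C:\TeamCity\.BuildServer\config'   # typical default, adjust as needed
Set-Location $configDir
if (-not (Test-Path '.git')) { git init | Out-Null }

$watcher = New-Object System.IO.FileSystemWatcher($configDir)
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true

# Only the Changed event is handled here; Created/Deleted/Renamed are
# left out to keep the sketch short.
Register-ObjectEvent $watcher Changed -Action {
    git add -A
    git commit -m "TeamCity config change: $($Event.SourceEventArgs.Name)" | Out-Null
} | Out-Null

# Keep the script alive so the event subscription stays registered.
while ($true) { Start-Sleep -Seconds 10 }
```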
You might try looking at the folders that are backed up prior to an upgrade (or when restoring TeamCity), as those represent the configurations and changes you've made since the initial installation.
http://confluence.jetbrains.net/display/TCD4/TeamCity+Data+Backup
Some of the relevant data is actually in a database (and in fact the documentation advises you to point TeamCity at a real database like MySQL instead of the default embedded database it uses).
You could try checking those into SVN, but you'll want to stop TeamCity for any check-in actions.

Version Control for Hudson Continuous Integration Build Jobs

We have a continuous integration server with over 40 jobs that are constantly changing. I would like to version control the continuous integration build jobs in Hudson so we can roll back changes if we have problems.
Is there a Hudson plugin that will do this, or another existing solution, or should I just keep the config.xml files in SVN?
Hudson Labs has a really great write-up on this: Keeping your configuration and data in Subversion.
This is the first bit of the article:
We all know that keeping important files in version control is critical, as it ensures problematic changes can be reverted and can serve as a backup mechanism as well. Code and resources are often kept in version control, but it can be easy to forget your continuous integration (CI) server itself! If a disk were to die or fall victim to a misplaced rm -rf, you could lose all the history and configuration associated with the jobs your CI server manages.
It’s pretty simple to create a repository, but it isn’t obvious which parts of your $HUDSON_HOME you’ll want to backup. You’ll also want to have some automation so new projects get added to the repository, and deleted ones get removed. Luckily we have a great tool to handle this: Hudson!
We have a Hudson job which runs nightly, performs the appropriate SVN commands, and checks in
You only seem to be interested in the configuration, which is fine; just ignore or filter out the bits about the data and focus on the configuration.
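Reduced to a hedged PowerShell sketch, the gist of that nightly job might look like the following; the paths are placeholders and the article's actual script differs:

```powershell
# Nightly job: check the Hudson job configurations into Subversion.
# Assumes $HUDSON_HOME is already an SVN working copy.
$hudsonHome = 'C:\Hudson'   # hypothetical $HUDSON_HOME
Set-Location $hudsonHome

# Pick up newly created jobs; --force makes svn skip files already versioned.
$configs = (Get-ChildItem -Recurse -Filter 'config.xml').FullName
svn add --force --parents $configs

# The article's job also prunes deleted jobs; omitted here for brevity.
svn commit -m "Nightly check-in of Hudson configuration"
```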
This is one of the more recent threads about using version control with Hudson's configuration on the Hudson users list.
There are no plugins to store configuration in an SCM right now (March 2010), though the backup plugin might do something close to what you want, perhaps with less of a view of 'change' and more of a snapshot at any given time.
The relatively new Job Config History plugin gets part of the way there - it doesn't actually store the configurations in source control, but it does provide history and auditing of changes to jobs.
You could look into the SCM Sync configuration plugin.
It automatically commits all of your Jenkins config changes to SVN; that way you can track configuration errors easily.
https://wiki.jenkins-ci.org/display/JENKINS/SCM+Sync+configuration+plugin
