I've got a website and a console application which work together.
The Website is published to any of ~10 different servers depending on the publish profile. For production, that's MSDeploy but for Dev/Test/UAT/etc... it's a simple UNC File path copy.
I'd like to be able to publish the CLI just as easily (at least to "local" servers). What I want is to click Publish, pick the profile/server, and have it compile and copy the output in the bin folder to \\Server\ApplicationName.
I'm aware I could perform the Copy as a Post-Build step but don't always want to do the copy and definitely don't always want it to the same place.
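For what it's worth, the kind of thing I'm imagining is a conditional MSBuild target added to the CLI's .csproj so the copy only happens when explicitly asked for. This is only a sketch; the DeployServer property and the ApplicationName share are placeholder names:

    <!-- Runs only when a DeployServer property is supplied, e.g.
         msbuild MyCli.csproj /t:DeployCli /p:DeployServer=DevServer01 -->
    <Target Name="DeployCli" DependsOnTargets="Build" Condition="'$(DeployServer)' != ''">
      <ItemGroup>
        <CliOutput Include="$(OutputPath)**\*.*" />
      </ItemGroup>
      <Copy SourceFiles="@(CliOutput)"
            DestinationFiles="@(CliOutput->'\\$(DeployServer)\ApplicationName\%(RecursiveDir)%(Filename)%(Extension)')" />
    </Target>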
The publish options that exist for console/WinForms apps seem to center around 1-click publishing, which isn't what I'm after (I don't want the app phoning home looking for new versions every time; I want to explicitly push a specific version to a specific server).
I'm aware that if the app is running during the attempted publish, there will be problems.
Is this achievable?
Related
I am looking for a tool to manage the collection of binary files (input components) that make up a software release. This is a software product and we have released multiple versions each year for the last 20 years. The details and types of files may vary, but this is something many software teams need to manage.
What's a Software Release made of?
A mixture of files go into our software releases, including:
Windows executables/binaries (40 DLLs and 30+ EXE files).
Scripts used by the installer to create a database
API assemblies for various platforms (.NET, ActiveX, and Java)
Documentation files (HTML, PDF, CHM)
Source code for example applications
The full collected files for a single version of the release are about 90MB. Most are built from source code, but some are 3rd party.
Manual Process
Long ago we managed this manually.
1. When starting each new release, the files used to build the last release would be copied to a new folder on a shared drive.
2. The developers would manually add or update files in this folder (hoping nothing was lost or deleted accidentally).
3. The software installer script would be compiled using the files in this folder to produce a SETUP.EXE (output).
4. Iterate steps 2 and 3 during validation & testing until release.
Automatic Process
Some years ago we adopted CI (building our binaries nightly or on-demand).
We resorted to putting 3rd party binaries under version control since they usually don't change as often.
Then we automated the process of collecting & updating files for a release based on the CI build outputs. Finally we were able to automate the construction of our SETUP.EXE.
Remaining Gaps
Great so far, but this leaves us with two problems:
Rebuilding Assemblies: The CI mostly builds projects when something has changed, but when forced it will re-compile a binary that doesn't have any code change. The output is a fresh build of a binary we've previously tested (hint: should we always trust these are equivalent?).
Latest vs Stable: Mostly our CI machine builds the latest versions of each project. In some cases this is ok, but often we want to release an older tested or stable version. To do this we have separate CI projects for the latest and stable builds; this works but is clumsy.
Thanks for your patience if you've got this far :-)
I Still Haven't Found What I'm Looking For
After some time searching for solutions it seems it might be easier to build our own solution, but surely someone else has solved these problems before!?
What we want is a way to store and manage binary files (either outputs from CI, or 3rd party files) such that each is tagged with a version (v1.2.3.4) that allows:
The CI to publish new versions of each binary (but reject rebuilt versions that already exist).
The development team to make a recipe for a software release (kinda like NuGet packages.config; a sketch follows after this list) that specifies components to include:
package name
version
path/destination in the release folder
The automatic packaging script to use the recipe to collect the required files and compile the install package (e.g. SETUP.EXE).
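To make the "recipe" idea concrete, something along these lines is what we have in mind (a purely hypothetical format; the names, versions and paths are just examples):

    <!-- hypothetical release recipe, in the spirit of NuGet's packages.config -->
    <releaseComponents>
      <component name="Core.Engine" version="1.2.3.4" destination="bin\" />
      <component name="Api.DotNet"  version="1.2.3.0" destination="sdk\dotnet\" />
      <component name="UserGuide"   version="1.2.0.0" destination="docs\" />
    </releaseComponents>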
I am aware of past debates about storing binaries in a VCS, but that approach does not appear ideal for long-term ongoing use (e.g. how do you prune old binaries?), amongst other issues, so for now I am looking for a better solution.
I have tried some artifact repositories currently available. From my investigation these provide a solution for component/artifact storage and version control. However they do not provide tools for managing a list of components/artifacts to include in a software release.
Does anybody out there know of tools for this?
Have you found a way to get your CI infrastructure to address these remaining issues?
If you're using an artifact repository to solve this problem, how do you manage and automate the process?
This is a very broad topic, but it sounds like you want a release management tool (e.g. BuildMaster, developed by my company Inedo), possibly in conjunction with a package management server like ProGet (which you tagged, and is how I discovered this question).
To address some specific questions you have, I'll associate it with a feature that would solve the problem:
A mixture of files go into our software releases, including...
This is handled in BuildMaster with artifacts. This video gives a basic overview of how they are manually added to releases and deployed to a file system: https://inedo.com/support/tutorials/buildmaster/deployments/deploying-a-simple-web-app-to-iis
Of course, once that works to satisfaction, you can automate the import of artifacts from your existing CI tool, create them from a BuildMaster deployment plan itself, pull them from your package server, whatever. Down the line you can also have your CI tool call the BuildMaster release management API to create a release and automatically have it include all the artifacts and components you want (this is what most of our customers do now, i.e. have a build step in TeamCity create a release from a template).
Rebuilding Assemblies ... The output is a fresh build of a binary we've previously tested (hint: should we always trust these are equivalent?)
You can mostly assume they are functionally equivalent, but it's exactly the times they are not that cause problems. This is especially true with package managers that do not lock dependencies to specific version numbers (e.g. NuGet, npm). You should be releasing exactly the same binary that was tested in previous environments.
[we want] the development team to make a recipe for a software release (kinda like NuGet packages.config) that specifies components to include:
This is handled with releases. A developer can choose its name, dates, etc., and associate it with a pipeline (i.e. a set of testing stages that the artifacts are deployed to), then can "click the deploy button" and have the automation do all the work.
Releases are grouped by "application", similar to a project in TeamCity. As a more advanced use case, you can use deployables. Deployables are essentially individual components of an application you include in a release; in your case the "Documentation" could be a deployable, and maybe contain an artifact of the .pdf and .docx files. Deployables from other applications (maybe a different team is responsible for them, or whatever) can then be referenced and "included" in a release, or you can reference ones from a past release.
Hopefully that provides some overview and fits your needs. Getting into this space is a bit overwhelming because there are so many terms, technologies, and methodologies, but my advice is to start simple and then slowly build upon it, e.g.:
deploy a single, manually uploaded component through BuildMaster to a share drive, then manually deploy it from there
add a deployment plan that imports the component
add a second plan and associate it with the 2nd stage that takes the uploaded artifact and deploys it to the target, bypassing the need for the share drive
add more deployment plans and associate them with pipeline stages and promote through them all to "close out" a release
add an agent and deploy to that instead of the default localhost server
add more components and segregate their deployment with deployables
add event listeners to email team members at points in the process
start adding approvals if you require gated "sign-offs"
and so on.
Context:
Currently we manually get a git dev branch built into a package in VSO, and once that package is built it deploys to Octopus, which takes the site down for a good amount of time as all the built packages are loaded onto each server the site sits on... But, honestly, a lot of the bug fixes end up being in JS files that could easily just be hot-swapped in, and it's so annoying to have to wait for poor Ops and support to do all of this for a teensy tiny change.
The pipedream:
I would like to set up CI that allows hot swapping of js, css, cshtml and html files via VSO and Octopus... This would let small, petty changes in JavaScript files go out fast and without full deployments...
...and also have another option that lets me say, "look, I've changed some C# files so I need a built package to be pushed up to VSO and Octopus". A manual rebuild, if you will.
Question:
I'm missing the vocabulary to search for this in Google and wondered if you guys can help me on a path to setting this up.
You can include just the modified files (retaining the folder structure) in the package, push that package to the Octopus server, and deploy with it; only those files will be replaced.
My workflow for an IIS deploy:
Add a new Deployment Target in Listening Tentacle mode
Add a Deploy to IIS step to the Octopus project's process
Enable the Custom Installation Directory feature in the step's Configure Features window
Specify the custom install directory path in the step
Build tasks:
Add a Copy Files task to copy the necessary files (e.g. js)
Add a Package Application task to package these files (retaining the folder structure)
Add a Push Packages to Octopus task
Add a Create Octopus Release task
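If you prefer to build the package yourself rather than use the Package Application task, a minimal .nuspec along these lines also works (the id and paths are only illustrative); keeping the target paths identical to the site's folder structure is what lets Octopus overwrite the right files:

    <?xml version="1.0"?>
    <package xmlns="http://schemas.microsoft.com/packaging/2010/07/nuspec.xsd">
      <metadata>
        <id>MyWebApp.Hotfix</id>
        <version>1.0.1</version>
        <authors>WebTeam</authors>
        <description>Changed static files only</description>
      </metadata>
      <files>
        <!-- target paths mirror the site's folder structure -->
        <file src="Scripts\app\*.js" target="Scripts\app" />
        <file src="Content\site.css" target="Content" />
      </files>
    </package>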
Alternatively, you can include all the published files in the package (tested with a File System publish) and deploy with that package; Octopus compares the files and only replaces the modified ones (the {webapp}.dll is replaced too; I checked the Created/Modified times on the target server).
Regarding getting the changed files during the build, you can call the Get Commit (with changed items) REST API, then create/modify a build variable (e.g. depAll) through a logging command (e.g. PS: Write-Host "##vso[task.setvariable variable=depAll]Yes") based on the changed-files result, and then use this variable in the task conditions (Control Options of each task) to determine which tasks need to run. See "Specify conditions for running a task".
I was wondering if there is an easy way to use ClickOnce deployment to publish multiple versions of the same application. For instance, I have an application that we distribute to several of our sister companies, and so I need to have that company's name set in the config file so that it is displayed on the splash screen and on the reports the program outputs (and possibly change the application name as well).
My only thought on how to do this is to have separate directories on the web server and place a published deployment in each one. In Visual Studio I would have to set the correct name in the config file, publish the files to the server, go back and change the name in the config file, publish again, and so on until each company has its own deployment.
You can create an MSBuild script that will update the config file, update the location of where to publish the ClickOnce files and then start the actual ClickOnce publishing process. Have the script loop through your different companies until they are all complete.
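A rough sketch of such a script (assuming MSBuild 4.0 or later for the XmlPoke task; the company names, config key and paths below are placeholders):

    <Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="PublishAll">
      <!-- one entry per sister company -->
      <ItemGroup>
        <Company Include="Contoso">
          <PublishDir>\\webserver\clickonce\Contoso\</PublishDir>
        </Company>
        <Company Include="Fabrikam">
          <PublishDir>\\webserver\clickonce\Fabrikam\</PublishDir>
        </Company>
      </ItemGroup>

      <!-- the %(...) reference makes the MSBuild task run once per Company item -->
      <Target Name="PublishAll">
        <MSBuild Projects="$(MSBuildProjectFullPath)"
                 Targets="PublishOne"
                 Properties="CompanyName=%(Company.Identity);PublishDir=%(Company.PublishDir)" />
      </Target>

      <Target Name="PublishOne">
        <!-- stamp the company name into the config, then run the ClickOnce publish -->
        <XmlPoke XmlInputPath="App.config"
                 Query="//appSettings/add[@key='CompanyName']/@value"
                 Value="$(CompanyName)" />
        <MSBuild Projects="MyApp.csproj"
                 Targets="Publish"
                 Properties="PublishDir=$(PublishDir)" />
      </Target>
    </Project>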
My staging build configuration runs through all the build steps and finally deploys the application to, say, http://build90.qa.testsite.com/, which has been very useful for showing to stakeholders.
Right now I have a script that e-mails users the link to the deployed site, which is nice but it would be better to have a link somewhere in the TeamCity user interface so that people can go to the newly deployed website. There seems to be no built-in way to do this. If the "description" field supported parameters, I could do it there for the latest build, but nothing historic.
Honestly, the best solution I can think of right now is some sort of browser extension, but that doesn't help on mobile devices.
Your best bet might be to set up a server at, say, http://latest.qa.example.com/, and have TeamCity set that server to redirect to the latest http://buildNN.qa.example.com/. That way, users don't even have to go to TeamCity to get to the latest site.
Of course, that doesn't make it easy to go to historical runs.
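If the "latest" host runs on IIS, the redirect itself can be as small as a web.config that a TeamCity build step rewrites after each successful deployment (a sketch, using the host names from the question):

    <configuration>
      <system.webServer>
        <!-- the destination gets rewritten by the build to point at the newest buildNN site -->
        <httpRedirect enabled="true"
                      destination="http://build90.qa.testsite.com/"
                      httpResponseStatus="Found" />
      </system.webServer>
    </configuration>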
I am setting up a CruiseControl.Net server. So far, it only builds a project (.Net website), and I kind-of know how to set up unit testing, code coverage, etc in the future.
What I will need to have soon is this:
The developers commit changes to SVN continually, thus CCNet builds often.
CCNet will publish the latest version to the development server, as soon as a commit is validated (with unit tests etc).
The project manager validates a specific version, in order to publish it to the pre-production server, and create a SVN tag from this revision.
The last point is where my problem lies: how exactly can I set up things so the project manager can, for instance, browse to the CCNet web dashboard, select a previous specific build, and says "this is the build I want to publish" ?
I believe that my thinking is flawed somewhere, but I can't put my finger on it. Maybe CCNet is not the right place to do these manipulations?
In my mind, I can create an SVN tag using CCNet and mostly work from the trunk, but maybe I can't? Maybe it's the other way around, and I should add a CCNet project every time a tag is created under SVN?
The final goal is that I want to automate the publication process: zip creation (for archiving), web.config modification (using NAnt, for instance), and website publication (using FTP).
In all these steps, I want to keep manual intervention to a minimum. If I can avoid adding a new project to CCNet every time a tag or branch is created in SVN, that would be awesome.
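For the zip and web.config steps, a minimal NAnt sketch could look like this (the target name, paths and appSettings key are made up; the FTP upload would be a separate NAntContrib task or an exec of an FTP client):

    <target name="package">
      <!-- archive the build output for safekeeping -->
      <zip zipfile="artifacts/website-${revision}.zip">
        <fileset basedir="build/website">
          <include name="**/*" />
        </fileset>
      </zip>
      <!-- switch the config to the pre-production settings -->
      <xmlpoke file="build/website/web.config"
               xpath="/configuration/appSettings/add[@key='Environment']/@value"
               value="preprod" />
    </target>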
Thanks for your help, and sorry if it's not very easy to read, but it's not very clear in my head either...
Since you can create any task, you should be able to achieve the goal, though unfortunately not out-of-the-box.
Since you use SVN, it all actually depends on the revision. I think I'd create a separate project for your third scenario and add a parameter where the PM provides the revision number. Then, based on that, I'd tag the sources etc. in my own task.
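As a rough illustration of that (assuming CCNet 1.5+ dynamic parameters; the project name, repository URLs and tag naming below are made up), the project the PM forces from the dashboard could look something like this:

    <project name="Tag-Release">
      <!-- the PM supplies the tested revision when forcing the build -->
      <parameters>
        <textParameter>
          <name>Revision</name>
          <display>SVN revision to tag</display>
          <required>true</required>
        </textParameter>
      </parameters>
      <tasks>
        <!-- $[Revision] is CCNet's dynamic-parameter syntax -->
        <exec>
          <executable>svn.exe</executable>
          <buildArgs>copy -r $[Revision] http://svnserver/repo/trunk http://svnserver/repo/tags/release-$[Revision] -m "Release tag"</buildArgs>
        </exec>
        <!-- further tasks would zip, tweak web.config and publish the site -->
      </tasks>
    </project>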
Regarding the other points, I think it's similar. Recently, for web projects, we started using MSDeploy, and in each stage build the MSDeploy package was created. Then there was a separate build called Deploy that, when forced, lets us select which package we want to deploy using MSDeploy.
Having several environments, however, started to feel a bit like overkill to manage with CCNet, and I'll be looking into kwakee at some point.