I'm trying to streamline the NuGet packages coming out of my build system, and I'm stuck on how to package only the files that are required.
I have several configurations sharing a root VCS checkout. One configuration runs a debug build with unit tests. I also have a release configuration that does a release build and then uses the TeamCity OctoPack plugin to create the NuGet packages.
What I want to achieve is NuGet packages that don't contain the *.pdb and *.xml documentation files, as these aren't required for the release deployment.
I've looked through this page on the Octopus Deploy site:
http://docs.octopusdeploy.com/display/OD/Using+OctoPack
According to that page, OctoPack should only package up the required files by default, but it doesn't appear to be working as described, and I'm not entirely clear on what needs to be done to get around this.
It seems that one solution would be to provide a nuspec file for the projects I'm looking to deploy, but I'm wondering if there is something I'm missing before I head off down that route.
I also have some MEF plugins that are copied in by post-build events, and these aren't included in the NuGet packages even though they are needed for the application to run. I think I need to get explicit with a nuspec file, but I'd like to confirm this.
What is the simplest way of achieving what I need?
Assuming you're running a recent version of OctoPack, in your release build you can set a system parameter system.DebugType = None, which gets passed to the OctoPack build scripts and prevents the PDBs from being created.
This simply overrides the setting defined in your csproj MSBuild file (assuming C#), so you can use it wherever you want to prevent PDBs being created at the build-configuration level (not just for OctoPack). I generally prefer this approach as it protects your build from side effects of developer changes to the project file.
As for the XML documentation files, I haven't actually tried this, but you could take a similar approach and create a system parameter system.DocumentationFile = "" to blank out the output.
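If you want to check what those parameters translate to outside TeamCity, here is a hedged sketch of the equivalent direct MSBuild invocation (the solution name is a placeholder; TeamCity passes system.* parameters to MSBuild as properties in much the same way):

# Release build with PDB generation suppressed at the build level.
msbuild MySolution.sln /p:Configuration=Release /p:DebugType=None

The DocumentationFile override would be passed the same way, though I haven't verified how MSBuild treats an empty value on the command line.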
If you really want to make sure that the files have been removed, there are a couple of ways to do it. Modify your deployment process to either:
Execute your own custom PowerShell script that removes the files (a sketch follows this list), or
Include a script module from the Octopus Library that does the same. Check out the File System - Clean Directory template in the Octopus Library.
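A minimal sketch of such a cleanup script, assuming it runs as a post-deployment step. The installation-directory variable name follows Octopus conventions but should be checked against your Octopus version, and take care not to delete XML files your application actually needs:

# Remove debug symbols and XML documentation from the deployed folder.
# Octopus.Action.Package.InstallationDirectoryPath is assumed to be the
# variable holding the package installation path on your Octopus version.
$installDir = $OctopusParameters["Octopus.Action.Package.InstallationDirectoryPath"]
Get-ChildItem -Path $installDir -Recurse -Include *.pdb, *.xml |
    Remove-Item -Force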
When a private agent build starts in VSTS, it gets assigned a directory, e.g. C:\vstsagent_work\1\s
Is there a way to set this to a different path? On other CI servers, like Jenkins, I can define a custom workspace for a job. I'm dealing with a huge monorepo and have dozens of build definitions around the same repository. It makes sense (to me anyway) to share a single directory on the build agent computer.
The benefit to me is that my builds can use pre-built components from upstream repositories, if they have already been built.
Thanks for any help
VSTS build always creates a working directory per build definition. This leaves you two options:
Create a single build definition and use conditions on steps to skip the ones that aren't needed for a given run. This lets you keep using the standard steps, though it may require a PowerShell script to figure out which steps to run and which to skip. You can set variables from PowerShell using the special logging commands (see the sketch after this list).
Disable the Get Sources step and add a step that fetches sources manually. You'll need to clean the working directory, check out the right commit, and basically replicate the actions of the Get Sources step yourself. It may require some fiddling to get the behaviour right for normal builds, pull request builds, etc., but that way you take full control over the location where sources are checked out.
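For the first option, the logging commands are plain strings written to standard output; a minimal sketch of setting a variable (the variable name is a placeholder):

# Sets a pipeline variable that later steps can read and use in their
# step conditions to decide whether to run.
Write-Host "##vso[task.setvariable variable=skipFrontendBuild]true"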
I'd also recommend you investigate the 2017 project format, which uses the new <PackageReference> elements in the project files to fetch packages. The new system supports configuring a version range and can always fetch the latest available version of a package. It's a better long-term solution.
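A hedged sketch of such a reference; the package name is a placeholder, and a floating version like this resolves to the latest matching version at restore time:

<ItemGroup>
  <!-- Floats to the newest 2.x version available on the feed at restore time -->
  <PackageReference Include="MyCompany.Common" Version="2.*" />
</ItemGroup>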
No, it isn't available in the VSTS build system.
You can change the agent's working directory (C:\vstsagent_work) by re-configuring the agent and specifying another working folder, but it won't use the same source folder for different build definitions; the subfolders will still be numbered 1, 2, 3, and so on.
Scenario:
I am migrating our current VS Solution analysis setup from using the sonar-runner to using the MSBuild runner. However I am encountering a fairly significant problem.
In the old setup, we specified our project name, key and most importantly a long list of skipped projects (sonar.visualstudio.skippedProjectPattern) using the sonar-project.properties file.
This is because [WARNING: ugly legacy bad coding practice alert] we have six solutions that build dozens and dozens of projects, all out of the same git repo. A lot of the projects are common across several solutions, and we don't want them analyzed more than once. So each solution has a set of projects that it "owns" and which are analyzed as part of it, and the sonar-project.properties file for each of the other solutions specifies that those projects are to be ignored.
The Problem: In the new MSBuild Runner approach, there does not appear to be a solution-level (read: SonarQube project-level) configuration file or mechanism, aside from passing arguments on the command line to the MSBuild runner's 'begin' phase. One either has the global configuration file or the MSBuild *.*proj files (that is, MS project-level configuration files). The latter is clearly out of the question, as whether a project gets excluded from analysis depends on which solution is being analyzed.
As noted, we could conceivably pass all of this on the command line, but that is suboptimal. Our builds are done by scripts that are, to the extent possible, generic. Having the configuration in the sonar-project.properties file was a big help in keeping them that way, and we are hoping we are missing something here that will let us keep using that file or a similar one. Are we?
There currently is no equivalent to the sonar-project.properties file in the MSBuild SonarQube Runner version 1.0. I've added a new ticket to the project's backlog to consider adding this feature in an upcoming release: http://jira.sonarsource.com/browse/SONARMSBRU-124
The v1.0 MSBuild SonarQube Runner supports an /s: command line argument that allows you to specify the global settings file to use. The settings file can contain any additional global settings that you would previously have put in the sonar-project.properties file.
If you don't specify a global settings file, the MSBuild Runner will look for a default one in the same location as the runner executable.
See the documentation repo for more information: https://github.com/SonarSource/sonar-.net-documentation/blob/master/doc/appendix-2.md
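As a hedged sketch (the project key and paths are placeholders, and the settings file schema should be verified against the documentation linked above), the begin phase would look something like:

MSBuild.SonarQube.Runner.exe begin /k:"my.project.key" /n:"My Project" /v:"1.0" /s:"C:\build\SonarQube.Analysis.xml"

with SonarQube.Analysis.xml containing entries along the lines of:

<SonarQubeAnalysisProperties xmlns="http://www.sonarsource.com/msbuild/integration/2015/1">
  <!-- Any global sonar.* settings previously kept in sonar-project.properties -->
  <Property Name="sonar.visualstudio.skippedProjectPattern">YourPatternHere</Property>
</SonarQubeAnalysisProperties>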
The properties can now also be added via ItemGroups in each .csproj file, this way:
<ItemGroup>
  <!-- Excludes the matched files from copy/paste detection -->
  <SonarQubeSetting Include="sonar.cpd.exclusions">
    <Value>Models/**/*.cs</Value>
  </SonarQubeSetting>
</ItemGroup>
Our company currently uses TFS for source control and build server. Most of our projects are written in C/C++, but we also have some .NET projects and wouldn't want to be limited if we need to use other languages in the future.
We'd like to use Git for our source control, and we're trying to understand what would be the best choice of build server. We have started looking into TeamCity, but there are some issues we're having trouble with, which will probably be relevant regardless of our choice of build server:
Build dependencies - We'd like to be able to control the build dependencies for each <project, branch>. For example, have <MyProj, feature_branch> depend on <InfraProj1, feature_branch> and <InfraProj2, master>.
From what we’ve seen, to do that we might need to use Gradle or something similar to build our projects instead of plain MSBuild. Is this correct? Are there simpler ways of achieving this?
Local builds - Obviously we'd like to be able to build projects locally as well. This becomes somewhat of a problem when project dependencies are introduced, as we need a way to reference these resources or copy them locally for the build to succeed. How is this usually solved?
I'd appreciate any input, but a sample setup which covers these issues will also be a great help.
IMHO both issues you mention really fall into the configuration management category and are thus, as you say, unrelated to the build server choice.
A workspace for a project build (doesn't matter if centralized or local) should really contain all necessary resources for the build.
How can you achieve that? Have a project "metadata" git repo with a "content" file listing all your project components and their dependencies (each with its own git/other repo) and their exact versions, effectively tying them together coherently. (You may find it useful to store other metadata in this file down the road as well, such as component-specific SCM info if you use a mix of SCMs across the workspace.)
A workspace pull wrapper script would first pull this metadata git repo, parse the content file, and then pull all the other project components and their dependencies according to the content file info. Any build in such a workspace would have all the parts it needs.
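A minimal sketch of such a wrapper in PowerShell, assuming a hypothetical content file with one "name url version" triple per line:

# Pull the metadata repo, then pull every component at its pinned version.
# The repo URL and content file layout are hypothetical.
git clone https://example.com/myproject-metadata.git metadata
Get-Content metadata\content | ForEach-Object {
    $name, $url, $version = $_ -split '\s+'
    git clone $url $name
    git -C $name checkout $version  # pin to the exact version from the content file
}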
When the time comes to modify either the code in a project component or the version of one of its dependencies, you'll also need to update the content file in the metadata git repo to reflect the change and commit it; this is how your project makes progress coherently, as a whole.
Of course, actually managing dependencies is another matter. Tons of opinions out there, some even conflicting.
I am deploying a NuGet package that comes to Octopus Deploy from TeamCity. What I need to do is apply config transforms and deploy a zip archive of that package's content (Windows application binaries) to a specific folder on the tentacle. I don't even need to publish the package itself. Is there a way to achieve that?
If I understand you correctly, you should be able to achieve this fairly easily.
Add a standard step of type Deploy NuGet Package to install the NuGet package onto the tentacle for the correct environment and role, ensuring that Configuration Variables and XML Transforms are enabled. This will ensure your configs are transformed out of the box with no effort.
In order to then deploy the binaries, I would use some custom PowerShell, as this doesn't appear to be an "application type" that Octopus treats as a first-class citizen. You could write the PowerShell inline as a second step by adding a step of type Run a PowerShell script and writing the code in the Octopus UI. Not knowing whether the files are going over a network share, or any other specifics, I've not attempted to write that code.
My personal preference would be to write this as a PowerShell script that is included in the NuGet package. Octopus Deploy supports a naming convention for certain PowerShell files it finds in the package (PreDeploy.ps1, Deploy.ps1, PostDeploy.ps1). I'd write a PostDeploy.ps1 and package it up; I'd then have that script under source control and could easily make changes to it.
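A hedged sketch of such a PostDeploy.ps1; the target-path variable is a project variable you would define yourself, and the installation-directory variable name should be checked against your Octopus version:

# Copy the transformed package contents to the target folder on the tentacle.
$source = $OctopusParameters["Octopus.Action.Package.InstallationDirectoryPath"]  # assumed variable name
$target = $OctopusParameters["BinariesTargetPath"]  # hypothetical project variable
Copy-Item -Path (Join-Path $source "*") -Destination $target -Recurse -Force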
The caveat, if you aren't careful, is that you start to move deployment implementation away from Octopus. However, written carefully and generically, you can bootstrap the script with variables (such as the path to a directory or network share), which makes it reusable should the need arise and lets it behave differently in different environments.
Please pardon my ignorance if I've still not understood what you mean by the binaries being "in zipped form".
Further details on the Octopus Deploy PowerShell Scripts
Hope this helps.
You could add a post-deployment powershell script to your deployment step to zip the contents of the deployment folder after the configs have been transformed.
https://blogs.technet.microsoft.com/heyscriptingguy/2015/03/09/use-powershell-to-create-zip-archive-of-folder/
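For example, a minimal sketch using Compress-Archive (available from PowerShell 5 onwards; for older hosts, see the technique in the linked article). The installation-directory variable name is assumed, and the destination path is a placeholder:

# Zip the transformed deployment folder into an archive elsewhere on disk.
$deployDir = $OctopusParameters["Octopus.Action.Package.InstallationDirectoryPath"]
Compress-Archive -Path (Join-Path $deployDir "*") -DestinationPath "C:\Drops\MyApp.zip" -Force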
I am setting up a Continuous Integration server.
I have one issue that doesn't seem to be mentioned in the tutorials.
I have an ASP.NET web application that I need to compile and then publish.
My problem is that I seem to be able to compile the app, but when I attempt to use a buildPublisher it copies everything, including the .svn files and folders and my .cs source files.
I am using an MSBuild task to compile my source. I tried setting the MSBuild output directory to a different directory, but this didn't seem to have any effect.
What am I not understanding?
Thanks
You're probably looking for the _CopyWebApplication directive:
http://blogs.msdn.com/nmoreau/archive/2007/01/26/deploying-web-application-projects.aspx
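The classic invocation from that era looks roughly like this (paths are placeholders; _CopyWebApplication is an internal MSBuild target, so verify the details against the linked post). It copies only the content files and the compiled bin output, leaving .svn folders and .cs sources behind:

msbuild MyWebApp.csproj "/t:ResolveReferences;_CopyWebApplication" /p:Configuration=Release /p:WebProjectOutputDir=C:\Deploy\MyWebApp\ /p:OutDir=C:\Deploy\MyWebApp\bin\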
We use CruiseControl.NET to deploy our ASP.NET applications to our test servers for the QA department so this is indeed possible.
In each project we created an additional build configuration called 'Deployer' which is identical to the Debug configuration with the exception of building an additional Web Deployment Project. So by running MSBuild in the Deployer configuration we can generate our compiled output in a known location.
We then use NAnt to perform a simple copy operation to the required location, i.e. into a folder where our IIS server is configured to look for the applications.
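The NAnt side of it is just a copy task, along these lines (paths are placeholders):

<!-- Copy the Deployer configuration's compiled output to the IIS folder -->
<copy todir="\\testserver\wwwroot\MyApp">
  <fileset basedir="output\Deployer\MyApp">
    <include name="**/*" />
  </fileset>
</copy>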
I know you don't mention NAnt in your question but it's well worth getting familiar with it if you want to get the most out of CCNet.
I'm not at work at the moment but if this makes any sense and you want some additional information then let me know and I'll pull some more information together.
Hope this helps
Are your bin or obj folders checked into SVN? If so, that would contribute to or possibly cause this issue, because those folders shouldn't be under version control and so shouldn't contain any .svn folders/files.
You can configure SVN to call the .svn folders _svn instead.
Set the SVN_ASP_DOT_NET_HACK environment variable.
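A one-liner sketch for setting it machine-wide (the value is arbitrary; as far as I know the client only checks that the variable exists, and existing working copies need to be checked out again for the change to take effect):

setx SVN_ASP_DOT_NET_HACK 1 /M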