I have an application A that depends on an internal shared library B. A and B each have their own repository.
I'm using TeamCity 10 to build these two projects. Two ways I'm considering doing this:
Build B and publish the dlls as artifacts. Build A with an artifact dependency on B.
Build B and publish as a nuget package. Build A with a nuget dependency on B.
My questions are:
Which approach is better?
Artifact dependencies seem like the simplest approach and can get the job done, but if we go the NuGet route, we can generalize dependency resolution and decouple it from the build server. Another advantage I can see with NuGet is that when developers check out the solution for B, any dependencies can be resolved through NuGet. With artifact dependencies, on the other hand, some kind of pre-build script on the developer's local machine is needed to retrieve/copy the DLL artifacts, mimicking what TeamCity does with artifact dependencies on the build server (is there a better approach for this?).
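For context, the sort of pre-build step I picture developers needing with the artifact approach is roughly this (the share path, target names and folder names here are made up, not something we have today):
<!-- Hypothetical pre-build target in A.csproj: copy B's published DLLs from a shared drop location -->
<Target Name="FetchLibraryB" BeforeTargets="Build">
  <ItemGroup>
    <LibraryBArtifacts Include="\\buildshare\drops\LibraryB\latest\*.dll" />
  </ItemGroup>
  <Copy SourceFiles="@(LibraryBArtifacts)" DestinationFolder="$(ProjectDir)lib\LibraryB" />
</Target>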
If we implement nuget dependencies, why would we ever need artifact dependencies at all?
Thanks in advance for any feedback offered.
The best approach depends on how far along your projects are in development.
If project A and project B are still under active development, I suggest you use artifact dependencies. Since you will change the projects very often while developing them, in TeamCity you just need to add an artifact dependency to the build configuration. No matter how the project code changes, the build configurations do not need to change.
Once development has stabilized, NuGet dependencies are a good choice. If you use NuGet dependencies while the projects are still changing, every code change means you have to re-pack the package and reinstall it into your project.
I would prefer NuGet, for the advantages you have already mentioned in your question. It's more convenient and saves time when you want to add the shared library to a project/solution and build locally.
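For example, packing B only needs a minimal .nuspec along the lines of the following (the id, version, author and file paths are placeholders); TeamCity's NuGet Pack step or nuget pack on the command line can then produce the package that A consumes:
<?xml version="1.0"?>
<package>
  <metadata>
    <id>CompanyName.LibraryB</id>
    <version>1.0.0</version>
    <authors>Your Team</authors>
    <description>Shared library B.</description>
  </metadata>
  <files>
    <!-- Ship the Release build output as the package's lib folder -->
    <file src="bin\Release\LibraryB.dll" target="lib\net45" />
  </files>
</package>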
Related
I have a solution with a bunch of different api's in it which reference different dependencies we built in a utilities folder.
In Azure DevOps, when I go to build the solution, it complains that it can't find the dependencies of those dependencies and fails. Those dependencies point to the same packages the solution uses, which live a few folders up; so their HintPath is ../../packages as opposed to the solution projects' HintPath of ./packages.
When I build the csproj files independent of the solution (i.e. not building them as a dependency of the solution) they build just fine and find the packages folder with no issue. Nuget Restore reports no issues and we have checked in the packages to source control so there should not be anything missing. Even if there was, Nuget would restore them if they were missing.
So I'm thinking: When I build the solution and it tries to build the dependencies down below, it's trying to access the packages starting at the solution directory and then going up from there (master/../../packages/ instead of utilities/../../packages/). That folder does not exist which would cause it to fail and makes sense as to why the individual builds of the dependencies works.
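To illustrate, the references in the utility projects look roughly like this (the package name and version are just examples), while the projects sitting next to the solution file use .\packages instead:
<Reference Include="Newtonsoft.Json">
  <!-- Utility project: climbs two levels up to the shared packages folder -->
  <HintPath>..\..\packages\Newtonsoft.Json.12.0.3\lib\net45\Newtonsoft.Json.dll</HintPath>
</Reference>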
Has anyone come across this issue before in Azure Dev Ops or has any idea how to fix this?
We could edit the hint paths to point to ./ for those dependencies, which has worked, but we have a local build we still use for production that doesn't work that way and would break. We're trying to keep the two in sync.
Please let me know if you have any ideas.
Thanks
Azure Build Pipeline: Building solution dependencies fails - can't find packages but building csproj dependency files independently works fine
To resolve this issue, you could try installing the dependencies of those dependencies into the solution. That is because the NuGet team deprecated solution-level packages in NuGet 3.0:
https://github.com/NuGet/Home/issues/522
So we need to make sure that all the packages used by the solution are in the default packages folder ./packages rather than in different packages folders.
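If the projects still use packages.config, one way to enforce a single packages folder is a nuget.config at the repository root, for example:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- Every project restored under this folder shares the same packages directory -->
    <add key="repositoryPath" value="packages" />
  </config>
</configuration>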
If the above does not help, you could share your project structure in your question.
Hope this helps.
The solution that worked for me was changing the way NuGet packages are referenced in the solution. Instead of using packages.config, there's an option in Visual Studio to use PackageReference, which is the newer way of referencing NuGet packages. After switching most of our projects to the new style, the pipeline was able to restore and build.
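With PackageReference the package entries move into the project file itself, roughly like this (the package name and version are only examples):
<ItemGroup>
  <!-- Restored into the global package cache, so no HintPath to a local packages folder is needed -->
  <PackageReference Include="Newtonsoft.Json" Version="12.0.3" />
</ItemGroup>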
I have just setup TeamCity to automate our builds, our current solution has both a dev and main branch. What I am trying to achieve is to have the development branch build and publish to a development NuGet feed on our ProGet installation, and then have the main branch publish to our Main NuGet feed on ProGet server.
We are using octopus deploy to deploy the packages, within TeamCity we have the octopus deploy plugin installed and if I tick the box to run OctoPack it builds the packages and they appear as artifacts when the build completes. If I try to use the NuGet Pack build step in TeamCity I get the following error for one of our projects:
[08:33:49] : [pack] Attempting to build package from 'xxx.csproj'.
[08:33:50]W: [pack] Unable to find 'xxx.exe'. Make sure the project has been built.
The project has been built and it works with OctoPack, so why isn't it working with NuGet Pack? We have five projects being built and the first four run fine: one is a console app, one is an MVC website, and two are class libraries. The one that doesn't work is a Windows service.
The end goal here is to publish these packages to a private feed on ProGet. I don't mind using OctoPack, but ideally I wanted to remove that dependency from TeamCity; I can live with it though. However, when I try to use the NuGet Publish runner type, how do I select the NuGet artifacts that have been created for publishing?
I have been googling like mad and I cannot find any helpful links that describe what you are supposed to enter, so I would really appreciate any helpful comments/answers.
We are using version 8.15 of TeamCity.
Hopefully the following will help with at least part of your question; mainly the bit relating to how to publish packaged artifacts.
NuSpec Approach
When using the NuGet Pack build step, you can specify the Output Directory, which determines the output location of the packages. You can specify this as a relative path to the checkout directory; it's probably best to define it as a build parameter, such as %system.PackageDeployOutput%, as you'll be using it in the next step:
Next, specify a NuGet Publish build step, fill in the Package Source / API key etc, and specify the Packages to upload as
%system.PackageDeployOutput%\*.nupkg
This will pick up the packages output in the previous step. I've used this quite effectively, and the parameterisation approach encourages conventions across all your builds.
OctoPack Support
If you're using the MsBuild build step with OctoPack, you can use a similar approach by declaring a system parameter called
system.OctoPackPublishPackageToFileShare = %teamcity.build.checkoutDir%\%system.PackageDeployOutput% (note the same parameter as above)
You can declare these as root project parameters, so you get the best of both worlds. My preferred approach to packaging is currently to use nuspec files for deployable endpoints. I've found OctoPack to be a bit of an overhead when it comes to more complex deployments (it's fine for basic MsBuild projects).
Normally "continuous package integration" involves source control, a build server, and participating teams fetching updated packages as often as they like. But I'm looking for a more extreme version of this story - without CM - that happens entirely on a developer's machine, all in one swoop. A more detailed description of what I want goes like this...
Using Visual Studio 2010 or 2012, assume a "foo.csproj" application that implements a plugin system. Each plugin represents a nuget package and has a corresponding Visual Studio project. Each of these projects is part of the same VS solution that contains the base application.
I want the following development story:
Change source code for a plugin.
Build solution, or perform a debug-launch, which causes msbuild to...
rebuild the changed plugin(s)
nuget then packages and uploads each plugin to a local repository (which can be just a subfolder of the VS solution)
rebuild the base application.
refresh the base application's nuget-plugin dependencies, which were just updated in prior steps.
Notes:
This assumes MSBuild magically knows not to perform this last step until all plugins are built, packaged, and uploaded.
The "foo" application could itself use nuget.core to refresh the packages, but in this case I'm assuming that the VS build process did this step.
I would like to know if this story is common enough that there are "common" (msbuild?) scripts for this.
My own guess of how this should be handled is as follows:
All plugin projects are placed in a common "Plugins" folder somewhere in the VS solution folder structure.
The base application "project dependencies" are configured with references to all the plugin projects.
Note: I don't like the idea of managing these project dependencies manually.
The base application "foo.csproj" has a build step that scans the "foo.csproj" XML for dependencies it has in the "plugins" folder, and initiates the nuget packaging and deployment for each.
The base application then initiates the nuget "update all". Hopefully this is possible even though msbuild is already mid-stride in execution.
In short, the base application is able to instantly consume plugins that have been altered. This is done without check-ins, a build-server, or manual and arbitrary requests to update plugin packages.
If pre-existing scripts do not already exist for this story, then I'll make my own. But I'd still like to know:
Can step 2, immediately above, be converted to something generic? That is, how can I convince msbuild not to build the "base application" until all projects in a particular folder have already been built? Remember, I'd like not to manage the project dependencies manually.
Is there anything flawed with this overall approach?
I would be particularly interested to know if there is an already existing nuget-visual-studio integration that assists with this story that I may have overlooked.
That's quite a long question to answer, so not sure I'm covering everything in this one; I'll do my best. First, your scenario is not uncommon. The first 2 steps of your planned approach seem OK to me (you're free to choose the location of the plug-in projects).
One thing's for sure: you'll have to manually define the build order, because your solution has no way of knowing whether the projects consuming (NuGet) plug-ins have a dependency on the projects containing the source code for those plug-ins. Instead of using the built-in Build Order dialog in Visual Studio, take a look at this post on the MSDN blog for a correct way of doing this (or you might end up with something that works locally but not on the build server).
The key MSBuild elements in the referred post are the following:
<ProjectReference Include="... foo.csproj">
  <!-- Forces the referenced project to build first, without adding a compile-time reference to its output assembly -->
  <ReferenceOutputAssembly>false</ReferenceOutputAssembly>
</ProjectReference>
Now, as for the packaging, deployment and consumption of those plug-ins:
Each plug-in project should trigger package creation and publication in a post-build step. This post on my blog contains a ZIP download with quite a lot of MSBuild stuff you can use to get started. E.g. I version, package and publish the NuGet package for a class library in Release builds. I'm using the NuGet command line tool to pack (command reference) and push (command reference) the package.
The consuming application project(s) should run NuGet.exe update <packages.config> (command reference) in a pre-build step.
Also pay attention that you're NOT running builds in parallel.
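As a rough sketch (the feed location, paths and target names are placeholders, and it assumes a local folder feed inside the solution rather than a push to a remote server), the post-build step of a plug-in project and the pre-build step of the consuming application could look something like this:
<!-- In each plug-in project: pack and publish to a local folder feed after a successful Release build -->
<Target Name="PublishPlugin" AfterTargets="Build" Condition="'$(Configuration)' == 'Release'">
  <Exec Command="nuget.exe pack $(MSBuildProjectFullPath) -OutputDirectory $(SolutionDir)LocalFeed -Properties Configuration=Release" />
</Target>

<!-- In the consuming application project: pull updated plug-in packages before building -->
<Target Name="UpdatePlugins" BeforeTargets="Build">
  <Exec Command="nuget.exe update $(MSBuildProjectDirectory)\packages.config -Source $(SolutionDir)LocalFeed" />
</Target>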
Since TeamCity supports many languages, this question might be very tightly coupled with the way projects work in Visual Studio. So here is my question:
For instance, I have a project A in Visual Studio, and I want to add it to TeamCity in order to support continuous integration. Now there is another project B. This project depends on project A. Now, how do I reference project A in project B? And how do I update it properly? I know that I can generate artifacts in project A, but am I then supposed to download them manually?
You could use NuGet for that. Please take a look at this blog post for more details about TeamCity + NuGet integration.
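Once A is published as a NuGet package, project B references it like any other package; for example, in B's packages.config (the id and version are placeholders):
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- Project B consumes the package produced from project A's build -->
  <package id="CompanyName.ProjectA" version="1.0.0" targetFramework="net45" />
</packages>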
I found that nuget seems to always install packages under the folder of the visual studio project.
It is not feasible for me because the package that I'm going to distribute contains a huge amount of data. I don't want to make a copy of it whenever I add the package to a new Visual Studio project.
I want that data to be shared between projects. Since it is shared, if one project removed that package, the data should stay there until I explicitly tell the system to remove it.
Is there any way that can deal with this kind of problem?
I heard that Maven installs packages in a global location and it doesn't have such a problem. How about using Maven to install .NET libraries, is that possible? What would be the potential problems?
To upload and store your .NET artefacts you'll need a Maven repository manager like Nexus, Artifactory or Archiva. Good news is that these are capable of storing files of any type.
If you don't fancy converting your build process over to Maven, I'd recommend the following answer on using Apache Ivy with MSBuild. All Maven clients appear to cache their downloads for use across projects (they're basically intelligent downloaders).
The upcoming 2.0 version of Nexus promises integration with NuGet. I'm expecting better .NET support from Maven in the future.