Consider this repo/file structure for our solution...
Shared Repo (Checked out to D:/Shared/trunk)
├───Shared1.dll Project
└───Shared2.dll Project
App1 Repo (Checked out to C:/Code/App1/Trunk)
├───App1 Project (Refs Shared1.dll project)
├───App1.dll Project (Refs Shared1.dll and Shared2.dll projects)
└───App1.sln
App2 Repo (Checked out to C:/Code/App2/Trunk)
├───App2 Project (Refs Shared1.dll project)
├───App2a.dll Project (Refs Shared1.dll and Shared2.dll projects)
├───App2b.dll Project (Refs Shared1.dll and App2a.dll projects)
└───App2.sln
To make working with the code easier, we bring the Shared projects directly into each application's solution, so if you open App1.sln, for instance, this would be your project tree...
App1.sln
├───Shared1.dll Project
├───Shared2.dll Project
├───App1 Project (Refs Shared1.dll project)
└───App1.dll Project (Refs Shared1.dll and Shared2.dll projects)
As you can see, the two Shared DLLs are from a separate repository but are included in this solution. Visual Studio handles this without any issue, prompting you that you are updating multiple repos when you perform a commit against the solution. That's fine and is exactly what we want.
The issue we're having, however, is with NuGet. From what we understand, NuGet.config (and the hierarchy/precedence used when reading/applying them) is resolved relative to the solution file, and the projects' NuGet references are updated accordingly. This causes problems: the references to the NuGet packages in Shared1.dll and Shared2.dll are relative to App1.sln when you're working in App1.sln, so if someone else is working in App2.sln and hasn't checked out their two trunks relative to each other in exactly the same way you have, the references break.
Our work-around is to always check out all three trunks into the same folder as siblings, then put the packages folder in as another sibling, setting '../packages' in the NuGet.config next to each solution (a sketch of that NuGet.config follows the tree below). This ensures the references never break, but it forces the location of the checkouts, which can be a problem.
C:/Code/
├───Shared Trunk
├───App1 Trunk
├───App2 Trunk
└───packages
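For reference, the NuGet.config placed next to each solution looks roughly like this; a minimal sketch, where repositoryPath points at the '../packages' sibling folder described above:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- Send installed/restored packages to the shared sibling folder -->
    <add key="repositoryPath" value="../packages" />
  </config>
</configuration>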
However, if we could specify per-project package download locations, we could put the packaging folders relative to the projects themselves meaning it wouldn't matter where you check them out to. They would always find the packages they need. Yes, this means that in our example, there would be duplicate package downloads, but space on disk isn't the issue. Maintenance of the code is.
C:/Code/
├───Shared Trunk
│   └───sharedpackages
├───App1 Trunk
│   └───app1packages
└───App2 Trunk
    └───app2packages
Again, what we want is that when opening App1.sln, packages for Shared1.dll and Shared2.dll go in the 'sharedpackages' folder, while packages used by App1 and App1.dll go in 'app1packages'.
So... is this possible? Can you specify different NuGet package download paths per project regardless of which solution they are in?
I'm in the same situation as /u/MarquelV.
From my investigation so far into the options provided by NuGet (at least up to version 3.5) for tackling this sort of scenario, I concluded that you have to completely ignore the graphical tools for NuGet inside Visual Studio (at least as far as installing/restoring packages is concerned) and also disable automatic package restore (Tools -> Options -> NuGet, etc.). Instead, invoke nuget.exe from the command line whenever you need to install/restore packages, specifying the folder in which the packages should be placed. This point is important because the graphical interfaces for NuGet in Visual Studio are bent on storing packages in a "global" repository (typically right next to the .sln file of the solution).
In my projects I create a .nuget folder containing nuget.exe inside each and every project and reference the DLLs from there.
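For example, installing a package into a project-local folder from the command line looks something like this (the package name is just illustrative):
REM Install a package into this project's own packages folder instead of a solution-level one
.\.nuget\nuget.exe install Newtonsoft.Json -OutputDirectory .\packages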
Last but not least, each and every project needs to restore packages by invoking NuGet from the .csproj like so:
<!-- Restore this project's packages into its own local packages folder before the build -->
<Target Name="BeforeBuild">
  <Exec Command=".\.nuget\nuget.exe restore .\packages.config -PackagesDirectory .\packages" />
</Target>
The thing to take away from all this is that the graphical tools for NuGet and the automatic package restoration (Tools -> Options -> NuGet) cannot be relied upon to achieve the goals described here.
I recently faced a similar issue. In my case I was combining projects from smaller solutions into a larger one. The projects were still referencing the packages in their subfolders, and I did not want to change those references and break the smaller solutions. I was able to solve it by symlinking the project packages path to the solution-level path.
mklink /J .\packages ..\packages
This effectively tricks the project into thinking it is using a more local copy of the packages when it is actually using the one from the larger solution.
It's not exactly the same situation, but close enough that I hope it can help someone.
Related
We have a solution which contains several projects. Some projects have NuGet packages installed, for example Json.NET. The whole solution is checked in to TFS Version Control, without the packages folder. We have set up Automatic Package Restore according to the "Nuget 2.7+ method" as described in the Nuget documentation (actually we didn't set up that much since all this is enabled by default).
When we build this solution on another computer, all packages are getting restored.
When we build this solution on our TFS 2013 Build server, all packages are also getting restored.
Now here comes the problem:
When we create a build on our TFS 2013 Build Server which should build only one of the projects in the solution (so targeting the .csproj file instead of the .sln file) the nuget packages are NOT getting restored!
Can anyone tell me why this is happening, or whether this is by design? I really don't want to build the whole solution, since it is a release build for only a single small project, but I do want the packages to be restored automatically...
I believe the Automatic Package Restore hooks into the Build Solution event. Since there's no solution, it's not triggering the restore.
To build a single project, you may need to create a new solution that references just that project.
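Alternatively, one hedged work-around, assuming you can edit the build definition, is to add an explicit restore step that runs before MSBuild builds the .csproj, for example (the project name and paths are placeholders):
REM Restore this project's packages explicitly, since no solution build triggers the automatic restore
nuget.exe restore MyProject\packages.config -PackagesDirectory packages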
I've got a packages.config file checked into source control. This specifies the exact version of the Nuget dependency I want. We have our own NuGet repository. We are creating these NuGet packages ourselves.
<packages>
  <package id="Dome" version="1.0.0.19" targetFramework="net45" />
  <package id="Dome.Dojo" version="1.0.0.19" targetFramework="net45" />
</packages>
These packages contain some JavaScript files which, when you add the NuGet package as a reference in Visual Studio, are copied to the Scripts folder in the project.
I don't want to check these JS files in to source control, I just want to check in the packages.config file.
When my project builds in Team City (or when I build in Visual Studio after a fresh checkout), it doesn't copy the JS files from the NuGet package. There's a question here describing a similar problem:
NuGet package files not being copied to project content during build
But the solution in the answer to that question doesn't work for me; it uses ReInstall, which is problematic because it can automatically upgrade the version in the packages.config file (say, if a dependency is specified as >=).
The whole point of this is that I want to be able to check out a revision from source control and build that version with the right dependencies, AND I want to use the nice packaging features of NuGet. So I don't want any "automatically update to the latest version during the build" behaviour.
There's an issue against NuGet (http://nuget.codeplex.com/workitem/2094) about package restore not restoring content files, and it's marked as Closed By Design.
Thinking about how this works a little more, it appears to me (though I'm not 100% sure) that NuGet behaves differently for assemblies: it doesn't copy them into the project; instead it references them from their location in the packages folder. It strikes me that JS files in a NuGet package should be referenced analogously to how DLLs are referenced.
Is there a way to construct a NuGet package so that it references the JS as links in the project (in a similar way to how you can add an existing File as a Link in VS)? And would this solve my problem?
If not, then I'll take the advice given by Jeff Handley when closing the NuGet issue 2094 ticket mentioned above:
The option you'd have is to create a new console executable that
references NuGet.Core, and you could build a supplemental package
restore for your own use that copies package contents into the
project.
Writing my own command line tool to copy the contents does seem like I'm pushing water uphill here - am I doing something fundamentally wrong?
The underlying problem here is Visual Studio's relatively poor support for JavaScript projects and JavaScript's lack of a built-in module loader.
For C#, when you install a package it adds a reference in your .csproj file to the assembly on disk. When you build, MSBuild knows to copy the thing referenced to the bin directory. Since you aren't checking in your bin directory, this all works great.
Unfortunately for JavaScript, the build system isn't nearly as matured and there aren't well defined guidelines for NuGet to follow. Ideally (IMO), Visual Studio would not run web sites directly from your source directory. Instead, when you built it would copy the JavaScript files, CSS and HTML files to a bin directory from which they would be executed. When debugging, it would map those back to the original JavaScript or TypeScript files (so if you make a change it isn't to a transient file). If that were to happen then there is now a well-defined build step and presumably a well-defined tag for JavaScript files (rather than just "content"). This means that NuGet would be able to leverage that well-defined MSBuild tag and package authors could leverage the NuGet feature to do the right thing.
Unfortunately, none of the above is true. JavaScript files are run in place. If you did copy them to bin on build, Visual Studio would do the wrong thing, and editing from a debugger would edit the transient files (not the originals). NuGet therefore has no well-defined place to put files, so it leaves the decision up to the package author. Package authors know that the average user is just going to be running directly from source (no build step), so they dump files into the source folder, where they must be checked in to version control.
The entire system is very archaic if you are coming from a modern ecosystem like C# where someone took time to think these things through a bit.
What you could do is create an MSBuild task that, before the build, goes through all of your packages, looks for content, and copies that content to the desired location (a rough sketch follows). This wouldn't be terribly difficult, though it would take a bit of work.
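A minimal sketch of such a before-build target, assuming packages are restored to a solution-level packages folder and the JS content should land under the project's Scripts folder (both paths are assumptions):
<!-- Hypothetical before-build step: copy JS content shipped in packages into the project -->
<Target Name="CopyPackageScripts" BeforeTargets="Build">
  <ItemGroup>
    <PackageScripts Include="$(SolutionDir)packages\**\content\Scripts\**\*.js" />
  </ItemGroup>
  <Copy SourceFiles="@(PackageScripts)"
        DestinationFolder="$(ProjectDir)Scripts\%(RecursiveDir)"
        SkipUnchangedFiles="true" />
</Target>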
Also, package authors could include a build task that does this in their package so that all of their content is copied local before the build. Unfortunately, if only some package authors do this, you end up with weird fragmentation where some packages need to be committed to version control and others do not.
When a package is installed into a project, NuGet actually performs these operations:
1. Download the package file from the source;
2. Install the package into the so-called packages folder, which is $(SolutionDir)\packages by default;
3. Install the package into the project, which consists of adding references to DLLs, copying content files into the project directory, etc.
When a package is restored, only the first two steps are executed. Projects are not touched by NuGet package restore, which is why the JS files in your project are not "restored".
The only solution for now is to check in the js files in your project.
If you are the owner of the package, then you could use the NuGet package I've created to have a folder called "Linked" in the package, along with a simple Install.ps1 and Uninstall.ps1 (one-liners) that add every file in the package's Linked folder to the project as existing items.
https://github.com/baseclass/Contrib.Nuget#baseclasscontribnugetlinked
I didn't try out how publishing treats linked files; the problem is debugging the project, as the JavaScript files will be missing from the directories.
If you are using Git as source control, you could try my NuGet package, which ignores all the NuGet content files and automatically restores them before building.
Step by step example in my blog: http://www.baseclass.ch/blog/Lists/Beitraege/Post.aspx?ID=9&mobile=0
I am doing a code review. We have started using Nuget to import 3rd party libraries, for example Enterprise Library.
Previously we had the DLLs for Enterprise Library on a share that was accessed by all projects in the solution. When the share was updated, it was updated for all projects.
Now it is controlled by packages.config, which is a per-project file. A change would mean changing several files, with a chance that not every file that should have been updated actually was.
Is there a way to share packages.config across projects, or a way to be able to updated all packages.config at the same time?
In order to update packages in all the projects at the same time, you can launch the Manage NuGet Packages dialog at solution level (right click on the Solution).
Since you're doing a code review, allow me to say this: Enforcing package updates to its consumers is generally not a good practice.
Updating packages at the solution level is the only way to ensure that the packages.config files are updated as well (NuGet.exe update doesn't change these files).
packages.config files are project-related and can only be shared if all packages consumed by the projects are exactly the same. You could add the file as a link in the projects, update repositories.config to point to the shared packages.config, and hope for the best (I'm not sure whether NuGet checks for packages.config to be physically present in the project directory).
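If you do want to try that, the add-as-link part would look roughly like this in each consuming .csproj (the shared path is an assumption):
<ItemGroup>
  <!-- Hypothetical: reference a single shared packages.config as a linked item -->
  <None Include="..\Shared\packages.config">
    <Link>packages.config</Link>
  </None>
</ItemGroup>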
As you can see, this scenario is not mainstream and hence not supported in an optimal way.
We’re on a closed network without Internet access so we’re currently using a file share as a NuGet repository. We do a lot of merging between a development branch and a main branch and occasionally setup a one-off branch for a hotfix or large functionality that will extend beyond our normal release cycles.
What we've found is that when we add a NuGet package to a project, it puts a file path in the .csproj file for where the package is located. This works fine until we merge into another branch in TFS and then kick off a build. The builds do not pull down the same files from source control (keeping dev and main completely separate in that regard), so the package path is not found and the build fails.
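For illustration, what NuGet writes into the .csproj is a relative hint path along these lines (the package name and version are made up), and it is this relative path that breaks when the branch layout differs:
<Reference Include="Some.Library">
  <!-- Relative path written by NuGet at install time; breaks if the packages folder moves -->
  <HintPath>..\packages\Some.Library.1.2.3\lib\net45\Some.Library.dll</HintPath>
</Reference>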
We've come up with one solution we know will work but is utterly painful, and one proposed solution that we need to investigate further. The first is to put the packages folder into a common location that every build will include in its workspace and manually modify each .csproj file to point to that location for its packages.
The solution we need to investigate is using NuGet without committing packages to source control. What we need to determine is if NuGet.exe will create the folder structure needed by the .csproj file to find the package.
Are there better solutions to using NuGet in a branching environment that uses automated builds?
What you could do is create a local NuGet repository, reachable over HTTP, that can easily be used in both branch situations. Check out how to create a local NuGet repository here
With newer versions of NuGet it is possible to configure a project to automatically restore NuGet packages so that the packages folder doesn't need to be included in the source code repository. Good.
However, this command adds a new .nuget folder containing a binary, NuGet.exe. This can be re-created automatically by Visual Studio, so it doesn't feel correct to add it to version control. Yet without this folder, Visual Studio won't even load the solution properly.
How do you people deal with this? Add .nuget to source control? Run some command line script before opening the solution?
This post is old; you should not be using solution-level NuGet package restore anymore. As of version 2.7+ there is an option in the NuGet setup to automatically restore packages on build.
So the .nuget folder can be deleted and the option removed from your projects.
http://docs.nuget.org/docs/reference/package-restore
UPDATE: With the release of NuGet 4.x and .NET Standard 2.0, when you use the new csproj format you can now use package references, which ironically reintroduces the dependency on MSBuild to restore packages, but now packages are a first-class citizen of MSBuild. The link above also mentions PackageReference, but the following announcement details it better:
https://blog.nuget.org/20170316/NuGet-now-fully-integrated-into-MSBuild.html
And the NuGet 4.x RTM announcement, which ironically isn't as useful:
https://blog.nuget.org/20170308/Announcing-NuGet-4.0-RTM.html
UPDATE 2: Apparently with VS2017 you can even use package references with classic csproj projects, but they aren't backwards compatible anymore, and there have been some problems with restoring package sub-dependencies. I'm sure that will all be resolved.
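For reference, a PackageReference entry in the csproj looks something like this (the package name and version are illustrative):
<ItemGroup>
  <!-- Packages become first-class MSBuild items; no packages.config or .nuget folder needed -->
  <PackageReference Include="Newtonsoft.Json" Version="10.0.3" />
</ItemGroup>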
Richard Szalay's answer is right - you don't need to commit nuget.exe. If for some reason Visual Studio does not automatically download nuget.exe, make sure you have the following set to true in the nuget.targets file:
<!-- Download NuGet.exe if it does not already exist -->
<DownloadNuGetExe Condition=" '$(DownloadNuGetExe)' == '' ">true</DownloadNuGetExe>
Close the VS solution, reopen it and build it. Visual Studio should download nuget.exe automatically now.
According to this thread, the .nuget folder should be version controlled.
You need to commit .nuget\nuget.targets, but not nuget.exe. The targets file will download the exe if it doesn't exist, as long as you set DownloadNuGetExe to true in nuget.targets.
Although I usually don't like the idea of adding executables to source control, I would suggest that source control should contain everything required to open, build and execute the project.
In this case it sounds like the .nuget folder is a required dependency. Therefore it ought to be under source control.
The only question left, that you need to research, is how NuGet is going to react if that folder is marked read-only, which TFS will do once it has been checked in.
Update:
I did a little more research on this as I've never used NuGet before. http://blog.davidebbo.com/2011/03/using-nuget-without-committing-packages.html
I would suggest that probably what you want to do is make NuGet a requirement that has to be installed on every developer's workstation.
Further, you should place in source control the batch file required to get a workstation ready to start editing the project. The batch file is going to run the commands necessary to get and install the dependency packages.
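Such a bootstrap batch file could be as simple as this sketch (the solution name is a placeholder):
@echo off
REM Hypothetical workstation bootstrap: restore all packages for the solution
.nuget\nuget.exe restore MySolution.sln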
Beyond that I'd say you might want to contact NuGet directly to ask them how, exactly, this is supposed to work.
Now that NuGet supports package restoration, we're looking at it more closely.
We use Subversion for source control, and my initial thoughts are that .nuget should be added to our repository, but added using svn:externals so that it points to a single location.
That way we can automatically push out new versions to all developers and projects. For projects on release branches, rather than HEAD, we can specify the revision of the svn:externals reference if we want to leave NuGet alone.
We have a lot of projects, so it also means not duplicating nuget.exe multiple times in the repo.
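As a rough sketch, the external could be defined like this, assuming the shared .nuget folder lives under a tools path in the same repository (the path is an assumption):
REM Pull the shared .nuget folder into this working copy via an external definition
svn propset svn:externals "^/tools/nuget .nuget" .
svn commit -m "Reference shared .nuget folder via svn:externals"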
We have the nuget.config file in that folder, as it holds the references to our internal NuGet server in the Package Sources section:
https://docs.nuget.org/consume/nuget-config-settings
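The relevant part of that nuget.config is just the package source entry, roughly like this (the feed name and URL stand in for our internal server):
<configuration>
  <packageSources>
    <!-- Internal feed used instead of (or alongside) nuget.org -->
    <add key="InternalFeed" value="https://nuget.example.com/nuget" />
  </packageSources>
</configuration>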
That reason aside, you should let Visual Studio handle the downloading of packages.