How to update dependencies not referenced directly as PackageReference via NuGet? - visual-studio

When using "legacy"-style .csproj project files we have a separate packages.config file where all dependencies are listed, including transitive ones. This enables a use case when one installs a package with dependencies and then decides which transitive dependencies can be manually updated. So, the benefits are:
Dependencies are easily identifiable thanks to the flat list
Fine-grain control over all dependency versions
E.g., after installing Autofac.WebApi2.Owin from NuGet, packages.config lists the package together with all of its transitive dependencies, which are clearly visible and can be updated manually very easily.
When using the new Sdk-style .csproj projects, NuGet references are added as <PackageReference/> elements to the project file itself, and transitive dependencies are resolved silently by MSBuild:
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net462</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Autofac.WebApi2.Owin" Version="4.0.0" />
  </ItemGroup>
</Project>
So, to update transitive dependencies, one would have to
Identify them (e.g. via obj/project.assets.json)
Add all of them explicitly to the project
Perform updates
And this has to be done after each update, for every (!) transitive dependency in the project, which is clearly impractical.
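For the second step, promoting transitive dependencies to direct references does work today, because NuGet's "nearest wins" rule lets a direct reference override the transitively resolved version. A sketch (the dependency list and version numbers are illustrative, read by hand from obj/project.assets.json):

```xml
<ItemGroup>
  <PackageReference Include="Autofac.WebApi2.Owin" Version="4.0.0" />
  <!-- Transitive dependencies promoted to direct references so NuGet
       can update them; package ids and versions below are illustrative. -->
  <PackageReference Include="Autofac" Version="4.2.1" />
  <PackageReference Include="Autofac.Owin" Version="4.0.0" />
  <PackageReference Include="Autofac.WebApi2" Version="4.0.0" />
</ItemGroup>
```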
Possible resolutions:
Adding transitive dependencies to the project automatically
Show transitive dependency updates in NuGet GUI
Unfortunately, no such feature was found in the documentation.
So, is there an easy way to get the best of both worlds?

Not possible at the moment but a discussion is open on Github.

So, is there an easy way to get the best of both worlds?
I think NuGet already has an easy way to get the best of both worlds.
When using the new Sdk-style .csproj projects, you will notice that there is a tree structure of transitive dependencies under the project's References node:
With this structure, we not only get the flat list but can also clearly see the specific dependencies between packages. In the "legacy"-style .csproj we knew the flat list, but we could not see the specific dependencies between each package: we had to select each package and then check its Dependencies, which was very inconvenient.
Besides, we generally should not reach past a package and update its dependencies directly, as this can introduce many conflicts between dependencies. When using the new Sdk-style projects, NuGet hides the dependencies of each package, so the NuGet Package Manager UI and the .csproj project file stay very simple.
If you still want to update one of the dependencies individually, you can install it as a direct reference; NuGet will prompt you that you are updating the package from version 1 to version 2, e.g. for Autofac:
In this case, you can update dependencies not referenced directly as PackageReference via NuGet.
For more details, you can refer to the blog post below:
https://blog.nuget.org/20170316/NuGet-now-fully-integrated-into-MSBuild.html

The ability to control transitive package reference updates is now being tracked in https://github.com/NuGet/Home/issues/5553

Related

Package dependency when dependency not on nuget yet

I have a Xamarin binding, and another one which depends on the first. In the second binding the dependency to the first requires the dependency to be published on nuget. However, as both are updated together, it's not possible to build the second binding without publishing the first.
Ideally what I'd like to do is depend on the first package locally when building, but in the package .nuspec depend on the nuget package. As the first package will be published first, when the second package is published the dependency can be satisfied. Is this possible?
Do they need to live apart? Couldn't you have both projects live in the same solution? This way you could build them together, then package these two libraries separately afterwards.
Alternatively, you could create a local repository, which is just a folder where you put your dependencies, and you can simply point to that folder as a NuGet repository. You can configure that through VS, or you can create a NuGet.config in the root of the repository and add an entry looking something like:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="Local Packages" value="path/on/disk" />
  </packageSources>
</configuration>
This should automatically be picked up by NuGet and it will try to restore packages from here too.
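If the local folder should complement nuget.org rather than replace it, both sources can be listed in the same file:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- local folder for your own packages -->
    <add key="Local Packages" value="path/on/disk" />
    <!-- public feed as a fallback for everything else -->
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>
```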

Gradle share dependencies in a cascade manner between related projects

I have the following Java projects structure:
Util
|
-- Core
|
-- Services
|
-- Tools
The Tools and Services projects reference the Core and Util projects. The thing is, I ended up writing the same dependencies in each project; there must be a better way to inherit the dependencies of the referenced projects and add new ones if needed.
I know about multi-project builds in Gradle, but this is not like a multi-project setup, since I can basically take the Core library, compile it (which will then contain the Core + Util libs) and use it in another project.
I wonder what would be the best way to approach this?
Repeating the same dependencies in every project is usually reasonable, because in a bigger project you never know when they will diverge, and you don't want to deal with compilation/runtime problems when someone changes the common dependency list.
I believe that it is more pragmatic to add dependency analyser plugin to your build. It will help you to remove unnecessary dependencies and explicitly add transitive dependencies. And if you add this plugin to your build chain, it will help you to keep your dependencies healthy in the future. Pick this plugin here gradle-dependency-analyze, or maybe there is a better fork or equivalent somewhere.
Your options are actually limited in your case, because there are only two kinds of dependencies: (1) external (some other jar artefact) or (2) internal (another module in a multimodule build).
2.1 When you use an external maven-like dependency, it will come with its own dependencies (these are named "transitive dependencies"). It means that if you do compile 'yourgroup:Core:1.0' then you will get Util as a transitive dependency. But as I mentioned above, it is better to list transitive dependencies explicitly if they are used during compilation, or to prevent them from being accidentally removed and crashing your application at runtime.
2.2 If your projects live in the same version control repository and usually change and build together, then the multimodule layout is your best choice. In this case, you will refer to the Core dependency like compile project(':Util:Core') and it will grab Util as a transitive dependency as well. And you will be able to do what you asked for and define dependencies for Services and Tools once, inside the subprojects {} closure in Core/build.gradle.
Having a multimodule build doesn't limit you from using the Core library elsewhere. Whether or not it is a multimodule build, you can always add the maven-publish plugin to Core/build.gradle, execute the publishToMavenLocal task and reference Core.jar from another project the same way you do for external dependencies.
You can always put your common code (like the one which will add common dependencies) in the external gradle file or custom plugin and apply it in Services and Tools.
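The subprojects {} idea from 2.2 could be sketched like this in Core/build.gradle (the coordinates are placeholders; 'compile' matches the configuration used in this answer, newer Gradle versions use 'implementation'):

```groovy
// Core/build.gradle -- a sketch; applies to every subproject of Core
subprojects {
    apply plugin: 'java'
    dependencies {
        // shared internal dependency; pulls in Util transitively
        compile project(':Util:Core')
        // example of a shared external dependency
        compile 'junit:junit:4.12'
    }
}
```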

Import custom libraries into main project without breaking NuGet

I have a bunch of personal libraries I reuse. These are kept in their own separate git repositories:
\Libs.Motors
\Libs.ComputerVision
\Libs.*
These each rely upon NuGet packages such as log4net.
I want to be able to copy specific release versions of these libraries into my projects.
\DestructoRobot
\Libs.Motors
\Libs.ComputerVision
\MainProject
DestructoRobot.sln
This appears to break when a particular NuGet package has differing versions in the \Libs.* compared to the solution's MainProject.
\DestructoRobot
\packages
\log4net.2.0.5
\Libs.Motors
\packages
\log4net.1.9.2
\MainProject
How do I ensure that NuGet works smoothly when copying over projects and continues to use old package references?
What is the recommended way of structuring this?
You have to remove the strict version dependency in your libraries and/or main project, i.e. add allowedVersions constraints to the package references in your libraries and/or main project.
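With packages.config this is expressed through the allowedVersions attribute on a package entry; a sketch (the log4net version range is illustrative):

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- accept any log4net from 1.9.2 up to, but not including, 3.0 -->
  <package id="log4net" version="2.0.5" allowedVersions="[1.9.2,3.0)" />
</packages>
```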

Coding when NuGet (or Maven) is used for enterprise project dependencies?

Suppose that a large project is split into multiple projects, each housed in an individual Mercurial repository (as per What's a good way to organize projects with shared dependencies in Mercurial?).
Suppose also, that a dependency manager is being used internally (we're using NuGet, but the same could apply to Maven) so that:
ProjectA depends on Ninject and MongoDB
ProjectB depends on ProjectA, and log4net
Projects A and B can be built independently; NuGet automatically downloads both OSS and internal dependencies from a NuGet server (ProGet in this case).
Suppose finally, that ProjectB depends on v1.2.3.4-SNAPSHOT of ProjectA, and that a CI server continually updates the ProjectA.1.2.3.4-SNAPSHOT package in the NuGet server. Thereby ProjectB will always be developed against the latest checked in changes of ProjectA.
What if related changes are required in both Project A and B? What neat and clever ways are there to do this properly? Some ideas:
Developer checks out Project A and B. Changes are made to A, built, and checked in. Developer waits for CI server to build and update the NuGet server. Changes are made to B, built, and checked in. (I dislike this as code is being checked in as part of development process.)
Developer checks out Project A and B, and rewires B to use A source as a dependency (instead of NuGet package ProjectA). Changes are done to both A and B. Check in is performed for both A and B together after proper testing, but developer must ensure dependency changes are not checked in.
I'm not particularly good at this, so I think that someone will blow my ideas out of the water with something quite clever.
I don't know how NuGet does it, but with Maven, your second idea works fine, apart from 'rewires B to use A source as a dependency' being unnecessary. You would just build A locally (using install) and it would be installed to your local Maven repo. Then when building B, it will pick up the newly built A rather than the one from the central repo.
I can think of the following with NuGet:
In Project A drop the packages at a central location (in this example I am placing it in c:\localpackages)
Build A with
MSBUILD.exe /t:Build,Package A.csproj
With Project B you could add a .nuget\nuget.config that specifies
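The config file contents did not survive in the post; judging from the surrounding text and the linked release notes about specifying the package folder location, it presumably looked something like this (the path matches the example above):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- point NuGet's packages folder at the shared drop location -->
    <add key="repositoryPath" value="c:\localpackages" />
  </config>
</configuration>
```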
(you can read more about specifying package folder location here http://docs.nuget.org/docs/release-notes/nuget-2.1)
This should pick up the changes made by project A. It might become tricky when you have different versions of the NuGet package dropped for A.
Hope this helps.

package managers, project structure and migration

I have a solution with multiple projects in it, so for example say 10 testing related projects have a dependency on nunit. Currently my solution structure includes folders for Tools and Lib, so maybe the full nunit download is in Tools and just the dll in Lib.
I suppose any package manager (NuGet and OpenWrap being the two I'm looking at) needs to create its own 'known' location for packages. Whereas with the old-fashioned way of package management, after manually updating my Lib folder, I know every project that had a dependency on nunit just got updated.
But if I update with a package manager, I need to visit each and every project to ensure it is updated and pointing at the same reference, yes? And some dlls may not be available as packages (I'm thinking of unHAddins right now), so you aren't completely liberated from manual package management. Meaning migration to the latest updates isn't done until each and every project is updated by the package manager.
So I'm wondering if my understanding is correct, what the best approach to incorporating package management into a decent sized solution is - for example:
0) add to source control: NuGet 'packages' folder or OpenWrap 'wraps' folder
1) pick a dll (start with one that you believe has minimal dependencies)
2) pick a project (ideally with minimal dependencies that might break)
3) if OpenWrap, get the package you want into 'wraps'
4) for each project:
a) add reference to subject dll (manually if OpenWrap, NuGet will add for you)
b) fix compile error as needed
c) run tests
Does that sound right?
Cheers,
Berryl
To answer your questions: no, you don't have to do anything with OpenWrap; all projects import all dependencies within a scope, so the update applies to everything.
I can't answer for the other package managers out there, but in OpenWrap you'd add the /wraps folder to source control with the packages that got pulled when you added or updated them. The process would be to first add the package from a remote repository (or create one from your existing assemblies if there's not one available), and manually remove the references from /lib. In OpenWrap, we don't add the references to your csproj; we add them at build time, so if there's already a dependency in /lib, we won't add it. That means you can add all the packages and remove the references one after the other, running your tests every time.
Hopefully, this is a temporary problem until all dlls are available as packages, which will happen rather quickly.
