I have a bunch of personal libraries I reuse. These are kept in their own separate git repositories:
\Libs.Motors
\Libs.ComputerVision
\Libs.*
These each rely upon NuGet packages such as log4net.
I want to be able to copy specific release versions of these libraries into my projects.
\DestructoRobot
    \Libs.Motors
    \Libs.ComputerVision
    \MainProject
    DestructoRobot.sln
This appears to break when a particular NuGet package is referenced at different versions in a \Libs.* project and in the solution's MainProject:
\DestructoRobot
    \packages
        \log4net.2.0.5
    \Libs.Motors
        \packages
            \log4net.1.9.2
    \MainProject
How do I ensure that NuGet works smoothly when copying over projects and continues to use old package references?
What is the recommended way of structuring this?
You have to remove the strict dependency in your libraries and/or main project, i.e. add allowedVersions to the package entries in your libraries and/or main project.
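For example, in packages.config terms that means widening the exact pin to a version range via allowedVersions (the range below is illustrative):

<package id="log4net" version="1.9.2" allowedVersions="[1.9,3.0)" />

With a range in place, NuGet can settle on a single log4net version that satisfies both the library and the main project, instead of tripping over the exact-version mismatch.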
I would love to use vcpkg to manage our dependencies on third-party libraries, but I'm not sure how this would work in our environment. We have hundreds of projects in our solution, and release new versions of our software over time. This is a very common situation in software development, but I don't understand how vcpkg can work effectively here.
Global installation will not work. We need to tie our releases to specific versions of packages. So we're left with using manifests. In a manifest, you specify a "builtin-baseline" which is a great way to pin the build of a project to a point in time, with some guarantees that the dependencies between the different packages are correct.
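For reference, a minimal manifest along those lines (the project name, dependencies, and baseline SHA are placeholders):

{
  "name": "my-project",
  "version": "1.0.0",
  "builtin-baseline": "0123456789abcdef0123456789abcdef01234567",
  "dependencies": [ "boost-asio", "fmt" ]
}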
This is all great for a project. How does one manage hundreds of projects?
We would like to use the same baseline across all projects. Is there a way to specify the builtin-baseline in one place?
Some packages are common to all projects. Is there a way to specify this in one place?
We use project files for building. By default, vcpkg uses a project-local install directory, and we don't need hundreds of copies of Boost. There is a vcpkg project setting for "Installed Directory" which allows a global location, but again, this setting is per-project, and I don't see any integration with .props files. Is there a way to manage this?
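For illustration, the kind of one-place override I was hoping for would be a Directory.Build.props next to the solution; I'm assuming here that the "Installed Directory" setting is backed by an MSBuild property named VcpkgInstalledDir:

<Project>
  <PropertyGroup>
    <!-- Shared install location for every project; the property name is my assumption -->
    <VcpkgInstalledDir>$(SolutionDir)vcpkg_installed\</VcpkgInstalledDir>
  </PropertyGroup>
</Project>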
Is there a way to get a solution-wide listing of all packages?
Would using CMake make this all easier?
This question is similar to one described here.
When using "legacy"-style .csproj project files, we have a separate packages.config file where all dependencies are listed, including transitive ones. This enables a use case where one installs a package with dependencies and then decides which transitive dependencies can be manually updated. So, the benefits are:
Dependencies are easily identifiable due to presence of a flat list
Fine-grain control over all dependency versions
E.g., after installing Autofac.WebApi2.Owin from NuGet, packages.config lists the package together with all of its transitive dependencies, and those clearly visible transitive dependencies can be updated manually very easily.
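The flat list looks roughly like this (the exact package set and versions are illustrative, not the real Autofac.WebApi2.Owin dependency tree):

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Autofac" version="4.0.0" targetFramework="net462" />
  <package id="Autofac.Owin" version="4.0.0" targetFramework="net462" />
  <package id="Autofac.WebApi2" version="4.0.0" targetFramework="net462" />
  <package id="Autofac.WebApi2.Owin" version="4.0.0" targetFramework="net462" />
  <package id="Owin" version="1.0" targetFramework="net462" />
</packages>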
When using the new Sdk-style .csproj projects NuGet references are added as <PackageReference/> to the project file itself and transitive dependencies are referenced by MSBuild silently:
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net462</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Autofac.WebApi2.Owin" Version="4.0.0" />
  </ItemGroup>
</Project>
So, to update transitive dependencies, one would have to
Identify them (e.g. via obj/project.assets.json)
Add all of them explicitly to the project
Perform updates
And this has to be done after each update and for every (!) transitive dependency in the project, which is clearly almost impossible.
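For instance, pinning just one transitive dependency means promoting it to a direct reference next to the top-level package (the Autofac version here is illustrative):

<ItemGroup>
  <PackageReference Include="Autofac.WebApi2.Owin" Version="4.0.0" />
  <!-- promoted from transitive to direct so its version can be controlled -->
  <PackageReference Include="Autofac" Version="4.1.0" />
</ItemGroup>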
Possible resolutions:
Adding transitive dependencies to the project automatically
Show transitive dependency updates in NuGet GUI
Unfortunately, no such feature was found in the documentation.
So, is there an easy way to get the best of both worlds?
Not possible at the moment, but a discussion is open on GitHub.
So, is there an easy way to get the best of both worlds?
I think NuGet already has an easy way to get the best of both worlds.
When using the new Sdk-style .csproj projects, you will notice that the References node shows a tree structure of transitive dependencies.
With this structure, we not only get the flat list, we can also clearly see the specific dependencies between packages. In the "legacy"-style .csproj we could see the flat list, but we could not see the specific dependencies between each package; we had to select each package and check its Dependencies, which is very inconvenient.
Besides, we generally do not go over the package itself and update its dependencies directly, as this would bring a lot of conflicts between dependencies. When you use the new Sdk-style projects, NuGet hides the dependencies of each package, so the NuGet Package Manager UI and the .csproj project file look very simple.
If you still want to update one of the dependencies individually, you can install it directly; NuGet will prompt that you are updating the package from version 1 to version 2, e.g. for Autofac.
In this case, you can update dependencies not referenced directly as PackageReference via NuGet.
For more detail info, you can refer to below blog:
https://blog.nuget.org/20170316/NuGet-now-fully-integrated-into-MSBuild.html
The ability to control transitive package reference updates is now being tracked in https://github.com/NuGet/Home/issues/5553
Suppose that a large project is split into multiple projects, each housed in an individual Mercurial repository (as per What's a good way to organize projects with shared dependencies in Mercurial?).
Suppose also, that a dependency manager is being used internally (we're using NuGet, but the same could apply to Maven) so that:
ProjectA depends on Ninject and MongoDB
ProjectB depends on ProjectA, and log4net
Projects A and B can be built independently; NuGet automatically downloads both OSS and internal dependencies from a NuGet server (ProGet in this case).
Suppose finally, that ProjectB depends on v1.2.3.4-SNAPSHOT of ProjectA, and that a CI server continually updates the ProjectA.1.2.3.4-SNAPSHOT package on the NuGet server. Thereby, ProjectB will always be developed against the latest checked-in changes of ProjectA.
What if related changes are required in both Project A and B? What neat and clever ways are there to do this properly? Some ideas:
Developer checks out Projects A and B. Changes are made to A, built, and checked in. Developer waits for the CI server to build and update the NuGet server. Changes are made to B, built, and checked in. (I dislike this, as code is being checked in as part of the development process.)
Developer checks out Projects A and B, and rewires B to use A's source as a dependency (instead of the NuGet package ProjectA). Changes are made to both A and B. Check-in is performed for both A and B together after proper testing, but the developer must ensure the dependency rewiring is not checked in.
I'm not particularly good at this, so I think that someone will blow my ideas out of the water with something quite clever.
I don't know how NuGet does it, but with Maven your second idea works fine, apart from 'rewires B to use A source as a dependency' being unnecessary. You would just build A locally (using install) and it would be installed to your local Maven repo. Then when building B, it will pick up the newly built A rather than the one from the central repo.
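In other words, something along these lines (the project layout is assumed):

cd ProjectA
mvn clean install    # installs A's artifact into the local ~/.m2 repository
cd ../ProjectB
mvn clean package    # picks up the locally installed A before the central repo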
I could think of the following with NuGet:
In Project A, drop the packages at a central location (in this example I am placing them in c:\localpackages)
Build A with
MSBUILD.exe /t:Build,Package A.csproj
With Project B, you could add a .nuget\nuget.config that points the package folder at that location, along these lines:
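<configuration>
  <config>
    <!-- repositoryPath redirects where NuGet restores packages (NuGet 2.1+) -->
    <add key="repositoryPath" value="c:\localpackages" />
  </config>
</configuration>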
(you can read more about specifying package folder location here http://docs.nuget.org/docs/release-notes/nuget-2.1)
This should pick up the changes made by Project A. It might become tricky when you have different versions of the NuGet package dropped for A.
Hope this helps.
I have many projects that use a bunch of exactly the same classes.
Is there a way to add a script to Xcode so that, each time I compile, it goes to a network folder and updates its files from there, if newer? (I do this step manually, but it would be great to automate it.)
Thanks
You could add a "run script" build phase to copy over files before compiling, if that's really what you want to do. That would catch updates for you, but I don't think it would help if new files are added (though copying them into a location your project points to with a folder reference, rather than a group, might work).
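A minimal sketch of such a phase, assuming the shared classes live at a network path like /Volumes/Shared/CommonClasses:

# Run Script build phase: pull newer copies of the shared sources before compiling
rsync -a --update "/Volumes/Shared/CommonClasses/" "${SRCROOT}/Shared/"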
That said I think there's a better solution. It sounds like you're reinventing a process for managing project dependencies when you could use existing tools. I would publish those shared classes as a library and add it to each project using CocoaPods and a reference to the library's git repository. That way you just need to run a pod install to get the latest version of your library. A good dependency manager gives you a clear understanding of which version of your dependencies you're currently using, control over when to update them, handles installing dependencies of your dependencies, and will avoid link errors from multiple static libraries attempting to each include a copy of the same common dependency.
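A Podfile for that setup could look roughly like this (the library name, target, and URL are placeholders):

# Podfile
target 'MyApp' do
  pod 'SharedClasses', :git => 'https://example.com/shared-classes.git', :tag => '1.0.0'
end

Bumping the :tag and re-running pod install is then the whole update story.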
I have a solution with multiple projects in it, so for example say 10 testing related projects have a dependency on nunit. Currently my solution structure includes folders for Tools and Lib, so maybe the full nunit download is in Tools and just the dll in Lib.
I suppose any package manager (NuGet and OpenWrap being two I'm looking at) needs to create its own 'known' location for packages. Whereas with the old-fashioned way of package management, after manually updating my Lib folder, I know every project that had a dependency on nunit just got updated.
But if I update with a package manager, I need to visit each and every project to ensure it is updated and pointing at the same reference, yes? And some DLLs may not be found (I am thinking of unHAddins right now), so you aren't completely liberated from manual package management. Meaning migration to the latest updates isn't done until each and every project is updated by the package manager.
So I'm wondering if my understanding is correct, what the best approach to incorporating package management into a decent sized solution is - for example:
0) add to source control: NuGet 'packages' folder or OpenWrap 'wraps' folder
1) pick a dll (start with one that you believe has minimal dependencies)
2) pick a project (ideally with minimal dependencies that might break)
3) if OpenWrap, get the package you want into 'wraps'
4) for each project:
a) add a reference to the subject dll (manually if OpenWrap; NuGet will add it for you)
b) fix compile error as needed
c) run tests
Does that sound right?
Cheers,
Berryl
To answer your questions: no, you don't have to do anything with OpenWrap; all projects import all dependencies within a scope, so the update applies to everything.
I can't answer for the other package managers out there, but in OpenWrap you'd add the /wraps folder to source control with the packages that got pulled when you added or updated them. The process would be to first add the package from a remote repository (or create one from your existing assemblies if there isn't one available), and manually remove the references from /lib. In OpenWrap, we don't add the references to your .csproj; we add them at build time, so if there's already a dependency in /lib, we won't add it. That means you can add all the packages and remove the references one after the other, running your tests every time.
Hopefully, this is a temporary problem until all dlls are available as packages, which will happen rather quickly.