When NuGet restore generates project.assets.json, can we get callbacks to see where a dependency is coming from?

project.assets.json lists assemblies as dependencies of our projects, and some of those dependencies should not be there. I looked at all of one project's NuGet package dependencies on nuget.org and I don't see those assemblies listed. Is there a way to get callbacks, or some other mechanism, to determine where a dependency is coming from when nuget restore runs?
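As far as I know, restore itself does not expose callbacks, but project.assets.json records, for every resolved package, the dependencies that package declared, so the file can be walked backwards to find which package pulled a given assembly in. Below is a minimal sketch in Python against a synthetic assets fragment (the real file is written under obj/ by restore; the package names here are made up). Recent .NET SDKs also ship a `dotnet nuget why` command for exactly this question, if your NuGet version has it.

```python
import json

# Synthetic, minimal fragment in the shape of project.assets.json
# (real files are produced by `nuget restore` / `dotnet restore` under obj/;
# all package names here are hypothetical).
ASSETS = json.loads("""
{
  "targets": {
    "net6.0": {
      "TopLevel.Lib/1.0.0": {"type": "package",
                             "dependencies": {"Transitive.Dep": "2.0.0"}},
      "Transitive.Dep/2.0.0": {"type": "package"}
    }
  }
}
""")

def who_brings_in(assets, target_name):
    """Return (tfm, parent, parent_version, requested_range) tuples for every
    package whose dependency list mentions target_name."""
    parents = []
    for tfm, packages in assets["targets"].items():
        for pkg_id, info in packages.items():
            name, _, version = pkg_id.partition("/")
            deps = info.get("dependencies", {})
            if target_name in deps:
                parents.append((tfm, name, version, deps[target_name]))
    return parents

print(who_brings_in(ASSETS, "Transitive.Dep"))
```

Running this against your real obj/project.assets.json (load the file instead of the inline string) shows which direct or intermediate package declared the unexpected dependency.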

Related

Installing upstream NuGet packages from my Azure DevOps Artifacts feed (to get them saved in that feed) has no effect

The documentation for consuming packages from upstream sources in our Azure DevOps Artifacts feed says to install packages from the NuGet Package Manager Console and then those packages and all their dependencies will be saved in our Artifacts feed and will be visible on the NuGet page in Visual Studio.
But when I do exactly that, NuGet says that the packages are already installed and does nothing. If I try deleting my packages directory and clearing the local NuGet caches first, NuGet says I have to do a NuGet restore first. And if I try to do the restore, NuGet says it can't find the packages in the package source.
Is there some undocumented step that I'm missing?
You are not missing any undocumented steps.
When the restore cannot find the packages in the package source, make sure that:
your Azure DevOps Artifacts feed has upstream sources enabled, and
all of those packages exist in the Azure DevOps Artifacts feed or in one of its upstream sources.
If upstream sources are not enabled, or the packages do not exist in the feed or its upstream sources, you will get the "cannot find the packages in the package source" error.
If you have confirmed both points and still have this issue, please share the error log in your question and I will check it.
Hope this helps.
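To make sure restore actually goes through the Artifacts feed (and therefore saves upstream packages into it), it can help to make the feed the only source in a nuget.config next to the solution. A minimal sketch, with a hypothetical organization name "contoso" and feed name "MyFeed":

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Drop any inherited sources so only the Artifacts feed is used. -->
    <clear />
    <!-- Organization and feed name below are illustrative. -->
    <add key="MyFeed"
         value="https://pkgs.dev.azure.com/contoso/_packaging/MyFeed/nuget/v3/index.json" />
  </packageSources>
</configuration>
```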

Azure Build Pipeline: Building solution dependencies fails - can't find packages but building csproj dependency files independently works fine

I have a solution with a bunch of different APIs in it which reference dependencies we built in a utilities folder.
In Azure DevOps, when I go to build the solution, it complains that it can't find the dependencies of those dependencies and fails. Those dependencies point to the same packages the solution uses, which live a few folders up; so the dependency HintPath is ../../packages as opposed to the solution's dependency HintPath of ./packages.
When I build the csproj files independent of the solution (i.e. not building them as a dependency of the solution) they build just fine and find the packages folder with no issue. Nuget Restore reports no issues and we have checked in the packages to source control so there should not be anything missing. Even if there was, Nuget would restore them if they were missing.
So I'm thinking: When I build the solution and it tries to build the dependencies down below, it's trying to access the packages starting at the solution directory and then going up from there (master/../../packages/ instead of utilities/../../packages/). That folder does not exist which would cause it to fail and makes sense as to why the individual builds of the dependencies works.
Has anyone come across this issue before in Azure Dev Ops or has any idea how to fix this?
We could edit the hint paths to point to ./ for the dependencies which has worked but we have a local build we still use for production that doesn't function like that and would break. We're trying to keep these in sync.
Please let me know if you have any ideas.
Thanks
To resolve this issue, you could try installing those transitive dependencies at the solution level. That is because the NuGet team deprecated solution-level packages in NuGet 3.0:
https://github.com/NuGet/Home/issues/522
So you need to make sure that all of the packages used by the solution are in the default packages folder ./packages rather than scattered across different packages folders.
If the above does not help, please share your project structure in your question.
Hope this helps.
The solution that worked for me was changing the way NuGet packages are stored in the solution. Instead of using packages.config, there is an option in Visual Studio to use PackageReference, which is the newer way of referencing NuGet packages. After switching most of our projects to PackageReference, the solution was able to restore and build.
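For reference, the PackageReference style mentioned above replaces packages.config entries with items inside the .csproj itself; a minimal sketch, with an illustrative package name and version:

```xml
<!-- Inside the .csproj; the package and version below are examples only. -->
<ItemGroup>
  <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
</ItemGroup>
```

With PackageReference, packages are restored into the global package cache rather than a per-solution packages folder, so there are no relative HintPath entries of the kind that broke the solution build here.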

TFS check-in error: Could not find a part of the path

Our team works on a project with TFS as source control. Sometimes when I want to check in, errors happen.
D:\CustomManager.1.0.7184.35750\lib\net461\CustomManager.dll: Could not find a part of the path 'D:\CustomManager.1.0.7184.35750\lib\net461\CustomManager.dll'.
I have to go to the packages folder, create a new folder, copy the previous version of the package into it, and then rename it to resolve the problem. This is a little annoying because sometimes, after that, new errors show up with different versions.
Additional information: This error will only be shown when I update the NuGet packages.
Is there a simple way to fix this?
It seems you checked the libraries (DLLs) directly into TFS and manage version control of them there.
That is not a recommended approach; it has multiple downsides, such as it never being exactly clear which projects are using which versions of which assemblies. It's a maintenance nightmare.
I suggest using NuGet to handle these libraries in TFS: compile your code, package it as a NuGet package, and publish it. Individual projects can then upgrade their NuGet references when appropriate, or stick with older versions if they need to. If you need to reference a known-good, stable version, you just configure your project to pull that specific version from NuGet.
TFS includes Package Management, which hosts NuGet, npm, and Maven packages alongside all your other TFS assets (source code, builds, releases, etc.) and can also handle external packages.
You can add external packages to a TFS Package Management feed. When you restore, select that feed and all required packages will be restored. To publish, use the Push NuGet packages step to specify the packages you want to publish and the target feed location.
For more details, refer to Get started with NuGet Package Management in TFS.
Update:
The build should not keep looking for old packages if the project already references the latest DLL; please double-check that part.
In your situation, if you do want to keep the DLLs in source control, add them to the solution/project and use relative paths; otherwise the TFS server may not find the path.
For the cache issue, try clearing the TFS cache, restarting Visual Studio, and checking in again; that may do the trick.

Build Process Design: Nuget vs Artifact Dependencies

I have an application A that depends on an internal shared library B. A and B each has their own repositories.
I'm using TeamCity 10 to build these two projects. There are two ways I'm considering doing this:
Build B and publish the dlls as artifacts. Build A with an artifact dependency on B.
Build B and publish as a nuget package. Build A with a nuget dependency on B.
My questions are:
Which approach is better?
Artifact dependencies seem like the simplest approach and can get the job done, but if we go the NuGet route, we can generalize dependency resolution and decouple it from the build server. Another advantage I can see with NuGet is that when developers check out the solution for B, any dependencies can be resolved through NuGet; with artifact dependencies, some kind of pre-build script on the developer's machine is needed to retrieve/copy the DLL artifacts, mimicking what TeamCity does with artifact dependencies on the build server (is there a better approach for this?).
If we implement nuget dependencies, why would we ever need artifact dependencies at all?
Thanks in advance for any feedback offered.
The best approach depends on how far along the projects are in development.
If project A and project B are still in active development, I suggest artifact dependencies. Since you will change the projects very often while developing them, in TeamCity you just add an artifact dependency to the build configuration; no matter how the code changes, the build configuration does not need to change.
Once development has settled down, NuGet dependencies are a good choice. If you use NuGet dependencies while the projects are still changing, every code change means re-packing the package and reinstalling it into the consuming project.
I would prefer NuGet, for exactly the advantages you already mentioned in your question. It's more convenient and saves time when you want to add the shared library to a project/solution and build locally.
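If you go the NuGet route, consuming machines can point a nuget.config at TeamCity's built-in NuGet feed; a sketch assuming a hypothetical server URL (the exact feed path depends on your TeamCity version, so check your server's NuGet feed settings):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Server address and feed path below are illustrative. -->
    <add key="TeamCity"
         value="https://teamcity.example.com/httpAuth/app/nuget/v1/FeedService.svc/" />
  </packageSources>
</configuration>
```

This removes the need for the pre-build copy script mentioned in the question: developers restore B's package from the feed like any other NuGet dependency.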

NuGet downloads all packages, doesn't update references

I have my project set to Allow NuGet to download missing packages as well as Automatically check for missing packages during build in Visual Studio. In my solution folder there is a packages folder, and it contains everything my project needs. However, the references to those packages in the project are still broken.
I have tried removing the references and re-adding them with NuGet, but NuGet says the item is already in the project (it is in the packages folder), yet the references stay broken and the project can't build. The only workaround I have found is to manually go into each package in the packages folder and select every .dll.
Is there a better way to do this?
Open package manager console and type:
Update-Package -Reinstall
This should refresh all your references based on each project's packages.config file.
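If only one project or one package is affected, the reinstall can be scoped with documented Update-Package parameters; the project and package names below are illustrative:

```powershell
# Reinstall every package in a single project.
Update-Package -Reinstall -ProjectName MyApi

# Reinstall one package across the whole solution.
Update-Package Newtonsoft.Json -Reinstall
```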
