Nuget package backup - visual-studio

It has been common practice to check the NuGet packages used in a solution into source control. With the new package restore feature of NuGet 1.6 it is no longer necessary to check packages into source control. However, this leaves your projects dependent on nuget.org. There may come a time when a needed package is not available on nuget.org and not available locally in your organization, in which case you would not be able to build your project.
Are there any enterprise solutions for backing up the NuGet packages used in projects in a centralized fashion? One scenario is to have an enterprise NuGet proxy server from which projects get their NuGet packages. This proxy server could back up the requested packages in some fashion, such as storing them in a backed-up folder and checking the content into a shared source control repository. Another scenario is to have the backup logic run automatically on each developer's machine.
In summary, what are some good automated options for backing up NuGet packages?

You should look at Artifactory from JFrog (http://www.jfrog.com/): they recently added support for NuGet. Artifactory can act as a central cache for multiple NuGet feeds. I spoke to the development team at a conference earlier this year, and they're really switched on.
There is also ProGet, although I have not used this: http://inedo.com/proget/overview

I don't know of any existing backup solutions, but this is definitely something we'd like to solve at some point. A couple of ideas come to mind.
Use Dropbox to back up the packages directory to another location (see the sketch after this list).
Run your own instance of NuGet.org locally and have your CI server populate the local one with installed packages.
Use MyGet.org to host a private feed of all the packages your team has installed.
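For the packages-directory idea, a minimal sketch of the copy-and-share approach (the UNC path \\fileserver\nuget-backup and the solution-level packages folder are assumptions; the commands assume PowerShell with nuget.exe on the PATH):

    # Gather every restored .nupkg from the solution's packages folder into a
    # backed-up share that can double as a local NuGet feed.
    Get-ChildItem .\packages -Recurse -Filter *.nupkg |
        Copy-Item -Destination \\fileserver\nuget-backup

    # Register the backup share as an additional package source on dev machines or CI agents.
    nuget sources add -Name "backup" -Source \\fileserver\nuget-backup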

Related

What's the best mechanism for deploying libraries (DLLs) centrally, so a dev team can just add a reference to them?

Just curious, per my title above.
I am leaning towards a NuGet-package-like approach, where the dev team can continuously update libraries (DLLs) and deploy them to a central location, e.g. NuGet. Then have projects reference the DLL depending on the version that's been deployed to that central location.
Deploying the DLLs to a shared folder won't work.
Please advise, thanks.
Actually, NuGet is the best way.
Pack
You can work with your teammates to maintain the same NuGet project. When you create a NuGet package, or make changes to it, just set a new NuGet version and pack it into a NuGet package (.nupkg). The pack tooling differs by project type: non-SDK projects use nuget.exe, new SDK-style projects use the dotnet CLI, and msbuild.exe can pack both (but note that for non-SDK projects it only works when they use PackageReference).
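For example, a pack sketch for each project style (the project name, version, and output folder are placeholders, not from the question):

    # SDK-style project: pack with the dotnet CLI, setting the package version explicitly.
    dotnet pack MyCompany.Utilities.csproj -c Release -p:PackageVersion=1.2.0 -o .\nupkgs

    # Non-SDK project: pack with nuget.exe against the .csproj (or a .nuspec).
    nuget pack MyCompany.Utilities.csproj -Properties Configuration=Release -Version 1.2.0 -OutputDirectory .\nupkgs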
Push
When you have finished, push the NuGet package to a shared folder, or push it to an Azure DevOps feed.
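A sketch of both options (the share path, feed URL, package name, and API key value are placeholders; Azure DevOps normally authenticates through a credential provider, so the key just needs to be non-empty):

    # Push to a shared-folder feed.
    dotnet nuget push .\nupkgs\MyCompany.Utilities.1.2.0.nupkg --source \\fileserver\nuget-feed

    # Push to an Azure DevOps Artifacts feed.
    dotnet nuget push .\nupkgs\MyCompany.Utilities.1.2.0.nupkg `
        --source "https://pkgs.dev.azure.com/yourorg/_packaging/yourfeed/nuget/v3/index.json" `
        --api-key az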
Use
You just send your teammates the full path of the shared folder or the feed URL (and make sure they have sufficient permissions).
Anyone who wants to use the package should add that location to their NuGet package sources; they can then install it.
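On the consuming side, something like this (source name, path, and package are again placeholders):

    # Register the shared folder or feed URL as a package source...
    nuget sources add -Name "team-feed" -Source \\fileserver\nuget-feed

    # ...then install the package into a project.
    dotnet add package MyCompany.Utilities --version 1.2.0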
For subsequent updates, as the NuGet package maintainer you modify the library project, set a new version for it, and add release notes during the pack step so others know a new version is available.
Then push the new version to the feed so others can install it.

TFS check-in error: Could not find a part of the path

Our team works on a project with TFS as source control. Sometimes when I want to check in, errors like this happen:
D:\CustomManager.1.0.7184.35750\lib\net461\CustomManager.dll: Could not find a part of the path 'D:\CustomManager.1.0.7184.35750\lib\net461\CustomManager.dll'.
To work around it, I have to go to the packages folder, create a new folder, copy the previous version of the package into it, and then rename it. This is annoying, because sometimes new errors then show up for different versions.
Additional information: this error only appears when I update the NuGet packages.
Is there a simple way to fix this?
It seems you checked the libraries (DLLs) directly into TFS and are managing their versions there.
That's not a recommended approach; it has multiple downsides, such as it never being exactly clear which projects are using which versions of which assemblies. It's a maintenance nightmare.
I suggest you use NuGet to handle these libraries in TFS: compile your code, package it as a NuGet package, and publish it. Individual projects can then upgrade their NuGet references when appropriate, or stick with older versions if they need to. If you need to reference a known-good, stable version, just make sure your project is configured to pull that specific version from NuGet.
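For example, once the library is published to a feed, a project can pin the exact version it needs (version taken from the error message above; command shown for the Package Manager Console):

    # Pin the project to a known-good version from the feed instead of checking the DLL into TFS.
    Install-Package CustomManager -Version 1.0.7184.35750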
TFS has Package Management, which hosts NuGet, npm, and Maven packages alongside all your other TFS assets (source code, builds, releases, etc.) and can also handle external packages.
You can add external packages to a TFS Package Management feed. When you restore packages, select that feed and all needed packages will be restored from it. To get the packages into the feed, use the Push NuGet packages task to specify the packages you want to publish and the target feed location.
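From the command line, the equivalent of the Push NuGet packages task looks roughly like this (the feed URL is a placeholder for your collection's feed; with TFS the API key only needs to be a non-empty value because the credential provider handles authentication):

    # Publish an external package into the TFS Package Management feed.
    nuget push CustomManager.1.0.7184.35750.nupkg `
        -Source "https://your-tfs-server/tfs/DefaultCollection/_packaging/YourFeed/nuget/v3/index.json" `
        -ApiKey VSTS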
For more details, please refer to Get started with NuGet Package Management in TFS.
Update:
As for it continuing to look for old packages: that should not happen if the project already references the latest DLL, so please double-check this part.
In your situation, if you want to check the DLL into source control, you should add the DLLs to the solution/project and reference them with a relative path; otherwise the TFS server may not be able to find the path.
If it's a cache issue, try clearing the TFS cache, restarting VS, and checking in again; that may do the trick.

Nuget Packages not available in TeamCity

We are experimenting with using Octopus for CD with TeamCity. We have enabled OctoPack to create the NuGet packages used during deployment. We are also experimenting with building libraries and using the integrated NuGet server.
We were able to do both successfully: deploying to an environment, and using NuGet.Config to install the library package and to restore and build in TC.
Yesterday the CD stopped working: the packages are being built, but the NuGet server is not making them accessible. We reset metadataBuilds per the TC instructions, and we are still not getting new packages in the feed. We did confirm that the packages are still being built.
Any ideas?
A starting point for solving this depends on where the NuGet packages are stored when your solution is built in TC.
If they are left on the TC NuGet feed, then you would want to check whether there are more than 100 packages in the NuGet store (TC artifacts). We have found that once you go over 100 packages, the ones after those 100 do not show up in the feed when Octopus tries to pull from it.
If you are pushing to the native Octopus NuGet store, check that the disk space on that server hasn't filled to the point where it cannot accept any more of them.
The build log in TC should tell you a lot about where and how these packages are being dealt with. They should also show up as build artifacts after a build, which would allow further verification that they are at least being built.
Although it may not be related, the NuGet feed in TC may take a while to pick up new packages after a build finishes, particularly once you have a large number of packages. That may cause Octopus to fail if it is kicked off right afterwards (by a chained build).
What I've found works best is to push deployment packages directly to the Octopus internal NuGet store, and keep shared packages (referenced in other projects) in TC or another NuGet server. (NB: you cannot use Octopus as a NuGet server to retrieve packages.) The push is done as an explicit step in the build that produces the packages.
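A sketch of that explicit push step, assuming the Octopus built-in repository's standard NuGet push endpoint and a placeholder package name and API key:

    # Push the deployment package straight to the Octopus built-in package repository.
    nuget push MyApp.Web.1.0.0.nupkg `
        -Source "https://your-octopus-server/nuget/packages" `
        -ApiKey API-XXXXXXXXXXXXXXXX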

Is it possible to share huge data between projects when the data is installed by Nuget?

I found that NuGet seems to always install packages under the folder of the Visual Studio project.
That is not feasible for me, because the package I'm going to distribute contains a huge amount of data. I don't want to make a copy of it whenever I add the package to a new Visual Studio project.
I want that data to be shared between projects. Since it is shared, if one project removes the package, the data should stay there until I explicitly tell the system to remove it.
Is there any way that can deal with this kind of problem?
I heard that Maven installs packages in a global location, so it doesn't have this problem. What about using Maven to install .NET libraries; is that possible? What would be the potential problems?
To upload and store your .NET artefacts, you'll need a Maven repository manager like Nexus, Artifactory, or Archiva. The good news is that these are capable of storing files of any type.
If you don't fancy converting your build process over to Maven, I'd recommend the following answer on using Apache Ivy with MSBuild. All Maven clients appear to cache their downloads for use across projects (they're basically intelligent downloaders).
The upcoming 2.0 version of Nexus promises integration with NuGet. I'm expecting better .NET support from Maven in the future.

How (and when) do I use TFS with private DLLs that can also be served by NuGet/NuPack?

We have a couple of private "Enterprise Services" DLLs that are used in all our websites for authentication, logging, etc. Since they are private, we also control the versioning and source of these DLLs. Our historic (error-prone) steps after creating File | New Project include
Add the "Enterprise Services" project
Add a reference to above
Edit web.config sections such as Authentication, HttpHandlers, etc...
NuGet will automate the above process
I just came across NuGet (bundled in MVC3), which allows me to download and install VS2010 packages from a privately hosted server and automate the config settings that I previously would have made manually.
Question:
Does it make sense to publish my dll into a private NuGet server?
Will I lose the ability to debug and step into this dll if I need to?
What other things should I consider if the rest of my project is based in TFS?
I agree with marcind: having a private feed makes sense.
My 2 cents are that you don't need to configure a private server: configuring your VS to target a shared folder is enough for distributing the packages, and it is easy to keep up to date with your TFS builds: just create the NuGet package and drop it into the shared folder.
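That drop step can literally be a pack whose output goes straight to the share, something along these lines (project name, version, and share path are assumptions):

    # Pack the library and drop the result directly into the shared-folder feed.
    nuget pack EnterpriseServices.csproj -Version 1.0.0 -OutputDirectory \\fileserver\nuget-feed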
Keep in mind that, with the latest NuGet bits I tested, the client (both the console and the GUI) does not look in other feeds to locate dependencies, so it will complain that it can't resolve them automatically: you'll have to install them by hand.
Yes, it makes sense for you to have a private NuGet feed
I'm not sure about stepping into the DLL, but if you provide PDBs in your NuGet package, as well as the library sources on a share (and then configure VS to know where those sources are), then you should be able to step into the code just like you can today for the .NET Framework itself.
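One way to cover the PDB part is to produce a symbols package when you pack (project name is a placeholder):

    # Produces a .symbols.nupkg next to the regular package, containing the PDBs
    # and source files needed for stepping into the code.
    nuget pack MyCompany.EnterpriseServices.csproj -Symbols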
NuGet was designed to work well with projects that are mapped to source control so hopefully there's nothing else you need.
@Ghidello NuGet will resolve dependencies automatically as long as you aren't using a specific repository (i.e. the package source dropdown in the console is set to All instead of your private repo).
