Get NuGet-installed DLLs to travel to another TFS workspace - visual-studio

I am working on projects in a TFS environment. The work is distributed across several machines and developers. The person who installs a dependency with NuGet into a project gets the actual DLLs on their machine. When another user on another machine does a Get Latest Version, they get the project with the dependency reference, but not the actual DLLs, so the dependency is marked with an exclamation mark.
We manually add the DLLs to source control, and on the other machine we have to run Get Latest Version from the Source Control window, not from Solution Explorer, in order to get the dependency DLLs.
Is there a way to make TFS store the DLLs automatically, to facilitate the process described above?

You can get the dependencies at compile time. Take a look at this link
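A minimal sketch of what restoring at build time can look like, assuming a pre-build restore step (the solution name is only a placeholder):

    nuget restore MySolution.sln

With packages restored on each machine (or on the build server) like this, the DLLs themselves never need to be checked into TFS.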

Related

TFS check-in error: Could not find a part of the path

Our team works on a project with TFS as source control. Sometimes when I try to check in, errors like this happen:
D:\CustomManager.1.0.7184.35750\lib\net461\CustomManager.dll: Could not find a part of the path 'D:\CustomManager.1.0.7184.35750\lib\net461\CustomManager.dll'.
To work around it, I have to go to the packages folder, create a new folder, copy the previous version of the package into it, and rename it. This is a little annoying because sometimes new errors then show up with different versions.
Additional information: This error will only be shown when I update the NuGet packages.
Is there a simple way to fix this?
It seems you checked the libraries (DLLs) directly into TFS and are managing their versions there.
That's not a recommended approach; it has multiple downsides, such as it never being exactly clear which projects are using which versions of which assemblies. It's a maintenance nightmare.
I suggest you use NuGet to handle these libraries in TFS: compile your code, package it as a NuGet package, and publish it. Multiple projects can then upgrade their NuGet references when appropriate, or stick with older versions if they need to. If a project needs to reference a known-good, stable version, you just make sure it is configured to pull that specific version from NuGet.
TFS provides Package Management, which hosts NuGet, npm, and Maven packages alongside all your other TFS assets (source code, builds, releases, etc.) and can also handle external packages.
You can add external packages to a TFS Package Management feed. When you restore packages, select that feed and all needed packages will be restored. To achieve this, use the Push NuGet packages step to specify the packages you want to publish and the target feed location.
For more details, refer to Get started with NuGet Package Management in TFS.
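A rough sketch of what pushing a package to a TFS feed can look like from the command line (package name/version and the feed URL are placeholders; a build definition would use the equivalent push step instead):

    nuget pack CustomManager.csproj -Properties Configuration=Release
    nuget push CustomManager.1.0.0.nupkg -Source http://your-tfs:8080/tfs/DefaultCollection/_packaging/MyFeed/nuget/v3/index.json -ApiKey VSTS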
Update:
If it keeps looking for old packages: this should not happen if the project already references the latest DLL, so please double-check that part.
In your situation, if you want to check the DLLs into source control, you should add them to the solution/project and reference them with a relative path; otherwise the TFS server may not find the path.
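For instance, a reference with a relative HintPath (folder layout shown only as an illustration) looks roughly like this in the .csproj:

    <Reference Include="CustomManager">
      <HintPath>..\packages\CustomManager.1.0.7184.35750\lib\net461\CustomManager.dll</HintPath>
    </Reference>

As long as the packages folder is part of the workspace next to the solution, the path resolves the same way on every machine.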
If it is a cache issue, try clearing the TFS cache, restarting Visual Studio, and checking in again; that may do the trick.
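The local TFS cache normally lives under something like %LocalAppData%\Microsoft\Team Foundation\<version>\Cache, so with Visual Studio closed a one-liner along these lines clears it (the version folder is an assumption; adjust it to your client):

    rd /s /q "%LocalAppData%\Microsoft\Team Foundation\5.0\Cache"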

This project references NuGet packages that are missing on this computer (TFS)

I'm using TFS for the first time and attempting a build. I'm getting the error:
This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them.
I realize that there are many similar posts on StackOverflow and I've searched through quite a few of them. What I've gathered is that the two boxes under Package Restore in Package Manager Settings should be checked (but that this is also irrelevant now because they're checked by default). I verified that mine were both checked anyway.
The next piece of advice I considered was deleting the /packages folder from the source-controlled version of my application. There is no packages folder there OR in my local (pre-migration-to-TFS) version of the application. Instead, there's a ../packages/ folder one level up from the application folder. It seems that, at some point, I opted to store the packages for all of my applications in the same folder? If so, where is this setting, and what do I need to change it to, either in my local version or in Source Control Explorer?
Thanks!
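For reference, a shared packages location like that is usually set by a repositoryPath entry in a NuGet.config somewhere above the solution; the snippet below is only an illustration of what to look for:

    <configuration>
      <config>
        <add key="repositoryPath" value="..\packages" />
      </config>
    </configuration>

Removing or adjusting that entry and restoring again makes NuGet fall back to the default packages folder next to the solution.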

Bundling non-Nuget packages in Bamboo

I have been trying to find an "elegant" way to integrate non-NuGet packages with my Bamboo builds.
There's a plethora of stacks on the topic of adding non-NuGet packages into NuGet bundles:
managing non-nuget dlls along with nuget packages
Creating NuGet package with reference to a non-NuGet reference
Trying to add non-.NET libraries to NuGet package
and the list goes on. There's also many a stack about using NuGet in Bamboo and that part works smoothly.
None of these deal with the situation of having an automated build environment, which may be sitting on some other remote server, running Bamboo.
Specifically, I'm trying to automate Xamarin.iOS deployments to HockeyApp.
The steps are:
1. Coding and local testing in VS2015 on Windows and with Mac for iPhoneSimulator
2. Merge into deployment branch and push to Bitbucket server
3. Bamboo picks up the push and kicks off build
4. Build checks out deployment branch
5. Runs NuGet downloads
6. Starts compile for Ad-Hoc/iPhone environment, creating IPA
7. Kicks off the HockeyApp deployment (there's a free addon for that)
Nearly all the steps are in place, except that I have 2 dependencies which the commercial vendor (Syncfusion) has, for unknown reasons, decided to bundle only into their "Studio" product, so my Visual Studio project/solution has to reference them by a location outside of my project directory.
As a result, my Bamboo build fails with DLLs not found, because they would have to come in somewhere between steps 5 and 6 above.
I don't want to copy the binaries and then check/commit/push them into my repository, as that's considered a Bad Thing. My Bamboo Plan already successfully grabs NuGet packages before the actual build without having to drag binaries along.
Simply copying the DLLs onto the Bamboo build machine (i.e. where the remote agent is running) was one idea, but the problem is that the VS/MSBuild project file now has hard-wired directories - so I'd have to install the whole Syncfusion Studio, or emulate their directory structure, just for those 2 DLLs.
So I would need to adjust the .csproj references in an automated fashion. I'm not sure how I would do that, except with Yet Another Task and Script.
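To illustrate the kind of change I mean: the reference could be routed through an MSBuild property that the build agent overrides. The property name, assembly, and paths below are just a sketch, not my actual project file:

    <PropertyGroup>
      <SyncfusionLibDir Condition="'$(SyncfusionLibDir)' == ''">C:\Program Files (x86)\Syncfusion\Essential Studio\lib</SyncfusionLibDir>
    </PropertyGroup>
    <Reference Include="Syncfusion.SfChart.XForms">
      <HintPath>$(SyncfusionLibDir)\Syncfusion.SfChart.XForms.dll</HintPath>
    </Reference>

The Bamboo MSBuild task could then pass /p:SyncfusionLibDir=... to point at wherever the two DLLs live on the agent.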
Apart from yelling at Syncfusion (which I've already done) about making all their DLLs available through NuGet (because some are, and those I'm successfully receiving in step 5. above), does anybody have a suggestion how to get this missing step to work?
For expediency's sake, I have now added the libraries to the repository in a separate sub-directory.
It's not how I wanted to do it, but the 2 libraries are a mere 200-300 KB each, and since there just didn't seem to be a simpler solution, this solved the issue for now.
Specifically:
Leave *.dll in .gitignore
Copy libraries you need into local sub-directory, e.g. LocalLibs/
Add the specific libraries with git add -f LocalLibs/speciallib.dll so that only these become part of the repo
Change the project reference in Visual Studio to point to the local libraries instead of their main install location (see the sketch after this list)
Verify that builds still work from within Visual Studio but also with MSBuild
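The resulting reference in the .csproj then looks roughly like this (the Include name is just the assembly's name; LocalLibs/ is the sub-directory from the steps above):

    <Reference Include="speciallib">
      <HintPath>LocalLibs\speciallib.dll</HintPath>
    </Reference>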
I may revisit this and update if a better way comes along, especially if the libraries are significantly larger, such that you definitely would not want to add them to your repository.

'Sharing' class libraries in Visual Studio Online source control

We are currently migrating our source control to Visual Studio Online. A feature we had in our old system (SourceGear Vault) was to share projects between solutions. Although this created a folder for our Framework project in each solution, it kept it up to date when changes were checked in.
This is useful to us as it allows us to work on the Framework code in all the solutions that are using it. I know it's better practice to just compile the DLLs and reference them, but at this point in development we want to keep full code access and debugging in all the solutions using this core framework.
Any help very much appreciated.
You have a few equally valid options for handling shared projects:
Reference the same project from each solution that needs it.
This gives you full control over the source code of the shared project while you work on the consuming solution, and may allow for easier debugging.
The downside here is that maintenance and releases may become trickier if Solution A is being released on Thursday, but Solution B is being released in 3 months and is in the middle of a huge refactoring cycle that has significantly modified Shared Component X, and Shared Component X isn't stable enough to be released.
Reference shared components from an internal NuGet repo.
You set up your release pipeline to push the shared components into NuGet as part of your release process (ideally, using a purpose-built release management tool... Microsoft Release Management is what I have in mind here) -- you check the code in, project gets built. Release process packages it up and pushes it into NuGet as a "prerelease" version. You reference the latest version in anything that needs the latest version.
If you need to reference a known-good, stable version, you just make sure your project is configured to pull a specific version from NuGet.
When you're done, you've tested the shared thing, and you know everything is good, you approve the prerelease version, and the same binaries are repackaged into a "stable" version.
The downside here is that there are some additional software requirements, configuration, and training for your team. This would be my recommended approach.
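A rough sketch of that NuGet-based flow, with placeholder names, versions, and feed URL:

    nuget push SharedComponentX.2.0.0-beta1.nupkg -Source https://youraccount.pkgs.visualstudio.com/_packaging/SharedFeed/nuget/v3/index.json -ApiKey VSTS

A consuming project that needs the known-good version simply pins it in its packages.config:

    <package id="SharedComponentX" version="1.4.2" />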
Check binaries into source control.
I don't recommend this one -- you end up bloating your source control repo (and if you're using Git, it's an explicitly stated anti-pattern -- never put binaries into Git, it causes long-term severe performance problems), and it's never exactly clear which projects are using which versions of which assemblies. It's a maintenance nightmare.
(1) is the best approach if you're releasing everything in lockstep and don't have to worry about maintaining separate versions.
(2) is the best approach if #1 is false.
(3) is the best approach if #1 is false and you're a time traveler who is posting from 2006.
Have you considered implementing Symbol Server and Source Server indexing with the checked-in binaries or NuGet repo approach? This allows you to easily get back to the source while debugging, and it comes from a single known location. Visual Studio Online and Team Foundation Server have built-in support for getting this set up during your build process. There's more information in this write-up: Source Server and Symbol Server Support in Team Foundation Server and Visual Studio Online
Thanks for the responses. We actually found a solution that works well for us: we branch our Framework project into the implementation projects when we want access to the code base. If not, we just use the DLLs.
If the branched code is then altered and checked into the implementation project, it can easily be merged back into the other Framework branches when ready.
This probably wouldn't work well if the Framework code were being developed heavily; as it is, it's only undergoing small additions and tweaks, so we won't be plagued with merge issues.
I have to agree with the majority. I just ran into the same issue. I implemented the NuGet Gallery site on the internal network. It was a pain to implement, but once implemented it's easy to use. I created a class library project that uses ADO.NET and Entity Framework, bundled it into a NuGet package, and uploaded it to the internal NuGet gallery. From there I was able to add a package source pointing at the internal gallery and grab the package that I uploaded. Very simple and convenient.
I set up the NuGet Gallery with Visual Studio 2017. FYI: make sure that building the project isn't part of the Publish, otherwise the site will not render and you'll get a ViewHelper.cshtml error.
I created the projects with Visual Studio 2015. I ran the command prompt as administrator, and I also had to copy the NuGet.exe file into the directory where the project file existed.
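The pack-and-upload part boils down to something like this (project name, version, and the internal gallery URL are placeholders for the setup described above):

    nuget pack MyDataLibrary.csproj -Properties Configuration=Release
    nuget push MyDataLibrary.1.0.0.nupkg -Source http://nuget.internal.example.com/api/v2/package -ApiKey <your-gallery-key>

After that, the internal gallery is added as a package source in Visual Studio (Tools > Options > NuGet Package Manager > Package Sources) and packages can be installed from it like any other feed.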
Check out the below links for more information.
NuGet Gallery
Hosting NuGet Gallery
Create and Publish Package
Creating a Package
Create .Net Standard Package

Adding Visual Studio Project references to SVN

I checked in a project to SVN with about 15 references from one dev box, then checked out the same project on a second dev box, but most of the reference files are missing. Is it possible to check in the reference files automatically?
Version control will only keep track of the actual files underneath the working folder. If the third party libraries are installed elsewhere on the machine, they will not be included in the source control at all.
You'll have to do one of these:
Ensure that the 3rd party libraries (e.g. NUnit, Enterprise Library) are installed on all required development machines.
Don't install the libraries using the normal installers at all; instead, add the individual DLLs and other resources to source control as vendor branches, then bring them into your project either by branching them into your project location or by adding an svn:externals definition (see the sketch after this list).
Copy the required reference files into your source locations, add them to source control, and reference them from there.
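For the vendor-branch route, an svn:externals definition can look like this (the repository URL and folder name are placeholders):

    svn propset svn:externals "https://svn.example.com/vendor/nunit/current Libs" .
    svn commit -m "Pull vendor libraries in via an external"

After the next svn update, the external working copy appears under Libs/ and the project can reference the DLLs there with relative paths.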
I think it's hard for the Visual Studio SCC tools to determine whether or not these files should be automatically added. If you're using the first scenario Jim T described, you definitely don't want that to happen.
