I have a NuGet package with ~30 MB of content items. It is used in several projects of a solution versioned with SVN. The content items are automatically included in every project, so they end up versioned several times over. They also change frequently, so every commit that updates the NuGet package adds another 60-90 MB to the repository (the old content is deleted and new content appears, so there are no delta patches).
I moved these content files to a solution-level folder (as described in the question "Adding solution-level items in a NuGet package") to prevent duplication and added an svn ignore rule for them.
Is there a good way to automatically deploy a script that copies these solution-level files to the output directory after the build, in every project where the NuGet package is installed?
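What I have in mind is something like a post-build copy target in each project that uses the package; a minimal sketch, assuming the shared content sits in a SolutionContent folder next to the .sln (the folder name is just an example):
<Target Name="CopySharedContent" AfterTargets="Build">
  <ItemGroup>
    <!-- shared files kept at solution level and ignored by svn -->
    <SharedContent Include="$(SolutionDir)SolutionContent\**\*.*" />
  </ItemGroup>
  <!-- copy them into the project's output directory, preserving sub-folders -->
  <Copy SourceFiles="@(SharedContent)"
        DestinationFolder="$(OutDir)%(RecursiveDir)"
        SkipUnchangedFiles="true" />
</Target>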
I'm using TFS for the first time and attempting a build. I'm getting the error:
This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them.
I realize that there are many similar posts on StackOverflow and I've searched through quite a few of them. What I've gathered is that the two boxes under Package Restore in Package Manager Settings should be checked (but that this is also irrelevant now because they're checked by default). I verified that mine were both checked anyway.
The next piece of advice I considered is deleting the /packages folder from the Source Control version of my application. There is no packages folder there OR in my local (pre-migration to TFS) version of the application. Instead, there's a ../packages/ folder one level up from the application folder. It seems that, at some point, I've opted to store the packages for all of my applications in the same folder? If so, where is this setting and what do I need to change it to, either in my local version or in Source Control Explorer?
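For context, the setting that sends packages to a shared folder like that is normally a repositoryPath entry in a NuGet.config somewhere above the solution; a minimal sketch of what such a file looks like (the path here is just an assumption):
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- restore packages one level above the solution folder -->
    <add key="repositoryPath" value="..\packages" />
  </config>
</configuration>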
Thanks!
I have two copies of Visual Studio 2017 open and each window is working with a separate solution. Both solutions are checked into Team Foundation Server source control in separate folders. For example, \Projects\Project1 and \Projects\Project2. There are no files common between these two projects.
When I add a new file into \Projects\Project1 using Windows Explorer, it appears as an entry in both copies of Visual Studio under "excluded changes". If I add the file through Visual Studio, it appears to get added as a file that's referenced by that solution (or project) and appears in the included changes for that project.
How do I configure Visual Studio to not add arbitrary files that exist outside of the solution folder into the detected files list?
In Visual Studio, under Excluded Changes, there is a View Options link with a Show Solution Changes option as well as a filter. However, neither of these options stops the inclusion of files that belong to other projects. It is as if the root of the detected code has been set to the parent folder of both solutions.
Since you are using a local workspace, first please go through the mechanism below for how files are detected:
While edits in a local workspace can be pended implicitly just by
editing the file, adds and deletes still must be explicitly pended.
However, TFS version control’s workspace scanner also detects new
files which are ‘candidates’ for addition, and missing files and
folders which are candidates for deletion. The Team Explorer’s Pending
Changes Page has a link which shows the number of detected adds and
deletes, and provides a link to the Promote Candidate Changes dialog
which can be used to pend ‘real’ adds or deletes on these items.
This is called “promoting” the candidate adds and/or deletes – because
they become real pending changes.
Source Link: Server workspaces vs. local workspaces
In your case, since you are adding the files using Windows Explorer, TFS puts them under excluded changes, and you need to manually promote them to real adds in the pending changes list. TFS also cannot judge which Visual Studio instance an excluded file should belong to, which is why it appears as an entry under "excluded changes" in both copies of Visual Studio.
How do I configure Visual Studio to not add arbitrary files that exist
outside of the solution folder into the detected files list?
There is no way to do this with a local workspace. You could use a server workspace instead, but that comes with many limitations of its own.
Another workaround is to configure multiple projects with multiple TFS workspaces. You could have several branches for different projects under the main or dev branch and create a different workspace for each branch; each branch is a 'project' in itself.
It is as if the root of the detected code has been set to the parent
folder of both solutions.
Correct. It sounds as though you have a single TFVC workspace mapped to a single root folder, in which case all changes anywhere under that folder will be reflected in Team Explorer.
The solution: Use multiple workspaces, one for each project. If you're not familiar with the concept of workspaces in TFVC, there is extensive documentation available on the subject.
Consider this repo/file structure for our solution...
Shared Repo (Checked out to D:/Shared/trunk)
├───Shared1.dll Project
└───Shared2.dll Project
App1 Repo (Checked out to C:/Code/App1/Trunk)
├───App1 Project (Refs Shared1.dll project)
├───App1.dll Project (Refs Shared1.dll and Shared2.dll projects)
└───App1.sln
App2 Repo (Checked out to C:/Code/App2/Trunk)
├───App2 Project (Refs Shared1.dll project)
├───App2a.dll Project (Refs Shared1.dll and Shared2.dll projects)
├───App2b.dll Project (Refs Shared1.dll and App2a.dll projects)
└───App2.sln
To make working with the code easier, we bring the Shared projects directly into the applications' solutions, meaning, for instance, that if you open App1.sln this would be your project tree...
App1.sln
├───Shared1.dll Project
├───Shared2.dll Project
├───App1 Project (Refs Shared1.dll project)
└───App1.dll Project (Refs Shared1.dll and Shared2.dll projects)
As you can see, the two Shared DLLs are from a separate repository but are included in this solution. Visual Studio handles this without any issue, prompting you that you are updating multiple repos when you perform a commit against the solution. That's fine and is exactly what we want.
The issue we're having, however, is with NuGet. From what we understand, the NuGet.config files (and the hierarchy/precedence for reading and applying them) are resolved relative to the solution file, and the projects' NuGet references are updated accordingly. This causes issues in that the references to the NuGet packages in Shared1.dll and Shared2.dll are relative to App1.sln when you're working in App1.sln, meaning that if someone else is working in App2.sln and hasn't checked out their two trunks relative to each other exactly the same way you have, the references break.
Our work-around for this is to always check out all three trunks into the same folder as siblings, then put the packaging folder as another sibling, adding '../packages' in the NuGet.config next to each solution. This ensures the references never break, but forces the location of the checkouts which can be a problem.
C:/Code/
├───Shared Trunk
├───App1 Trunk
├───App2 Trunk
└───packages
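In that layout, the NuGet.config placed next to each solution carries the shared packages path, roughly like this sketch (only the repositoryPath key matters here):
<configuration>
  <config>
    <!-- all three solutions resolve packages into the sibling C:/Code/packages folder -->
    <add key="repositoryPath" value="..\packages" />
  </config>
</configuration>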
However, if we could specify per-project package download locations, we could put the packaging folders relative to the projects themselves meaning it wouldn't matter where you check them out to. They would always find the packages they need. Yes, this means that in our example, there would be duplicate package downloads, but space on disk isn't the issue. Maintenance of the code is.
C:/Code/
├───Shared Trunk
│ └─sharedpackages
├───App1 Trunk
│ └─app1packages
└───App2 Trunk
└─app2packages
Again, what we want is that when App1.sln is open, packages for Shared1.dll and Shared2.dll go into the 'sharedpackages' folder, while packages used by App1 and App1.dll go into 'app1packages'.
So... is this possible? Can you specify different NuGet package download paths per project regardless of which solution they are in?
I'm in the same situation as /u/MarquelV.
From my investigation so far into the options provided by NuGet (at least up to ver. 3.5) for tackling this sort of scenario, I concluded that one has to completely ignore the graphical tools for NuGet inside Visual Studio (at least as far as installing/restoring packages is concerned) and to also disable automatic package restore (Tools -> Options -> NuGet etc.). Then resort to invoking nuget.exe from the command line whenever the need arises to install/restore packages, specifying the folder in which the packages should be placed. This point is important because the graphical interfaces for NuGet in Visual Studio are bent on storing packages in a "global" repository (typically right next to the .sln file of the solution).
In my projects I create a .nuget folder containing nuget.exe inside each and every project, and reference the DLLs accordingly.
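The assembly references in each .csproj then point into a project-local packages folder (restored by the target shown next) via HintPath, along these lines (package name and version are just an example):
<Reference Include="Newtonsoft.Json">
  <!-- resolved from the project-local packages folder restored by the BeforeBuild target -->
  <HintPath>.\packages\Newtonsoft.Json.9.0.1\lib\net45\Newtonsoft.Json.dll</HintPath>
</Reference>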
Last but not least, each and every project needs to restore its packages by invoking nuget.exe from the .csproj, like so:
<Target Name="BeforeBuild">
  <!-- restore this project's packages into a project-local .\packages folder
       using the project-local copy of nuget.exe in .\.nuget -->
  <Exec Command=".\.nuget\nuget.exe restore .\packages.config -PackagesDirectory .\packages" />
</Target>
The thing to take away from all this is that the graphical tools for nuget and the automatic package restoration (Tools -> Options -> Nuget) cannot be relied upon in order to achieve the goals described here.
I recently faced a similar issue. In my case I was combining projects from smaller solutions into a larger one. The projects were still referencing the packages in their subfolders, and I did not want to change those references and break the smaller solutions. I was able to solve it by symlinking the project packages path to the solution-level path.
mklink /J .\packages ..\packages
This effectively tricks the project into thinking it is using a more local copy of the packages when it is actually using the one from the larger solution.
It's not exactly the same situation, but close enough that I hope it can help someone.
I have a project with a few front-end frameworks obtained via and managed by Nuget (Twitter Bootstrap, jQuery, jQuery UI ...).
I want to keep the files in my project, but remove them from Nuget's grip (I don't like the way Nuget organizes these files).
When I uncheck the project for a library, NuGet removes all the files it had installed, unless I've edited them (e.g. I overwrote bootstrap.css with a customized version from getbootstrap.com).
As I do this from time to time, instead of backing up the /Content and /Scripts directories and adding the relevant files back in after removing the library from NuGet, I'd like to be able to dissociate all files of a particular library from NuGet at once without removing them from the project's directories. Is that possible, by either the GUI or the console?
I don't know if this issue is specific to particular versions of Visual Studio, but mine is VS 2012.
Can you elaborate on the part about "I don't like the way NuGet organizes these files"? If you remove NuGet from the picture, the good things that it does for you (detection of package updates, automatic package restore, etc.) will be gone.
If you absolutely need to do this, one possible hack would be deleting packages.config from the project.
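If the goal is to detach only a single library rather than everything, a softer variant of the same hack is removing just that package's entry from packages.config by hand, so NuGet no longer considers it installed while the files stay in the project. A sketch (package ids are only examples):
<packages>
  <!-- deleting this line detaches bootstrap from NuGet; its files stay on disk and in the project -->
  <package id="bootstrap" version="3.0.0" targetFramework="net45" />
  <package id="jQuery" version="1.9.1" targetFramework="net45" />
</packages>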
I've got a packages.config file checked into source control. This specifies the exact version of the Nuget dependency I want. We have our own NuGet repository. We are creating these NuGet packages ourselves.
<packages>
  <package id="Dome" version="1.0.0.19" targetFramework="net45" />
  <package id="Dome.Dojo" version="1.0.0.19" targetFramework="net45" />
</packages>
These packages have some JavaScript files which, when you add the NuGet package as a reference in Visual Studio, are copied to the Scripts folder in the project.
I don't want to check these JS files in to source control, I just want to check in the packages.config file.
When my project builds in Team City (or when I build in Visual Studio after a fresh checkout) it doesn't copy the JS files from the NuGet package. There's a question here explaining a similar problem:
NuGet package files not being copied to project content during build
But, the solution in the answer to that question doesn't work for me; that solution uses ReInstall, which is problematic because it can automatically upgrade the version in the packages.config file (say if a dependency is specified as a >=).
The whole point of this is that I want to be able to checkout a revision from my source control, and build that version with the right dependencies AND I want to use the nice packaging features of NuGet. So, I don't want any "automatically update to the latest version during the build."
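As an aside, packages.config can pin a package to an exact version with the allowedVersions attribute, which constrains what Update-Package or a reinstall is allowed to pick; a sketch using the packages from above:
<packages>
  <!-- [1.0.0.19] restricts the allowed range to exactly this version -->
  <package id="Dome" version="1.0.0.19" targetFramework="net45" allowedVersions="[1.0.0.19]" />
  <package id="Dome.Dojo" version="1.0.0.19" targetFramework="net45" allowedVersions="[1.0.0.19]" />
</packages>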
There's an issue against NuGet (http://nuget.codeplex.com/workitem/2094) about NuGet not restoring content files, and it's marked as Closed By Design.
Thinking about how this works a little more, it appears to me (but I'm not 100% sure) that NuGet behaves differently for assemblies - it doesn't copy them into the project; instead it references them from their location in the packages folder. It strikes me that the js files in the NuGet package should be referenced analogously to how DLLs are referenced.
Is there a way to construct a NuGet package so that it references the JS as links in the project (in a similar way to how you can add an existing File as a Link in VS)? And would this solve my problem?
If not, then I'll take the advice given by Jeff Handley when closing the NuGet issue 2094 mentioned above:
The option you'd have is to create a new console executable that
references NuGet.Core, and you could build a supplemental package
restore for your own use that copies package contents into the
project.
Writing my own command line tool to copy the contents does seem like I'm pushing water uphill here - am I doing something fundamentally wrong?
The underlying problem here is Visual Studio's relatively poor support for JavaScript projects and JavaScript's lack of a built-in module loader.
For C#, when you install a package it adds a reference in your .csproj file to the assembly on disk. When you build, MSBuild knows to copy the thing referenced to the bin directory. Since you aren't checking in your bin directory, this all works great.
Unfortunately for JavaScript, the build system isn't nearly as mature and there aren't well-defined guidelines for NuGet to follow. Ideally (IMO), Visual Studio would not run web sites directly from your source directory. Instead, when you built, it would copy the JavaScript, CSS and HTML files to a bin directory and execute them from there. When debugging, it would map those back to the original JavaScript or TypeScript files (so a change you make isn't made to a transient file). If that were to happen, there would be a well-defined build step and presumably a well-defined tag for JavaScript files (rather than just "content"). NuGet would then be able to leverage that well-defined MSBuild tag, and package authors could leverage the NuGet feature to do the right thing.
Unfortunately, none of the above is true. JavaScript files are run in place. If you did copy them to bin on build, Visual Studio would do the wrong thing and editing from the debugger would edit the transient files (not the originals). NuGet therefore has no well-defined place to put files, so it leaves the decision up to the package author. Package authors know that the average user is just going to run directly from source (no build step), so they dump files into the source folder, where they must be checked in to version control.
The entire system is very archaic if you are coming from a modern ecosystem like C# where someone took time to think these things through a bit.
What you could do is create an MSBuild task that, before build, would go through all of your packages, look for content, and copy that content to the desired location. This wouldn't be terribly difficult, though it would take a bit of work.
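A rough sketch of the idea, assuming packages are restored to a packages folder next to the solution; to keep the wildcards simple it handles a single hypothetical package, whereas a real task would enumerate every package listed in packages.config:
<Target Name="CopyPackageContent" BeforeTargets="Build">
  <ItemGroup>
    <!-- content files shipped inside the (example) jQuery package -->
    <JQueryContent Include="$(SolutionDir)packages\jQuery.1.9.1\content\**\*.*" />
  </ItemGroup>
  <!-- recreate the package's content layout (e.g. Scripts\...) inside the project folder -->
  <Copy SourceFiles="@(JQueryContent)"
        DestinationFolder="$(ProjectDir)%(RecursiveDir)"
        SkipUnchangedFiles="true" />
</Target>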
Also, package authors could include a build task that does this in their package, so that before build all of their content is copied locally. Unfortunately, if only some package authors do this, then you end up with weird fragmentation where some packages need to be committed to version control and others do not.
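For package authors, one way to ship such a step is NuGet's build-folder convention (NuGet 2.5+), where a targets file named after the package and placed under build\ gets imported into every consuming project; a sketch of the relevant .nuspec fragment (file names are hypothetical):
<files>
  <!-- build\<package id>.targets is imported into each project that installs the package -->
  <file src="build\MyPackage.targets" target="build" />
  <!-- the JavaScript files themselves still ship under content\ -->
  <file src="Scripts\**\*.js" target="content\Scripts" />
</files>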
When a package is installed into a project, NuGet in fact performs these operations:
1. Download the package file from the source;
2. Install the package into the so-called packages folder, which is $(SolutionDir)\packages by default;
3. Install the package into the project, which consists of adding references to DLLs, copying content files into the project directory, etc.
When a package is restored, only the first two steps are executed. Projects are not touched by NuGet package restore, which is why the js files in your project will not be "restored".
The only solution for now is to check in the js files in your project.
If you are the owner of the package, then you could use the NuGet package I've created: it lets you have a folder called "Linked" in the package, plus simple one-liner Install.ps1 and Uninstall.ps1 scripts that add every file in the package's Linked folder to the project as existing (linked) items.
https://github.com/baseclass/Contrib.Nuget#baseclasscontribnugetlinked
I haven't tried out how publishing treats linked files; the problem is debugging the project, as the JavaScript files will be missing from the directories.
If you are using Git as source control, you could try my NuGet package, which ignores all the NuGet content files and automatically restores them before building.
Step by step example in my blog: http://www.baseclass.ch/blog/Lists/Beitraege/Post.aspx?ID=9&mobile=0