EF and SQL Server CE with NuGet - visual-studio-2010

While poking around NuGet I noticed a couple of packages for things I have already installed: EF and SQL Server CE. This raised a couple of questions that I have not been able to find any information on.
For the EF package, what does this add that isn't already there in VS, or does it just do all the reference work for you?
For the SQL Server CE package, what does this add that installing it manually doesn't? Or is this just a better way to install it?
It certainly takes a lot to get an environment set up, and I like NuGet for other packages, but I do not want to break anything unless it would ultimately benefit me in the long run. Any comments, answers, or anecdotes would be great.

NuGet is definitely the way forward for all binary references and even project tooling such as NUnit.
I'd recommend bringing in the combined EntityFramework.SqlServerCompact package. This will add the latest binaries for each, hook up the correct provider factory config, and add a WebActivator class in the App_Start folder for connection factory initialisation. The WebActivator works in both ASP.NET MVC and Web Forms but will invariably cause a compile error in a non-Web-based project; this is no big deal.
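For reference, pulling it in is a one-liner in the Package Manager Console (the package ID is the real one; EF and the SQL CE binaries come along as dependencies):

    Install-Package EntityFramework.SqlServerCompact

That single command adds the references and applies the config transforms in one step.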

The other thing it adds is that those assemblies are now bin deployed and part of your solution. For example, if you commit the packages folder as part of your project and someone checks it out without EF or SQLCE installed, the project will still compile and work. They don't need to go hunting for the MSI installers.
Secondly, if you deploy to a server that doesn't have these DLLs in the GAC (most don't), you'll still be OK, because the references will be set to Copy Local = True.
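In the .csproj that looks roughly like the sketch below (the package version in the HintPath is hypothetical); the Private element is what the Properties window surfaces as Copy Local:

    <Reference Include="System.Data.SqlServerCe">
      <!-- resolved from the solution's packages folder, not the GAC -->
      <HintPath>..\packages\SqlServerCompact.4.0.8876.1\lib\System.Data.SqlServerCe.dll</HintPath>
      <!-- Copy Local = True: the DLL is copied to bin on build -->
      <Private>True</Private>
    </Reference>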

Related

project.assets.json not found - TFS Build Server, no internet

We're just in the process of transitioning from VS2013/15 and TFS2013 to VS2017/TFS2017 (on-site TFS, not VSTS), and the first test solution is a .NET Core 1.1 based one (a multi-project web service).
The solution builds fine on the original developer's box, and I've got it out of TFS and it builds fine on mine too. In keeping with our previous methodology, the contents of the packages folder are checked in with the projects, as this makes the packages locally available on the build box (no internet).
Building the solution on the build server is a different story, however, as I get multiple errors of the form...
    ..\obj\project.assets.json' not found. Run a NuGet package restore to generate this file.
I get the errors both when I run the TFS build definition and when I remote to the box and build directly through the VS on the box itself.
This whole project.assets.json not found issue seems to be causing headaches all over. In my case the issue is that I'm trying to resolve it on our TFS 2017 Build Server, which does not and never will have internet access ('cos it's a server!).
All the solutions I've seen thus far suggest running the NuGet restore command, but that can't work since the server cannot reach nuget.org.
This is nothing fancy yet, just a simple TFS 2017 Build definition with a Get sources and a Build solution step. I can't understand how something so simple has become so difficult.
Changing the NuGet Package Restore options makes no difference.
Since the project.assets.json files are generated on the fly in the obj folder, I can't even check them in to reuse. Can anyone please suggest a workaround, at the moment the test project is dead in the water.
Edit: trying the same process with a 4.6.1 web project created with VS2015 had similar results of unresolved references (e.g. System.Web) but didn't raise the same error, probably due to being an older, non-Core project.
You say: "I get the errors both when I run the TFS build definition and when I remote to the box and build directly through the VS on the box itself."
That suggests the issue is not on the TFS build side, since it also fails with a local build through VS on the build agent machine.
Since this is a .NET Core project, you should use "dotnet restore" rather than "nuget restore", for example by using the .NET Core build template (which runs dotnet restore).
If you are using authenticated NuGet feeds, then you can use nuget restore, but you also need to use the NuGet Installer task. See https://github.com/Microsoft/vsts-tasks/issues/3762 for a discussion on that.
The NuGet version should be 4.0 or higher.
Without a dotnet restore or NuGet restore step, a build definition with only Get Sources and Visual Studio Build steps will not be able to build a .NET Core project. If your server does not have internet access, the workaround is to use a local feed.
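A minimal sketch of the offline-feed approach, assuming a file share reachable from the build agent (the share path and solution name are hypothetical). A NuGet.config next to the solution points restore at the local feed only:

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <packageSources>
        <!-- ignore nuget.org entirely; the build box has no internet -->
        <clear />
        <!-- folder feed populated from a dev box that does have internet -->
        <add key="LocalFeed" value="\\buildserver\NuGetPackages" />
      </packageSources>
    </configuration>

Then restore against it explicitly before the build step:

    dotnet restore MySolution.sln --source \\buildserver\NuGetPackages

dotnet restore generates the project.assets.json files in each obj folder without ever touching nuget.org.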

'Sharing' class libraries in Visual Studio Online source control

We are currently migrating our source control to Visual Studio Online. A feature we had in our old system (SourceGear Vault) was sharing projects between solutions. Although this created a folder for our Framework project in each solution, it kept it up to date when changes were checked in.
This is useful to us as it allows us to work on the Framework code in all the solutions that are using it. I know it's better practice to just compile the DLLs and reference them; at this point in development we want to continue having full code access and debugging in all the solutions using this core framework.
Any help very much appreciated.
You have a few equally valid options for handling shared projects:
1. Reference the same project from each solution that needs it.
This gives you full control over the source code of the shared project while you work on the consuming solution, and may allow for easier debugging.
The downside here is that maintenance and releases may become trickier if Solution A is being released on Thursday, but Solution B is being released in 3 months and is in the middle of a huge refactoring cycle that has significantly modified Shared Component X, and Shared Component X isn't stable enough to be released.
2. Reference shared components from an internal NuGet repo.
You set up your release pipeline to push the shared components into NuGet as part of your release process (ideally using a purpose-built release management tool; Microsoft Release Management is what I have in mind here). You check the code in, the project gets built, and the release process packages it up and pushes it into NuGet as a "prerelease" version. You reference the latest version in anything that needs the latest version.
If you need to reference a known-good, stable version, you just make sure your project is configured to pull a specific version from NuGet.
When you're done, you've tested the shared thing, and you know everything is good, you approve the prerelease version and the same binaries are repackaged into a "stable" version.
The downside here is that there are some additional software requirements, configuration, and training for your team. This would be my recommended approach (see the packaging sketch after this list).
3. Check binaries into source control.
I don't recommend this one: you end up bloating your source control repo (and if you're using Git, it's an explicitly stated anti-pattern; putting binaries into Git causes long-term, severe performance problems), and it's never exactly clear which projects are using which versions of which assemblies. It's a maintenance nightmare.
(1) is the best approach if you're releasing everything in lockstep and don't have to worry about maintaining separate versions.
(2) is the best approach if #1 is false.
(3) is the best approach if #1 is false and you're a time traveler who is posting from 2006.
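As a rough sketch of the packaging step in option 2 (the project name, feed URL, and version are hypothetical; the commands assume nuget.exe on the path):

    :: pack the shared component as a prerelease version
    nuget pack SharedComponentX.csproj -Properties Configuration=Release -Version 2.1.0-beta1

    :: push it to the internal feed
    nuget push SharedComponentX.2.1.0-beta1.nupkg -Source https://nuget.internal.example.com/api/v2/package -ApiKey <key>

Consumers that need a known-good build then pin a specific version in packages.config instead of floating to the latest prerelease.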
Have you considered implementing Symbol Server and Source Server indexing with the checked-in binaries or NuGet repo approach? This allows you to easily get back to the source while debugging, and it's coming from a single known location. Visual Studio Online and Team Foundation Server have built-in support for helping you get this set up during your build process. There's more information in this write-up: Source Server and Symbol Server Support in Team Foundation Server and Visual Studio Online
Thanks for the responses. We actually found a solution that works well for us: we branch our Framework project into the implementation projects when we want access to the code base. If not, we just use the DLLs.
If the branch is then altered and checked into the implementation project, it can be merged back with the other Framework branches easily when ready.
This probably wouldn't work well if the Framework code were being developed heavily; as it is, it's only undergoing small additions and tweaks, so it won't be plagued with merge issues.
I have to agree with the majority. I just ran into the same issue and implemented the NuGet Gallery site on the internal network. It was a pain to implement, but once implemented it's easy to use. I created a class library project that uses ADO.NET and Entity Framework, bundled it into a NuGet package, and uploaded it to the internal NuGet Gallery. From there I was able to add a package source pointing at the internal gallery and grab the package that I uploaded. Very simple and convenient.
I set up the NuGet Gallery with Visual Studio 2017. FYI: make sure that building the project isn't part of the Publish, or it will fail to render with a ViewHelper.cshtml error.
I created the packages with Visual Studio 2015, running the command prompt as administrator. I also had to copy the NuGet.exe file into the directory where the project file existed.
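Registering the internal gallery as a package source can also be done from the command line (the URL below is a placeholder for your gallery):

    nuget sources add -Name "InternalGallery" -Source https://nuget.internal.example.com/api/v2

After that, the feed shows up in Visual Studio's package source dropdown alongside nuget.org.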
Check out the below links for more information.
NuGet Gallery
Hosting NuGet Gallery
Create and Publish Package
Creating a Package
Create .Net Standard Package

Update DLL reference

I wanted to update some DLLs used in my .NET project to the latest version and I've noticed that, if I replace the DLLs on the file system with their new versions, VS 2012 updates the DLL version number in the Properties window.
Is this some new feature of VS 2012? I don't remember seeing it in VS 2010 (I expected it would need more manual handling).
Is this working right, or should I remove and re-add the DLLs manually from the references, just to be sure?
Anyway, my project compiles and runs fine, so I guess it works...
EDIT:
I guess it works because the DLLs are not strongly named (http://msdn.microsoft.com/en-us/library/wd40t7ad.aspx)?
Perhaps I should re-add them if they were...
Inside the project file I saw it had the old version number, but in the properties window I saw the new one...
Thanks!
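As a side note on the strong-naming guess in the edit above: if the DLLs were strongly named, a reference compiled against the old version would fail to load the new one at runtime, and you'd typically need a binding redirect in app.config/web.config (the assembly name and token below are made up for illustration):

    <runtime>
      <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
        <dependentAssembly>
          <assemblyIdentity name="Vendor.Library" publicKeyToken="0123456789abcdef" culture="neutral" />
          <!-- redirect every older version to the one now on disk -->
          <bindingRedirect oldVersion="0.0.0.0-2.0.0.0" newVersion="2.0.0.0" />
        </dependentAssembly>
      </assemblyBinding>
    </runtime>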
An easier option to refresh or reload referenced types from a DLL (for example COM interop types) without re-adding all DLLs one by one is to reload every project that references it: right-click the project in Solution Explorer, choose Unload Project, then Reload Project.
It's better to remove and re-add your references. You said it yourself that the project file was not up-to-date.
I am not sure if there is an add-on for VS that could make updating DLLs easier. If there isn't one, definitely someone should make one. Changing assembly references in large projects is a pain in the #ss.
What worked for me: go to Manage NuGet Packages and update all.
Managing references as NuGet packages is significantly easier. You can view which references became out of date and choose which to update.
Downside: if you are not using the standard packages (available via NuGet.org) you have to manage your own NuGet repository.
You can manage your NuGet packages via GUI or console.
(screenshot: the NuGet package manager GUI)
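The console equivalent, for reference (Update-Package is a standard Package Manager Console command; the package name in the second example is just illustrative):

    # update every package in every project of the solution
    Update-Package

    # or update just one package
    Update-Package Newtonsoft.Json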
A better option is to remove the reference to the DLL and reference it again, but for safety's sake make sure you keep a backup of your previous DLL.

Visual Studio 2012 Stale DLL

Intro
I don't know if this is a bug or there is something I'm completely missing.
I have a project (a Windows service), let's call it WINSERV. It depends on three DLLs which come from three separate projects (Projects A, B, and C). A has a dependency on C, and B has a dependency on A and C.
When building the installer with InstallShield LE, for some reason my Project C is always a STALE version which I cannot get to change. I increment the version, "make clean && rebuild", and the installation (on a remote server) always includes the STALE version.
I set each project that has a dependency on C to Copy Local = False, allowing Project WINSERV to maintain the reference itself and copy it across for the installer.
Problem
But again, no matter what I do, when I install it's the stale version. After a few hours I did manage to get around the problem by removing Copy Local on Project WINSERV and adding Project C's Primary Output to the Application Data section of the InstallShield install project. This seems to work!
Question
I tried "MAKE CLEAN" a thousand times. I checked Project C's DLL version before building and creating the installer, and it was always the STALE, old version.
Can anyone explain this? Or is it a bug?
PS. InstallShield LE does not do Windows services, so if you're reading this, don't get caught out.
It's definitely a bug in Visual Studio 2012.
I've managed to replicate in the following fashion:
With a project open that references, let's say, MySQL Connector 5.1.4, I uninstall the MySQL Connector and install 5.1.7.
I close Visual Studio 2012, recompile, and all works well.
However, when I build with InstallShield it still references 5.1.4, despite it being uninstalled. My thought is that it's caching it somewhere and accessing it later during the build process.
How I solved it: after installing a newer/older/different version of an existing DLL, I REBOOT my machine and all is now well.
Hope this helps someone
Sounds like a similar problem to one I had.
For some reason InstallShield sometimes picks up DLLs from the temporary ASP.NET folder (even if your project is not ASP.NET).
Try clearing the folder: C:\Windows\Microsoft.NET\Framework\v4.0.30319\Temporary ASP.NET Files
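A sketch of clearing it from an elevated command prompt (close Visual Studio first; the path assumes the 32-bit .NET 4.x framework, so check Framework64 as well):

    :: delete the cached shadow copies; the folders themselves can stay
    del /s /q "%WINDIR%\Microsoft.NET\Framework\v4.0.30319\Temporary ASP.NET Files\*"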

How (and when) do I use TFS with private DLLs that can also be served by NuGet/NuPack?

We have a couple of private "Enterprise Services" DLLs that are used in all our websites for authentication, logging, etc. Since they are private, we also control the versioning and source of these DLLs. Our historic (error-prone) steps after creating File | New Project include:
Add the "Enterprise Services" project
Add a reference to above
Edit web.config sections such as Authentication, HttpHandlers, etc...
NuGet can automate the above process.
I just came across NuGet (bundled with MVC3), which allows me to download and install VS2010 packages from a privately hosted server and automates the config changes that I previously would have made manually.
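For example, a package can carry a web.config.transform content file whose elements NuGet merges into the consuming project's web.config at install time; the handler below is a made-up stand-in for one of the "Enterprise Services" handlers:

    <configuration>
      <system.web>
        <httpHandlers>
          <!-- merged into the target web.config on Install-Package -->
          <add verb="*" path="logging.axd" type="Enterprise.Logging.Handler, Enterprise.Logging" />
        </httpHandlers>
      </system.web>
    </configuration>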
Question:
Does it make sense to publish my dll into a private NuGet server?
Will I lose the ability to debug and step into this dll if I need to?
What other things should I consider if the rest of my project is based in TFS?
I agree with marcind: having a private feed makes sense.
My 2 cents are that you don't need to configure a private server: configuring your VS to target a shared folder is enough for distributing the packages, and it is easy to update with your TFS builds: just create the NuGet package and drop it into the shared folder.
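A folder feed really is just a directory of .nupkg files registered as a package source; a sketch of the publish step (share path, project, and version are hypothetical):

    :: pack the freshly built project and drop it onto the share (e.g. at the end of a TFS build)
    nuget pack EnterpriseServices.csproj -Properties Configuration=Release
    copy EnterpriseServices.1.0.0.nupkg \\fileserver\NuGetFeed\

Point Visual Studio at \\fileserver\NuGetFeed under Options > Package Manager > Package Sources and the package appears like any other.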
Keep in mind that, with the latest NuGet bits that I tested, the client (both the console and the GUI) does not look into other feeds to locate the dependencies, so it will complain that it can't resolve them automatically: you'll have to install them by hand.
Yes, it makes sense for you to have a private NuGet feed
I'm not sure about stepping into the dll, but if you provide PDBs in your NuGet package as well as the library sources on a share (and then configure VS to know where those sources are) then you should be able to step into the code just like you can today for the .NET framework itself.
NuGet was designed to work well with projects that are mapped to source control so hopefully there's nothing else you need.
@Ghidello NuGet will resolve dependencies automatically as long as you aren't using a specific repository (i.e. the package source dropdown in the console is set to All instead of your private repo).
