dbo.__MigrationHistory table not being updated when publishing from Visual Studio

I am having an issue with the dbo.__MigrationHistory table not being updated with new database migrations, BUT the actual database itself is correctly updated with the relevant changes. This means the site won't load, because EF spots a change in the context that has no matching history entry - and I have to manually insert the new rows into the dbo.__MigrationHistory table to make it load.
When I publish to Azure from Visual Studio everything works fine in my PRODUCTION site. However, I have the issue only on my TEST site (which has a different publish profile).
I have checked the EXECUTE CODE FIRST MIGRATIONS box in the publish profile on both the TEST and PRODUCTION publish profiles. Indeed, both publish profiles appear to be identical except for pushing to a different site.
In case it helps - whenever this happens, my LocalDB SQL Server database also becomes detached during the publish, so on my local PC I then have to go back and reattach the .mdf database file in SQL Server Management Studio.
Any help / advice you can offer would be amazing.
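For reference, the manual fix looks roughly like this (a hedged sketch - the database and migration names are hypothetical, and the row is copied from a database where the migration was recorded, such as the local one; on Azure SQL, where cross-database queries aren't allowed, you would script it as an INSERT ... VALUES instead):

    -- Copy the missing history row from a database where the migration WAS
    -- recorded into the TEST database that received the schema change but
    -- no history entry. All names are hypothetical.
    INSERT INTO [TestDb].dbo.__MigrationHistory
           (MigrationId, ContextKey, Model, ProductVersion)
    SELECT MigrationId, ContextKey, Model, ProductVersion
    FROM   [LocalDb].dbo.__MigrationHistory
    WHERE  MigrationId = '201601011200000_AddNewColumn';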

I've had a number of occasions where Entity Framework migrations have been left in an unusable state. Usually this happens after a large number of migrations have been applied by different developers, and things get stuck to the point where we can't update a database with new migrations or roll back.
It's simply easier to delete the migrations and start with a clean state from the current schema.
If you go the route of resetting your migrations, make sure you back up your code and take known-good backups of your database.
In summary, the steps to do this are:
1. Remove the __MigrationHistory table from the database (see the SQL sketch after these steps)
2. Remove the individual migration files in your project's Migrations folder
3. Run Enable-Migrations in the Package Manager Console
4. Run Add-Migration Initial in the Package Manager Console
5. Comment out the code inside the Up method of the Initial migration
6. Run Update-Database in the Package Manager Console (this changes nothing in the schema, but creates the migration entry)
7. Uncomment the code in the Up method
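For step 1, a minimal T-SQL sketch (the database name is hypothetical), including what to expect in the history table after the re-baseline:

    -- Back up the database first. [MyAppDb] is a hypothetical name.
    USE [MyAppDb];
    GO
    DROP TABLE dbo.__MigrationHistory;
    GO
    -- After Add-Migration Initial and Update-Database (with the Up body
    -- commented out), EF recreates the table with a single baseline row:
    SELECT MigrationId, ContextKey, ProductVersion
    FROM dbo.__MigrationHistory;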
This is not an ideal workaround, but it will resolve your issue.
P.S. This usually occurs when somebody updates the DB from their local machine and it goes out of sync.

Related

Why does Visual Studio deploy two separate SSAS tabular models?

So I've created a 2017 SSAS tabular model in VS, and deployed it to the SSAS tabular workspace server. But for some reason it creates two separate models: one normal, and one with some random characters appended to its name. Worse still, when I deploy changes to the model, it only updates the one with the characters appended.
What is going on here?
Deployment setting: [screenshot]
Two models showing in the workspace server: [screenshot]
The tabular model with your username and a GUID is the workspace database. This is a local copy of the tabular model, holding the changes you've applied to it, used when Integrated Workspace mode is not enabled. The workspace database is kept in memory while it's open in SSDT; depending on the Workspace Retention property, it may then be removed from memory, removed from both memory and disk, or kept in memory. The default setting is to remove it from memory but not from disk, which is why you may not see this database whenever you close the model in SSDT. This property can be accessed from SSDT by highlighting the .bim file and viewing its properties (press F4).
The changes made to the workspace database should be applied to the deployed model when it's deployed using the model name as the target database on the deployment server, as in your screenshot. When you examine the model (the non-workspace database) in SSMS, how do you know the changes are not applied to it, and have you tried refreshing the view from SSMS? This can be done by right-clicking the Databases folder above the tabular models and pressing Refresh. Also, the deployment from SSDT is succeeding without errors, correct?
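If it helps to compare the two databases side by side, one quick way (assuming an MDX query window in SSMS connected to the workspace server) is the catalog DMV:

    -- Lists every tabular database on the instance, making the workspace
    -- copy (ModelName_username_GUID) easy to spot next to the deployed model.
    SELECT [CATALOG_NAME], [DATE_MODIFIED]
    FROM $SYSTEM.DBSCHEMA_CATALOGS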
Since I cannot comment on the accepted answer, I'll make this a separate answer.
What I saw happening was that when you import a cube in Visual Studio (while creating a new project), a new tabular DB is created with the name "CubeName_username_GUID".
Everything you modify gets deployed to the real cube after you close Visual Studio, and at that point the oddly named cube also disappears.
I hope it helps.

Is it possible to update a SSDT DB project from a database?

We have two software projects that both communicate with a single database. Right now SQL updates are all done on the database, and it relies on developers remembering to update both projects independently to use the latest database model. Making matters worse, the two projects are in separate solutions in separate source control repositories.
While I acknowledge this is a terrible situation to be in, I inherited it. My long-term goal is to consolidate the (lots of) duplicated logic between them into one common project shared by both applications, but for various reasons it is not feasible to jump into that right now: critical deadlines are coming up, and the projects need to be combined iteratively, with the work scheduled alongside other developers so as not to disrupt things too much.
Keeping that in mind, I really want to use SSDT to at least start bringing the database structure under source control and make it easier to manage, as there are quite a few database changes that I'm about to do.
The problem with SSDT in this scenario is that you can only import from a database once. After that the option is greyed out and unavailable, which is apparently a design decision of SSDT, since it's explicitly listed in the MSDN documentation.
Is there any easy way to update my SSDT project without nuking the current project and recreating it each time someone makes a change to the database structure?
Firstly, you're right, it is a horrible situation, so work on improving it in the long term!
There are two things you can do. First, you could use the SSMS "Generate Scripts" feature to export all the objects, and then use the import in SSDT to import from those scripts - this isn't greyed out.
The second thing you can do is bring the changes in manually using the schema compare in SSDT: you can set the database as the source and the project as the destination, and choose what to drop, update and import.
Ed
A bit of a late answer: I am using a VS2017 database project, in which I achieved this by comparing a local database with the database project; once the comparison is done, you can apply the changes with the Update button.
Step 1: right-click the database project and click the Schema Compare item.
Step 2: select Target -> Select Database Connection.
Step 3: swap the source and target.
I am going with the compare solution:
Choose schema compare, make your database the source and the database project the target, then compare and update.
See this answer.
Make a new temp Database project (outside of TFS) and import all the objects.
Check out the Database project (inside TFS) and copy and paste all the folders (excluding the bin and obj folders) from the new temp Database project into the Database project (in TFS), then check in. This way you get the latest DB objects into TFS without duplicating them.
If the copy/paste operation brings in new files, those new files need to be included in the DB project.
Delete the temp Database project folder.
You will need to repeat this process whenever you want to update the DB objects in TFS.
This is a workaround, but it worked for me for this file-duplication issue.

TFS Migration Risks

I would like to create a new installation of TFS 2013 on a new server.
I did my research and learned that the migration process as described in the link below carries some risks:
TFS Migration Manual:
https://msdn.microsoft.com/en-us/library/ms404869.aspx
Risks:
http://blogs.msmvps.com/p3net/2014/04/12/tfs-upgrade-nightmares/
I plan to avoid using the TFS migration manual above; instead, I would check out all of my projects (about 20), re-create them on the new TFS, and check them in again.
However, we have work items, users, workspaces and other agile information which I have created for my projects, and which I still need on the new installation.
I was wondering whether the following works (again without risks and hassle, as time is scarce):
Back up the TFS Databases from the old installation, and restore them into the new installation or simply import the data from old to new using SQL Server's Data Import Tool.
I am particularly referring to these databases, which TFS has:
Tfs_Configuration; Tfs_DefaultCollection; Tfs_Warehouse.
I found these databases on the SQL Server instance which TFS uses.
Also, this approach is easier as it doesn't obstruct the team, since the database restoration can occur after hours.
Now, will this plan work?
No, your plan will not work and will leave your TFS in an unsupported state.
You need to follow a combination of the Upgrade and "changing environment" workflow.
1) Restore all TFS databases (tfs_*) to the new environment (see the restore sketch at the end of this answer)
2) Install TFS 2015
3) Configure and select the Upgrade Wizard - when running it, make sure you have all the new server names
4) (optional) ChangeServerID - if this is a practice run, you should then immediately:
4.1) unconfigure the application tier with "tfsconfig.exe setup /uninstall:ALL"
4.2) run the ChangeServerID command
4.3) reconfigure TFS and run the "app tier only" wizard
Simples....
Note: You need to change the server ID if this is a test/practice instance, as each server gets a unique ID. When clients first connect to the new server they will "upgrade/migrate" the user's data across. You don't want that happening for a trial... so change the ID...
WARNING: If you manipulate the data in the TFS server in any way that is not done by the TFS Product Team tools, you will turn your instance to crap. Do not ever edit, or cause to be edited, the data in the operational store.
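For the restore in step 1, a minimal T-SQL sketch (the paths and logical file names are hypothetical; check yours with RESTORE FILELISTONLY):

    -- Repeat for Tfs_DefaultCollection, Tfs_Warehouse and every other
    -- tfs_* database from the old instance.
    RESTORE DATABASE [Tfs_Configuration]
    FROM DISK = N'\\backupshare\Tfs_Configuration.bak'
    WITH MOVE N'Tfs_Configuration' TO N'D:\Data\Tfs_Configuration.mdf',
         MOVE N'Tfs_Configuration_log' TO N'E:\Logs\Tfs_Configuration_log.ldf',
         RECOVERY, STATS = 10;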

Attached SQL Express DB is causing problems

I have been asked to create an MVC web application in VS 2010 and was instructed to use a SQL Express database for my data. I am using EF Code First for creating and managing my data. The database was created in VS2010 and is attached via "AttachDBFilename" in the web.config.
I have used SQL CE with MVC before with no problems; however, the attached SQL Express DB is causing weird issues.
For one thing, when I try to deploy the app, it fails and tells me that it cannot copy the database .mdf file because it is in use by another process. I have NOT opened the database in either VS2010 or SSMS. Of course the program code accesses it - is there some reason that connection would remain open? I am using boilerplate code from the scaffolding.
I should mention that I use a ProjectInitializer.cs to create the sample data. It runs at every launch for the moment, since I am testing quite a bit.
The other problem I have is that if I delete the database, it fails to recreate it. It says that my Windows account does not have access to the (now non-existent) database that it is trying to create. I literally have to create a new database with a new name, as anything created previously (with that DB name) fails.
I assume there is some residual info left somewhere that is out of sync, but I don't know where. I've closed all connections to the file in VS2010 and deleted the files - both any found via VS2010 and any physical files I can see in the App_Data directory.
Any help or suggestions would be appreciated.
Shut down the web server (Cassini, IIS, IIS Express) and try again. The file can remain locked if the web process is still referencing it. In addition, the loaded EF context will retain the DB name. Ensure the Visual Studio development server isn't still running in the system tray either.
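If the .mdf stays locked anyway, or the instance keeps a stale registration for a database you deleted, one way to clear it (a sketch - the database name, instance and path are hypothetical) is to force connections closed and detach, then reattach when needed:

    -- Run in SSMS against the Express instance (e.g. .\SQLEXPRESS).
    USE master;
    GO
    -- Kick off lingering connections, then release the .mdf file.
    ALTER DATABASE [MyMvcAppDb] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    GO
    EXEC sp_detach_db @dbname = N'MyMvcAppDb';
    GO
    -- To reattach later (the server-side counterpart of AttachDBFilename):
    CREATE DATABASE [MyMvcAppDb]
        ON (FILENAME = N'C:\Projects\MyMvcApp\App_Data\MyMvcAppDb.mdf')
        FOR ATTACH;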

Visual Studio Database Project Rollback Script

I'm using the database project in Visual Studio 2010 to generate a script to deploy my database (and its changes). This works great.
Is there a way to have the Visual Studio database project generate a rollback script as well as the deployment script?
I'm not looking to roll back the transaction during deployment; but say I deploy, and my stored procedure has an overlooked performance issue that surfaces a week later and requires a rollback to the previous version of the database.
Is there a way to generate, at build/deploy time, a rollback script that will undo whatever changes the deployment script made?
EDIT: Ignoring the fact that I'm using a database project: what is a good way to generate both an upgrade and a downgrade path for a database?
This generation needs to be part of an automated build process.
To create a rollback script while doing a schema compare in VS2010, it is as simple as swapping the DB names specified as the source and target.
That way VS2010 creates a rollback script containing drop statements against your stored proc.
I've not seen anything like that.
I think you need to reconsider this approach, as you'd still need to fix the stored proc in your database project; otherwise you'd just re-deploy the "bad" version the next time you deploy. (I'm sure you're already aware of that, but it doesn't hurt to point out the obvious sometimes!)
If you need to restore an old version of the sproc to the server in the meantime, I would have thought the easiest thing to do would be to get the previous version from source control and deploy it manually.
You could create a backup of the database before the release and then just restore from the backup if things go wrong. Obviously you'd also lose any data changes (either made as part of the release or subsequently) since the backup was taken.
Another idea I had was to create a snapshot before the release. The operation to create a snapshot is very lightweight. I'm not sure you'd want to keep the snapshot for a week, but if the release went wrong, I think it would be quicker to restore from a snapshot than from a full backup. I would be interested to hear any comments people have on this idea.
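A sketch of both ideas (names and paths are hypothetical; note that reverting to a snapshot requires dropping any other snapshots of that database first, and database snapshots were Enterprise-only before SQL Server 2016 SP1):

    -- The heavier option: a full backup just before the release.
    BACKUP DATABASE [MyDb] TO DISK = N'D:\Backups\MyDb_prerelease.bak' WITH INIT;
    GO
    -- The lightweight option: a snapshot taken just before the release.
    -- NAME must match the logical data file name of the source database.
    CREATE DATABASE MyDb_PreRelease
        ON (NAME = N'MyDb', FILENAME = N'D:\Snapshots\MyDb_PreRelease.ss')
        AS SNAPSHOT OF [MyDb];
    GO
    -- If the release goes wrong, revert schema and data in one step:
    USE master;
    RESTORE DATABASE [MyDb] FROM DATABASE_SNAPSHOT = 'MyDb_PreRelease';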
