There are rows in the migrations table in my database, but the corresponding migration files are not in the code itself. Would that mean that the people who built the code before I took it over never pushed the migrations to the Git repo? And if that is true, would I have to create my own versions of the migrations and add them to the repo?
Correct, but I would check the packages you are using as well. Sometimes packages let you run migrations that are never registered in your migrations directory; I always try to publish them to avoid situations like this. However, if you look and still don't see migrations for the missing tables, there are tools out there that can generate migrations from an existing DB. You can take a look at this package, for example: https://github.com/kitloong/laravel-migrations-generator
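As a rough sketch (check the package's README for the exact commands and supported Laravel versions), it usually comes down to something like:

composer require --dev kitloong/laravel-migrations-generator
php artisan migrate:generate

That should write migration files for your existing tables into database/migrations, which you can then commit to the repo.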
Hi everyone. I use Symfony 4.2, follow a database-first approach, and have auto-generated entities. I then need to make some changes to the field definitions in the entities, but I don't want to affect the database structure. Everything works well, but if I try to create a migration, Doctrine includes all of the differences in the migration, and I can find no way to prevent this behaviour. I've tried schema_filter: ~^migration_versions$~ but somehow it doesn't help.
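For reference, this is roughly where I configured the filter (assuming the standard Symfony config layout; the value is exactly what I tried):

# config/packages/doctrine.yaml
doctrine:
    dbal:
        schema_filter: ~^migration_versions$~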
So the questions:
1) Is it normal for an application in production to have column definitions that differ slightly between the database and the entities?
2) How can I tell Doctrine to ignore differences in certain tables when creating migrations? Thanks.
When you run bin/console doctrine:migrations:diff, it will generate a file in your src/Migrations directory. You can edit the generated file and remove whatever you don't want to change before you run bin/console doctrine:migrations:migrate.
I don't suggest doing this on a production server, though, and if you do, you should certainly have a backup of your database first.
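For illustration only (the class name and SQL here are invented, a real diff will contain your own statements, and the exact namespace/base class depends on your doctrine/migrations version), a generated migration looks roughly like this, and you simply delete the addSql() calls you don't want applied:

<?php
// src/Migrations/Version20190101120000.php -- illustrative example

declare(strict_types=1);

namespace DoctrineMigrations;

use Doctrine\DBAL\Schema\Schema;
use Doctrine\Migrations\AbstractMigration;

final class Version20190101120000 extends AbstractMigration
{
    public function up(Schema $schema) : void
    {
        // Keep only the statements you actually want to run;
        // delete the addSql() lines for tables you want to leave alone.
        $this->addSql('ALTER TABLE some_table CHANGE description description VARCHAR(500) NOT NULL');
    }

    public function down(Schema $schema) : void
    {
        $this->addSql('ALTER TABLE some_table CHANGE description description VARCHAR(255) NOT NULL');
    }
}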
We are looking to implement Continuous Integration using CircleCI, but we are not sure how we should proceed with our test database. We have the following alternatives in mind:
Run the migrations from scratch (the problem is that we have a lot of migration files; our first migrations were moving everything from MySQL and PostgreSQL using a legacy database, so it's rather complex).
Recreate the current DB: keep a .sql file that creates our current tables, and then a seeder that fills in the data we need (a rough sketch of how this could look on CircleCI follows below).
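Something roughly like this is what we have in mind for the second alternative (image names, file paths, and commands are placeholders rather than our actual setup, and we assume MySQL and a mysql client in the build image):

# .circleci/config.yml -- rough sketch only
version: 2.1
jobs:
  test:
    docker:
      - image: cimg/php:8.1
      - image: mysql:8.0
        environment:
          MYSQL_ALLOW_EMPTY_PASSWORD: "yes"
          MYSQL_DATABASE: app_test
    steps:
      - checkout
      - run: composer install --no-interaction
      # (a step that waits for MySQL to accept connections would go here)
      - run: mysql -h 127.0.0.1 -u root app_test < database/schema.sql   # create current tables from the .sql file
      - run: php artisan db:seed                                          # fill in the data we need
      - run: vendor/bin/phpunit
workflows:
  build_and_test:
    jobs:
      - test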
But we are not sure which is the best alternative or if we're missing something?
Thank you
I am in a very bad situation here, working on a Laravel 5 project that was previously developed by another developer. At the start, that developer created a couple of tables using the migration generators and added some columns using migrations. After that, he added table columns directly using some SQL GUI. I was given the SQL dump, which I imported to set the project up on my local machine.

Now, when I create a table using php artisan make:migration create_myTableName_table --create="myTableName", the migration is created successfully, but when I run php artisan migrate I get SQLSTATE[42S01]: Base table or view already exists: 1050 Table 'someTable' already exists. I checked the migrations folder and compared it with the current version of someTable, and I can see the columns are different; the same goes for other tables as well.
What is the best way to handle this situation? I want to keep using Laravel migrations so that in the future, if another developer wants to work on this project, he just has to run the migrate command to set up the database, create the tables, and add the columns... Should I re-write all the migrations? Please help. Thanks
Given your situation, I’d put the .sql export in your repository, clear out the old, broken migrations, and create a new one that initially imports the database dump. Then just create migrations as normal going forward.
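A minimal sketch of what that initial migration could look like (the file, class, and dump names are just examples):

<?php
// database/migrations/2015_01_01_000000_import_legacy_schema.php -- illustrative name

use Illuminate\Database\Migrations\Migration;
use Illuminate\Support\Facades\DB;

class ImportLegacySchema extends Migration
{
    public function up()
    {
        // Import the legacy dump that lives in the repo.
        DB::unprepared(file_get_contents(database_path('legacy_dump.sql')));
    }

    public function down()
    {
        // Intentionally empty: rolling back the legacy import is not supported.
    }
}

Give it the earliest timestamp so that, on a fresh database, php artisan migrate imports the dump first and then runs any newer migrations on top of it.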
Quick and dirty?
Delete the current migration files, clear your migrations table, and export the whole DB. Put the SQL dump in the repo and instruct other devs to import it before they run php artisan migrate. With the dump imported and the migrations table truncated, you can create new migrations without any further collisions with the legacy ones.
Feeling ambitious?
Use a package like migrations-generator to generate migrations based on your current DB structure. Then, migrate away.
I have a local instance of a database that I recently created using DbContext.Database.Create(), so the __MigrationHistory table exists with an InitialCreate entry that matches the code at the moment.
Some code-based migrations exist in the Migrations folder, however. These will be run in our development and staging environments to bring those databases in line with the code. I don't need to apply them locally, however, since I created the database using the current code.
I now need to make a change to the model and create the corresponding migration. But when I run Add-Migration TestMigration, I get the following error
Unable to generate an explicit migration because the following explicit migrations are pending:
[201203271113060_AddTableX, 201203290856574_AlterColumnY]
Apply the pending explicit migrations before attempting to generate a new explicit migration.
What should I do in this case? I can't point the Add-Migration tool at another environment, because it's not guaranteed that its schema version matches what I have locally. I want a migration that covers only the changes I've made.
It seems I have a few options but none are ideal:
Delete the other migrations from the Migrations folder, run the Add-Migration command, upgrade the database, then restore the old migrations. This is simple but seems a bit hackish.
Revert to the version of the model in source control that the first migration was applied to, then build this and use it to create the database. Then get the latest version, apply all the migrations, then I'm ready to add my migration. This seems like a lot of effort!
Create the migration manually.
Does anyone have any suggestions about how to manage this?
We are planning to use a variant of your Option #1...
Our Standard Operating Procedure is to generate a SQL script for each migration (using the -script option of update-database), in order to have SQL scripts to be applied to end-user "production" databases by InstallShield (we plan to use EF update-database only for developer databases).
Thus, we have both the Migration .cs files and the corresponding .sql files for all migrations in our Migrations folder.
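For example, in the Package Manager Console (using the migration names from the question; adjust to your own), something like:

Update-Database -Script -SourceMigration: $InitialDatabase -TargetMigration: 201203271113060_AddTableX

writes the SQL for that migration, including its insert into __MigrationHistory, to a script instead of running it against the database.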
So rather than deleting the migrations from the Migrations folder (as you proposed in #1), we use SQL Management Studio to manually apply just the parts of the .sql files that do the inserts into __MigrationHistory.
That brings the __MigrationHistory table of the local database up to date with the changes that are already incorporated into that database.
But it's a kludge, and we're still looking for a better solution.
DadCat
What I've found works best is very simple: don't use DbContext.Database.Create() once you've enabled migrations. If you want to programmatically create a new database, use the migrations API instead.
// Configuration is the migrations Configuration class from your Migrations folder
// (requires: using System.Data.Entity.Migrations;)
var migrator = new DbMigrator(new Configuration());
migrator.Update(); // creates the database if needed and applies all pending migrations
Then you've got the full migration history and adding further migrations works just as expected.
You either need to run "update-database" from the Package Manager Console to push your changes to the database, OR you can delete the pending migration file (201203271113060_AddTableX) from your Migrations folder and then re-run "add-migration" to create a brand-new migration based on your edits.
I have encountered the same problem.
If you run
Update-Database
and then run
Add-Migration YourMigrationName
this solves the problem.
Simply exclude the old migration file from the solution files.
Hey guys,
I need to figure out a way to back up and also migrate our Oracle database from our production schema to the dev schema and the other way around.
We have a bunch of config tables that drive how systems on our platform run, and when setting up new systems or doing maintenance, we need to update those config tables. We want to be able to work on the dev schemas and, after setting up a system/feature, migrate all those configs over to the production schemas.
I thought of writing a procedure where we pass in the ID of the system (from the main table); it would go through all the tables and, using select/nvl(..), insert the row if it doesn't exist and update it if it does.
This code will get very messy and complicated, especially since the whole config schema is very complex, and it might be hard to handle all the keys properly.
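For what it's worth, a sketch of what one of those per-table upserts could look like; a MERGE does the exists-check plus insert/update in a single statement. Schema, table, and column names below are invented for illustration:

-- Illustrative only: schema/table/column names are made up.
CREATE OR REPLACE PROCEDURE copy_system_config (p_system_id IN NUMBER) AS
BEGIN
  MERGE INTO prod_owner.system_config tgt
  USING (SELECT system_id, config_key, config_value
           FROM dev_owner.system_config
          WHERE system_id = p_system_id) src
  ON (tgt.system_id = src.system_id AND tgt.config_key = src.config_key)
  WHEN MATCHED THEN
    UPDATE SET tgt.config_value = src.config_value
  WHEN NOT MATCHED THEN
    INSERT (system_id, config_key, config_value)
    VALUES (src.system_id, src.config_key, src.config_value);
  -- ...one MERGE per config table; committing is left to the caller.
END copy_system_config;
/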
Another option I was looking at was triggers: when setting up a new system, there would be a log of all the statements we ran while setting up/editing the system, and then we would run them against our production schema.
I'm on a co-op term and have only been working with databases for 6 months, so I don't know that much, and any information/advice would be greatly appreciated.
(We use PL/SQL.)
What about using export / import (or datapump) to bring over the config tables?
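Roughly like this, for example (connect strings, directory object, and schema names are placeholders, and you'd copy the .dmp between servers if dev and prod are separate databases):

expdp dev_user/password@devdb TABLES=dev_owner.system_config DIRECTORY=DATA_PUMP_DIR DUMPFILE=config.dmp
impdp prod_user/password@proddb REMAP_SCHEMA=dev_owner:prod_owner DIRECTORY=DATA_PUMP_DIR DUMPFILE=config.dmp TABLE_EXISTS_ACTION=TRUNCATE

TABLE_EXISTS_ACTION controls what happens when the config table already exists on the target (SKIP, APPEND, TRUNCATE, or REPLACE).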
Check out data comparison tools like this
Think TOAD has one built in. I'm sure there are others out there too.
It is common to have tables in a schema that hold what we call "static data", i.e. the users don't change them because they control how the application works.
Each change to config data should not be run ad-hoc in the target environment. Instead, you design and code your DML carefully in one or more scripts, which get tested in a dev environment, checked into change control, and can be re-run in any environment when required.
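As a sketch of what one of those scripts might look like (table, key, and value names are invented), written so it can be re-run safely in any environment:

-- config_change_0001.sql (illustrative)
-- Re-runnable: inserts the row only if it is not already there.
INSERT INTO system_config (system_id, config_key, config_value)
SELECT 42, 'FEATURE_X_ENABLED', 'Y'
  FROM dual
 WHERE NOT EXISTS (SELECT 1
                     FROM system_config
                    WHERE system_id = 42
                      AND config_key = 'FEATURE_X_ENABLED');
COMMIT;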