Data Migrations In Laravel
Migrations are one of my favorite development tools: they keep the project consistent and allow for a predictable ecosystem across the board. You can read more about them here.
Once an application gets into production, there will be times when data needs to be updated. Wouldn't it be ideal if that could happen during the deployment process? This is where the idea of data migrations comes into play.
During deployment, just as we run migrations for schema changes, we can do the same for production data changes. We can achieve this by creating an additional directory called data_migrations; any data-change migrations will go in this directory.
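As a sketch of the setup, assuming a default Laravel project layout, the directory can be created and wired up using Artisan's `--path` option (supported by both `make:migration` and `migrate`; the migration name below is hypothetical):

```shell
# Create the data migrations directory alongside the standard one
mkdir -p database/data_migrations

# Generate a data migration into that directory (hypothetical name)
php artisan make:migration backfill_user_timezones --path=database/data_migrations

# During deployment, run schema migrations first, then data migrations
php artisan migrate
php artisan migrate --path=database/data_migrations
```

Because Laravel tracks every executed file in its migrations table regardless of path, each data migration still runs exactly once.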
When working with migrations, these are the practices to follow.
Create two migration directories:
database/migrations (comes out of the box with Laravel)
- runs in all environments
- database schema changes
database/data_migrations
- runs only in production
- database production data changes
- use database transactions
database/migrations executes in all environments and manages the schema of your application. You should not edit existing migrations in an active project, and you should avoid adding data inside a migration. It should be strictly for schema changes.
To get data into your application, use seeders for local/staging, and use data_migrations for production data where applicable. For example, you can run seeders the first time, and use data_migrations to fine-tune any additional data for a production-ready application.
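The split above could look like the following minimal seeder sketch for local/staging environments; the table and rows are hypothetical placeholders for your own seed data:

```php
<?php

namespace Database\Seeders;

use Illuminate\Database\Seeder;
use Illuminate\Support\Facades\DB;

// Seeders populate development and staging environments.
// Production-only tweaks belong in data_migrations instead.
class DatabaseSeeder extends Seeder
{
    public function run(): void
    {
        // Hypothetical example rows -- replace with your own seed data.
        DB::table('plans')->insert([
            ['name' => 'free',    'price' => 0],
            ['name' => 'premium', 'price' => 9900],
        ]);
    }
}
```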
data_migrations should be run only in the production environment; the only changes you should make here are strictly production data changes.
During this setup, you should use database transactions if available. That way, in the event of a failure, an exception is thrown and the database is rolled back, so you can safely fix the issue and retry the migration. (If done correctly, the system will not register the migration unless it completes, allowing you to fix the issue and retry using the same migration file.)
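A data migration wrapped in a transaction might look like the sketch below (using Laravel's anonymous migration class style; the table and column names are hypothetical). If anything throws inside the closure, `DB::transaction` rolls the change back and the migration is not recorded as run:

```php
<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Support\Facades\DB;

return new class extends Migration
{
    public function up(): void
    {
        DB::transaction(function () {
            // Hypothetical production data fix: default missing timezones.
            // An exception here rolls back the update, so the migration
            // can be fixed and safely retried.
            DB::table('users')
                ->whereNull('timezone')
                ->update(['timezone' => 'UTC']);
        });
    }

    public function down(): void
    {
        // Data changes are often irreversible; leave empty
        // or restore the previous values where that makes sense.
    }
};
```

Note that transactional rollback of data changes depends on your database engine supporting transactions (e.g. InnoDB on MySQL).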
I would recommend this solution for data changes that you know will complete in a short amount of time, to avoid any significant downtime. If you have to change data in a way that will take a significant amount of time, my suggestion would be to move it to a CLI command, or handle it in a job queue, so that large job executes outside of the deployment window. If you are an experienced DBA, there are other ways to alter large tables, however that's out of the scope of this article.
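As a sketch of the job-queue approach, a long-running change can live in a queued job that processes rows in chunks, with the data migration (or a CLI command) only dispatching it; the job name, table, and columns here are all hypothetical:

```php
<?php

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\DB;

// Hypothetical job: recompute invoice totals outside the deployment window.
class BackfillInvoiceTotals implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public function handle(): void
    {
        // Chunk the work so each batch stays small and fast,
        // instead of locking the table for one giant update.
        DB::table('invoices')->orderBy('id')->chunkById(500, function ($invoices) {
            foreach ($invoices as $invoice) {
                DB::table('invoices')
                    ->where('id', $invoice->id)
                    ->update(['total' => $invoice->subtotal + $invoice->tax]);
            }
        });
    }
}
```

The deployment then only needs to run `BackfillInvoiceTotals::dispatch();`, which returns immediately while the queue worker does the heavy lifting.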
data_migrations can be utilized for more than managing an application's data: you can kick off jobs, run commands, etc., to achieve an automated style of deployment. However, I would suggest creating additional directories to keep everything cleaner and clearer for your team.