Migrations and fixtures

When models are updated, new migrations must be generated to reflect those changes in the database schema. A data migration and fixture update may also be needed, depending on the nature of the model changes.

Schema migrations

A simple helper script exists to generate migrations based on the current state of models in the local codebase:

bin/makemigrations.sh

This script:

  1. Runs the standard Django makemigrations command
  2. Formats the newly generated migration file with Black

Commit the new migration file along with the model changes.
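The two steps above amount to roughly the following (a sketch; how the script discovers the new file may differ, and the `core` app path is taken from the example later in this page):

```shell
# Sketch of what bin/makemigrations.sh does
python manage.py makemigrations
# reformat the migration files the run just created
black core/migrations/
```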

Data migrations

Warning

If your schema changes affect any model fields that currently hold data, you probably also need to write a data migration to ensure that existing data is not lost.

Our preferred way of handling data migrations is to add a migrate_data function to the file generated for the schema migration, then add a RunPython operation to the operations list to call it.

Here is a simple example of the resulting file after combining schema and data migrations; the migrate_data function and the RunPython operation are the data-migration additions:

# Generated by Django 5.2.7 on 2026-02-03 22:25

from django.db import migrations, models


def migrate_data(apps, schema_editor):
    TransitProcessorConfig = apps.get_model("core", "TransitProcessorConfig")

    for config in TransitProcessorConfig.objects.all():
        if config.environment == "qa":
            config.environment = "test"
        config.save()


class Migration(migrations.Migration):
    dependencies = [
        ("core", "0073_add_roseville"),
    ]

    operations = [
        migrations.AlterField(
            model_name="transitprocessorconfig",
            name="environment",
            field=models.TextField(
                choices=[("dev", "Development"), ("test", "Testing"), ("prod", "Production")],
                help_text="A label to indicate which environment this configuration is for.",
            ),
        ),
        migrations.RunPython(migrate_data),
    ]

In the example above, the schema change renamed the choice value qa to test, so the migrate_data function finds existing TransitProcessorConfig records with a value of qa and updates them to test.

In some cases, more complex data migrations may warrant more than just a single migrate_data function.
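One pattern for such cases (a sketch; the label-normalization rule and the label field are hypothetical, while remap_environment mirrors the example above) is to factor each transformation into a small pure function that migrate_data applies, so every rule can be checked on plain values without a database:

```python
# Sketch: factoring a multi-step data migration into pure helper
# functions. remap_environment mirrors the example above; the
# label-normalization rule and the "label" field are hypothetical.


def remap_environment(value):
    """Map retired environment labels to their replacements."""
    return {"qa": "test"}.get(value, value)


def normalize_label(text):
    """Collapse stray whitespace in a hypothetical text field."""
    return " ".join(text.split())


def migrate_data(apps, schema_editor):
    TransitProcessorConfig = apps.get_model("core", "TransitProcessorConfig")

    for config in TransitProcessorConfig.objects.all():
        config.environment = remap_environment(config.environment)
        config.label = normalize_label(config.label)
        config.save()
```

Each rule is then trivially unit-testable on plain strings, independent of migration machinery.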

Note

We do not generally worry about making reversible data migrations. We have enough checks and balances that we are comfortable with just moving forward.
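That said, if a data migration ever needs to be reversible enough for migrate to run backwards locally, Django's RunPython accepts a reverse function; passing migrations.RunPython.noop allows the backwards run without actually undoing the data change (a fragment, reusing the migrate_data function from the example above):

```python
migrations.RunPython(migrate_data, migrations.RunPython.noop)
```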

We also prefer to combine all migrations (schema and data) pertinent to a single PR into one file, even if the changes are broken out across multiple commits for easier reviewing. Migrations added in later commits can simply be added to the first migration that was created on the branch.

Updating fixtures

Aside from migrating the database schema (and possibly existing data), the fixture files used for development also need to be updated.

In addition to the local_fixtures.json file that is checked into the repository, we maintain private fixture files with working values for external integrations that are not included in the sample fixtures in local_fixtures.json.

An easy way to update these fixture files is to run them through the migrations that were created. The steps in general are:

(from within the devcontainer)

  1. Download the fixtures that you need to update (if updating the fixtures with secrets)
  2. Set your DJANGO_DB_FIXTURES environment variable to that file
    • There are many ways to do this. Here is one way:
      1. Update the value of DJANGO_DB_FIXTURES in your .env file
      2. Run source .env
      3. Check the value with echo $DJANGO_DB_FIXTURES
  3. Check out the commit on main prior to the model changes (i.e. where the fixtures can still be loaded)
  4. Run ./bin/setup.sh to reset your database and load in the fixtures
  5. Check out the commit with the new model changes (most likely, the latest commit on main)
  6. Run python manage.py migrate to apply migrations
  7. Export the migrated fixtures to a temporary file
  8. Review the migrated fixtures, and do any clean-up or manual updating needed (though ideally the data migrations make manual updates unnecessary)
  9. Save the updated fixtures back where they belong
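Put together, the manual flow looks roughly like this (the commit references and fixture file name are placeholders, and the dumpdata app label and flags are assumptions about the project):

```shell
export DJANGO_DB_FIXTURES=./fixtures_to_update.json
git checkout <commit-before-model-changes>
./bin/setup.sh                        # reset the database and load fixtures
git checkout <commit-with-model-changes>
python manage.py migrate              # apply the new migrations
python manage.py dumpdata core --indent 2 > unreviewed_fixtures.json
# review unreviewed_fixtures.json, then save it back where it belongs
```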

A helper script at bin/dumpdata.sh handles some of these steps, prompting where input is needed, so the steps simplify to:

  1. Download the fixtures that you need to update (if updating the fixtures with secrets)
  2. Set your DJANGO_DB_FIXTURES environment variable to that file
  3. Run ./bin/dumpdata.sh
  4. Review the migrated fixtures in unreviewed_fixtures.json, and do any clean-up or manual updating needed (though ideally the data migrations make manual updates unnecessary)
  5. Save the updated fixtures back where they belong