
[Task]: Restructure db session setup to allow for easy use of multiple schemas #1270

Closed
1 task
chouinar opened this issue Feb 16, 2024 · 0 comments · Fixed by #1520
Assignees
Labels
project: grants.gov Grants.gov Modernization tickets

Comments

@chouinar
Collaborator

Summary

Right now every table we set up goes into a single default database schema (non-locally, that's the `api` schema), but we want to be able to put specific tables in different schemas. SQLAlchemy supports this, but some of our setup will break if we try to use it, because the current behavior is:

  1. Tests create a separate `test_schema_<some random number>` schema before the tests run and drop it afterwards; this logic would need to be duplicated or changed in some way for each new schema
  2. We use the `search_path` param in our DB client to resolve unqualified table names, which will become difficult to maintain if we add many schemas
  3. We need to make sure every schema is created for both local and test setup before the tables are created (SQLAlchemy can generate all the tables, but needs the schemas to exist ahead of time)

One approach that could simplify this is to create a new database (`create database unit_tests_<some number>`) before the tests run, and create the same schemas there (e.g. just `api`) as outside of tests. This way we don't need any complex mapping; switching which database we use takes only a single env var when we connect.
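A minimal sketch of that approach, with hypothetical names (the real fixture would issue `CREATE DATABASE` and `CREATE SCHEMA api` against a Postgres admin connection; that part is shown only as comments):

```python
# Sketch only: illustrates the unique naming and single-env-var switch
# described above. The actual DDL would run against a real Postgres server.
import os
import random

def make_test_db_name() -> str:
    # e.g. "unit_tests_483920"
    return f"unit_tests_{random.randint(0, 999_999)}"

db_name = make_test_db_name()

# Against an admin connection you would then run:
#   CREATE DATABASE unit_tests_<n>;
#   -- then, connected to that database:
#   CREATE SCHEMA api;

# Switching which database the app uses is then a single env var:
os.environ["DB_NAME"] = db_name
```

Because the test database mirrors the non-test schema layout, no per-schema mapping is needed when tearing everything down; the whole database is simply dropped.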

Acceptance criteria

  • We can set up schemas on our tables, and explicitly define all of our current tables as being in the `api` schema (this won't have any effect non-locally, but it will change local development, which had been defaulting to `public`).
@chouinar chouinar added the project: grants.gov Grants.gov Modernization tickets label Feb 16, 2024
@chouinar chouinar self-assigned this Mar 20, 2024
chouinar added a commit that referenced this issue Apr 2, 2024
…1520)

## Summary
Fixes #1270

### Time to review: __10 mins__

## Changes proposed
Modify our SQLAlchemy logic to allow for multiple schemas to be set up.
This includes:
* Setting the schema explicitly in a class that all SQLAlchemy models
inherit from
* Setting up a schema translate map (only meaningfully needed for local
tests) to allow for changing the name of a schema
* A new (LOCAL ONLY) script for creating the `api` schema

## Context for reviewers
**Non-locally this change does not actually change anything yet -
locally it does by making local development more similar to non-local**

This does not actually set up any new schemas; every table we create
still lives in a single schema, the `api` schema.

This change looks far larger than it actually is. Before, all of our
tables had their schema set implicitly by the `DB_SCHEMA` environment
variable. Locally this value was set to `public` and non-locally it was
set to `api`. These changes make it so locally it also uses `api`;
however, for that to work, the Alembic migrations need to say `api`
explicitly (in case we add more schemas later). There is a
flag in the Alembic configuration that tells it to generate with the
schemas, but we had that disabled. I enabled it so future migrations
_just work_. But in order to make everything work locally, I had to
manually fix all of the past migrations to have the `api` schema.
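Presumably the flag in question is `include_schemas`, passed to `context.configure()` in Alembic's `env.py`, with past migrations edited to pass `schema="api"` explicitly. A fragment along those lines (illustrative, not the repo's actual code):

```python
# env.py (fragment): let autogenerate see tables beyond the default schema.
context.configure(
    connection=connection,
    target_metadata=target_metadata,
    include_schemas=True,  # generate schema-qualified operations
)

# A past migration, fixed to name its schema explicitly:
op.create_table(
    "example_table",  # illustrative table name
    sa.Column("example_id", sa.Integer(), primary_key=True),
    schema="api",
)
```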

Non-locally the schema was already `api`, so changing already-run
migrations won't matter; they already ran as if they had that value
set.

## Additional information
This change requires you run `make db-recreate` locally in order to use
the updated schemas.

To test this, I manually ran the database migrations one step at a time,
fixing any issues. I then ran the down migrations and confirmed they
correctly undid the up migrations. Finally, I ran a few of our local
scripts to make sure everything still worked properly and didn't find
any issues.

---------

Co-authored-by: nava-platform-bot <platform-admins@navapbc.com>