### Problems

Sources that declare a `schema`/`target_schema`, but which do not also declare an identical `database`/`target_database`, return the exception we intended to appear only when a user has specified a model config `database` that differs from the model's configured `schema` (see #89, "Master not working with sources").
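For illustration, the guard in question might look roughly like this (a hypothetical sketch — the function name `verify_database` and the message wording are assumptions, not the actual dbt-spark code). It also shows why sources trip it: a source's `database` silently defaults to `target.database`, so declaring only a custom `target_schema` makes the two values differ.

```python
def verify_database(database, schema):
    """Hypothetical sketch of the guard: raise if an explicitly
    configured `database` differs from `schema`, since Spark has no
    separate database layer above schemas."""
    if database is not None and database != schema:
        raise ValueError(
            f"Cannot set database {database!r} to be different "
            f"from schema {schema!r} on Spark"
        )
    return schema
```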
### Approaches

1. Always set `database = ''`.
   - If `database != ''`, we can see if the user has manually set the `database` config, allowing us to raise an appropriate exception.
   - However, every relation then shares the same empty `database`. When it comes time for docs generation, `_get_one_catalog` is passed a single database with multiple schemas, raising the exception `'Expected only one schema in spark _get_one_catalog'`. This exception is well motivated; on Spark, we need to run a separate `list_relations` query for each database/schema.
2. Always set `database = schema`.
   - Docs generation works: each database contains exactly one schema, matching the per-schema `list_relations` queries.
   - A user can declare a `schema`/`target_schema` that differs from the default database (`target.database`). We can't raise a helpful exception in the event that the user is trying funky things with the `database` config; we'll just need to document this.

I think I prefer the second approach, and I've tried to implement it here. I figured out a way to change the value of `database` via `__post_init__` even though `SparkRelation` inherits from a frozen dataclass. However, `dbt docs generate` is still returning this error:

My best guess right now:
- `_get_catalog_schemas` uses the `create_from` classmethod from `BaseRelation` to create `info_schema_name_map`, i.e. the object which establishes which relations exist in which schemas in which databases.
- `create_from` --> `create_from_node` does not respect my `__post_init__` resetting of `database = schema`, instead pulling the values of `database`, `schema`, etc. directly off the node attributes.

@beckjake I'd be thrilled if you could check this out and help me debug what's going wrong. As it is, we may need to revisit some of the catalog generation methods in light of #90.
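For reference, the frozen-dataclass workaround described above relies on `object.__setattr__`, which bypasses the `__setattr__` that `frozen=True` generates. A minimal standalone sketch (using a stand-in class, not the real `SparkRelation`):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class MyRelation:
    """Stand-in for SparkRelation; the real class has more fields."""
    schema: str
    database: Optional[str] = None

    def __post_init__(self):
        # A frozen dataclass blocks normal attribute assignment, but
        # object.__setattr__ bypasses the generated __setattr__, so we
        # can still force database to mirror schema after init.
        object.__setattr__(self, "database", self.schema)

r = MyRelation(schema="analytics", database="some_other_db")
# r.database is now "analytics"
```

Note that `__post_init__` runs on every `__init__` call, so this covers any path that actually instantiates the dataclass; the catalog code paths described above would only sidestep it if they read `database` off the node without ever constructing a `SparkRelation`.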