
Report "dangling" content validator errors #572

Merged · 8 commits merged into master · May 25, 2022

Conversation

DylanBaker (Collaborator)

Change description

We currently prune the LookML tree when building the project. This means that content validator errors relating to explores that don't exist anymore don't get reported by Spectacles. This PR changes this functionality so that these errors now appear.

The changes work in the following way:

  • When we build the project for content validation runs, we now return the entire LookML tree. This allows us to attach all errors to their relevant place on the tree, even if the explore has been excluded from the run. This is what allows us to distinguish between explores that have been deleted and explores that have been excluded by the filters.
  • When we attach the errors to the LookML tree, we first check if the relevant explore exists. If it does, we attach the error there. If it doesn't, we then check if the relevant model exists and, if so, attach the error to the model instead. This second part is new: previously, we only attached errors to existing explores.
  • When we return the errors for the project, we now allow filters to be passed to the get_results function. This only happens during content validation. When filters are passed, we prune the errors to just the filtered portion of the LookML tree.

These changes mean that we will now return all content validation errors relevant to a project, except for those linked to models that don't exist anymore. There is no way for us to know where to attach those, so they will remain dangling.
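
The sketch below illustrates the attach-and-prune idea described above. It is a minimal, self-contained example using simplified stand-in classes; the names (`Project`, `Model`, `Explore`, `ContentError`, `attach_error`, `get_results`) are illustrative and should not be read as the actual Spectacles API.

```python
# Minimal sketch of attaching content errors to explores, falling back to
# models, and pruning results by filters. Names are hypothetical stand-ins.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class ContentError:
    model: str
    explore: str
    message: str


@dataclass
class Explore:
    name: str
    errors: List[ContentError] = field(default_factory=list)


@dataclass
class Model:
    name: str
    explores: List[Explore] = field(default_factory=list)
    errors: List[ContentError] = field(default_factory=list)  # explore was deleted


@dataclass
class Project:
    models: List[Model] = field(default_factory=list)

    def attach_error(self, error: ContentError) -> bool:
        """Attach an error to its explore if it exists, else to its model."""
        for model in self.models:
            if model.name != error.model:
                continue
            for explore in model.explores:
                if explore.name == error.explore:
                    explore.errors.append(error)
                    return True
            # Explore no longer exists: attach at the model level instead.
            model.errors.append(error)
            return True
        # Model no longer exists either; the error stays dangling.
        return False

    def get_results(self, filters: Optional[List[str]] = None) -> List[ContentError]:
        """Return errors, optionally pruned to the models selected by the filters."""
        results: List[ContentError] = []
        for model in self.models:
            if filters and model.name not in filters:
                continue
            results.extend(model.errors)
            for explore in model.explores:
                results.extend(explore.errors)
        return results
```

In this sketch, an error whose model no longer exists is never attached anywhere and so never appears in `get_results`, matching the dangling behaviour described above.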

Type of change

  • Bug fix (fixes an issue)
  • New feature (adds functionality)

Related issues

Closes #240

Checklists

Security

  • Security impact of change has been considered
  • Code follows security best practices and guidelines

Code review

  • Pull request has a descriptive title and context useful to a reviewer

@DylanBaker DylanBaker added the Bug Something isn't working label May 20, 2022
@DylanBaker DylanBaker self-assigned this May 20, 2022
@joshtemple (Collaborator) left a comment

Thanks for your work on this! I have some feedback below. Also, regarding tests: have you added any tests that explicitly test for this situation? That would be good to do in addition to refactoring existing ones.

Review threads (resolved): spectacles/lookml.py, spectacles/validators/content.py
@@ -410,7 +440,7 @@ def build_project(
fields = ["name", "project_name", "explores"]
for lookmlmodel in client.get_lookml_models(fields=fields):
Collaborator comment on this hunk:

I'm not sure how important this is, but we loop through models 4 times (441, 459, 467, 477). Do you think we can consolidate some of these loops? I don't consider it blocking for the PR but if you have some time it might give us a bit of efficiency and readability.
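
For illustration only, here is a rough sketch of what folding those separate passes into a single loop could look like. The function and variable names are assumptions and are not taken from build_project; only the "name" and "explores" keys mirror the fields requested in the hunk above.

```python
# Hypothetical sketch: consolidate several passes over the LookML models
# returned by the API into one loop. Not the actual Spectacles code.
def summarize_models(lookml_models):
    names = []
    explore_counts = {}
    for model in lookml_models:  # one pass instead of several
        names.append(model["name"])
        explore_counts[model["name"]] = len(model.get("explores") or [])
        # ...any other per-model work from the later loops would move here
    return names, explore_counts
```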

@DylanBaker DylanBaker merged commit b9f6d15 into master May 25, 2022
Labels
Bug Something isn't working
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Return dangling content validation errors
2 participants