Limiting the scope of "Directives Are Unique Per Location" validation #429
Comments
Awesome! We've talked about this before @OlegIlyenko, and this is something we've been bitten by recently too. Although it's often possible to work around it by having plural arguments (like in your example), I definitely think it's more elegant to sometimes use multiple applications of the same directive, as in your example.

Should we have a limitation as far as argument values go? Would using the same directive more than once with the same argument be valid? As far as repeated skips and includes go, could we maybe use the same kind of logic? If there are multiple skips, how would the combination be evaluated? Thanks for opening this!

Edit: As an opposite point of view, I also kind of agree with @stubailo's comment and @leebyron's answer here: #229 (comment). I do like how currently there are fewer chances of "misusing" directives, since the schema lets you know what's possible.
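To make those questions concrete, here is a hypothetical query (the `@tag` directive and the `title`/`body` fields are invented for illustration, and the whole document is rejected by the current validation):

```graphql
query ($a: Boolean!, $b: Boolean!) {
  # Same directive twice with the same argument: should this be valid,
  # and does it mean anything different from writing it once?
  title @tag(name: "featured") @tag(name: "featured")

  # Repeated @skip: if this were allowed, how should the two conditions combine?
  body @skip(if: $a) @skip(if: $b)
}
```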
Wasn't able to make the WG meeting yesterday, but two questions I had around this have to do with how this would interact with field collection from fragments; in particular, when that creates trickiness around duplication and ordering. For example:

```graphql
# Assume Page and User both implement Profile
fragment F on Profile {
  name @A
  ... on User {
    name @B
  }
  ...H
  name @C
  ...G
  ... on Page {
    name @D
  }
}

fragment G on User {
  ...H
  name @E
}

fragment H on Profile {
  name @F
}
```

This demonstrates some of the weird cases. If we apply this to a Page, then it would "collect" into a single `name` selection whose directives come from several places, which could potentially make the behavior of "ordering-significant" directives surprising.
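For reference, if the repeated directives were simply merged in collection order, my reading of CollectFields for a Page would produce something like the following sketch (the User-only selections drop out, and fragment H is reached via `...H` before the later `name @C`):

```graphql
# Hypothetical "collected" result for a Page, in collection order:
{
  name @A @F @C @D
}
```

Note that `@F` lands between `@A` and `@C` even though fragment H is textually last, which is exactly the kind of surprise an ordering-significant directive would run into.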
We discussed this issue at GraphQL WG Meeting #5. I saw a strong consensus on moving forward and continuing the work on this issue. I would like to outline the list of suggested options with some updates:
As @xuorig pointed out, the validation was introduced in this PR: #229. One concern is that whatever we decide, we need to make sure that the directive behavior is easy to understand and consistent across different parts of the GraphQL spec.

**Syntactic location vs semantic interpretation**

Even if we decide on the last option and don't change the validation, I believe that we still have an issue. For example, the two alternatives below describe the same `Book` type:

```graphql
# 1
type Book @foo(id: 1) @foo(id: 2) {
  title: String!
  pages: Int!
}
```

and

```graphql
# 2
type Book @foo(id: 1) {
  title: String!
}

extend type Book @foo(id: 2) {
  pages: Int!
}
```

As validation is defined right now, alternative 1 fails, but alternative 2 succeeds. In my opinion, this behavior should be more consistent and consider the semantic interpretation of the type and schema extensions.

This also extends to the query-side directives. @dschafer Thanks a lot for pointing this out! Initially, I hadn't considered it in this context, but I believe that this is something that needs to be addressed as well. Even this simple query passes the validation:

```graphql
{
  hello @skip(if: true)
  hello @skip(if: false)
}
```

After testing it against the reference implementation, the field is present in the response (so the `@skip(if: true)` is effectively ignored).

**Potentially breaking change**

@leebyron pointed out that this is potentially a breaking change, since GraphQL libraries might rely on the fact that a directive name is unique at a specific AST node. For example, a GraphQL library might internally represent the list of directives as a hash map.

This also raises another question: is the ordering of directives significant? I'm not 100% sure, but I believe that the spec does not explicitly define this. On an AST level, directives are written in a specific order, but is it OK for a specific GraphQL implementation or library to rely on a particular ordering of the directives? I think parallels can be made with JSON object field ordering: the JSON spec defines an object as an unordered collection of name/value pairs. Still, the GraphQL spec explicitly defines the semantic interpretation of the JSON object fields in the response (it reflects the field ordering in the GraphQL query).

@jjergus you were involved in the introduction of the validation rule. I would really appreciate your feedback and opinion on this issue.
Just to express my personal preference: I think I would prefer the first option, where we remove the validation altogether. I think it also might be a good option to define a new validation specific to the interaction between repeated `@skip` and `@include` directives.

I see the directive semantics as a domain concern. So the interpretation of the directive semantics, uniqueness at a particular location, ordering, etc. should be done by the application that defines these directives.
I agree, I like option 1 or 4.
I would suggest that we define the behavior of multiple `@skip`/`@include` directives on the same field to be the same as if the field were queried multiple times, with one directive each time. This already has well-defined behavior per the GraphQL spec (we include the field in the response if at least one of the instances would have us do so).
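As an illustration (the `hero` field and the variables are made up, not taken from the thread), the proposal would give the first query below the same semantics as the second, which the spec already defines via field merging:

```graphql
# Two @skip directives on one field (currently rejected by
# "Directives Are Unique Per Location")...
query A($a: Boolean!, $b: Boolean!) {
  hero @skip(if: $a) @skip(if: $b)
}

# ...would behave as if the field were selected once per directive; the field
# is included whenever at least one of the selections is not skipped.
query B($a: Boolean!, $b: Boolean!) {
  hero @skip(if: $a)
  hero @skip(if: $b)
}
```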
Yeah, this part is kinda sad. This means that every time anyone declares any directive in the future, they will have to specify the correct behavior for multiple instances. Inevitably, someone somewhere will forget to do this for some new directive, resulting in ambiguity. Because of this, I also like your option 4 -- where directives would be (non-)unique unless explicitly marked as the opposite (I'm not sure what the better default would be -- probably unique by default?).

I suggested this validation rule because it was the simplest way to resolve the ambiguity that existed at the time -- not because I had a strong preference towards that solution -- so I am totally fine with changing it (as long as the spec is updated such that there is no ambiguity).

True story.

Same question already exists for field arguments, or for multiple non-identical directives.
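For example (directive names invented for illustration), two different directives on one field already raise the ordering question today:

```graphql
{
  # If both directives transform the resolved value, does @uppercase run
  # before or after @truncate? Today that is entirely up to the application.
  title @uppercase @truncate(length: 10)
}
```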
@jjergus thanks a lot for the quick reply and your input! I like your solution as well (1 without an extra validation). I think I'm personally also fine with 4, though in this case I think we need to address the question of uniqueness semantics (whether type & schema extensions as well as things like fragments should be considered).

Regarding the hash-map directive representation: if the first option were picked, do you think it would be a big issue for the applications you are working on? If you are relying on the uniqueness of the directive at a specific location, do you think it would be possible/reasonable to re-introduce this constraint as an application/company-level convention or validation?
@OlegIlyenko According to the current spec this should fail, since the directive location for both definitions is `OBJECT`:

```graphql
# 2
type Book @foo(id: 1) {
  title: String!
}

extend type Book @foo(id: 2) {
  pages: Int!
}
```
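For context, a sketch of how the hypothetical `@foo` directive might be declared, with `OBJECT` being the single location that covers both the type definition and its extension:

```graphql
directive @foo(id: Int!) on OBJECT
```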
I actually think that 1-5 creates a problem in this example since the reader doesn't know if…
Shouldn't be a problem. We could easily implement a solution that doesn't use a hashmap, or if that results in a performance regression, then add it back as a local validation rule specific to our codebases (we already have a few of those, as well as custom directives, client-side transforms, etc.).
@IvanGoncharov you are totally right! I missed this part in the spec. Though when I checked it against an implementation it was still accepted, so I guess this is something to implement :) I'm glad that the spec considers this aspect for the type extensions, thanks for pointing this out 👍 I also checked the…
Indeed. I think the main idea is that the directive itself defines the semantics. This flexibility has a danger where one directive overrides another one with the same name. But it also enables scenarios like the one I have shown in "Example 1: adding more information", especially if extensions are spread across different files (it is not only limited to schema definitions; we also have use cases where this is useful for type definitions). That said, I also wonder whether the name of the directive is a sufficient criterion for ensuring the domain constraint.
@dschafer We will be discussing it at the WG tomorrow. If you have time, it would be great if you could join in.
I reviewed your example more closely, and I think it presents a real problem, but it's unrelated to the validation discussed here, since a query like this is already valid today:

```graphql
query ($foo: Boolean = true, $bar: Boolean = false) {
  field @skip(if: $foo) {
    subfieldA
  }
  field @skip(if: $bar) {
    subfieldB
  }
}
```

The same applies to your examples with fragments and spreads, so it's also valid according to the current spec.
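For what it's worth, my reading of CollectFields with the defaults above (`$foo = true`, `$bar = false`) is that the first `field` selection is excluded during collection, so the response is effectively built from:

```graphql
{
  field {
    # subfieldA is absent because its enclosing selection was skipped
    subfieldB
  }
}
```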
@OlegIlyenko I think this issue uncovered some obscurity in the current version of the spec:

**What is the directive location?** Is it purely syntactic, or does it take the semantic interpretation (e.g. merged extensions) into account?

**Is the order of directives significant or not?** Arguments are unordered. Input object fields are unordered. If we decide that directives are ordered, how should we map that onto type extensions? For example:

```graphql
extend scalar Test @foo
scalar Test @bar
```

Is it equivalent to `scalar Test @bar @foo` or to `scalar Test @foo @bar`?

I think we need to open separate issues for these two questions.
@IvanGoncharov I think it is a good summary of our discoveries 👍 I would suggest discussing it tomorrow at the WG meeting, and then, based on the outcome, I can create separate issues for these. I feel that the question of…
A thought experiment for this might be: if we allowed fragment and operation extensions, how should we resolve duplicate directives on the server? I'm not sure, but I suspect we'd make actually-identical directives merge, conflicting directives throw an error, and require directive order to not matter (just like field order within a query is not supposed to matter).
Marking as Strawman, but noting that this thread is mostly side-discussion leading to real RFCs (which is great!). @OlegIlyenko feel free to close this issue once you think your RFC PRs replace the need for tracking this issue.
I agree. 2 RFCs are well on their way. I think we can close this one.
**Original issue description**

According to the spec, the "Directives Are Unique Per Location" validation requires that, for any given location, each directive name is used at most once.
In light of recent GraphQL spec changes (extensions on all types and the schema itself) and other developments in the community, I would like to suggest reconsidering the inclusion of this validation rule, or limiting its scope to a subset of directives.
I would like to show 2 cases where it might be beneficial to allow non-unique directives.
**Example 1: adding more information**
In this scenario, directives might add more information. The order of directives is insignificant.
Consider the schema-delegation case, where multiple GraphQL schemas are merged into one. I can define the config like this (with the current limitation):
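Something along these lines, for instance; the `@delegate` directive, its `schemas` argument, and the field and type names are all invented for illustration:

```graphql
# With the uniqueness rule, everything has to be packed into one application:
type Query @delegate(schemas: ["users", "products"]) {
  user(id: ID!): User
  product(id: ID!): Product
}
```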
If the limitation is lifted, I can potentially split the schema into 2 files like this:
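For example, continuing the invented `@delegate` directive from above, each file could contribute its own application of the directive:

```graphql
# users.graphql
type Query @delegate(schemas: ["users"]) {
  user(id: ID!): User
}

# products.graphql
extend type Query @delegate(schemas: ["products"]) {
  product(id: ID!): Product
}
```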
This provides more modularity (and, I would argue, simplicity), even in the absence of an extension:
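That is, the directive could simply be applied twice on the same definition (again using the invented `@delegate`):

```graphql
type Query
  @delegate(schemas: ["users"])
  @delegate(schemas: ["products"]) {
  user(id: ID!): User
  product(id: ID!): Product
}
```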
**Example 2: chaining directives/data transformation**
In this scenario, I would like to transform the data in a chain. A small subset of directives might represent reusable logic for data retrieval and transformation. In this case the ordering of the directives is significant.
For example, if I have a set of directives to retrieve data from an external HTTP service, I can define a field like this:
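A sketch of what that could look like; the directives, their arguments, and the endpoint are all made up, and the ordering (fetch, then extract, then clean up) is what carries the meaning:

```graphql
type Query {
  userName(id: ID!): String
    @httpGet(url: "https://api.example.com/users/{id}")
    @jsonPath(path: "$.profile.name")
    @trim
}
```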
**Possible solutions**
We can tackle this in a number of ways, for example by removing the validation altogether, by limiting it to `@skip`, `@include` and potentially other custom directives, or by defining the behavior of repeated `@skip` and `@include` directives.

I would really appreciate your opinions and comments on this suggestion.
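One direction raised in the comments above is to let a directive opt into being applied multiple times explicitly. A hypothetical sketch of how such a marker could look in SDL (the keyword and directive names are illustrative only):

```graphql
# A directive that explicitly opts into multiple applications per location:
directive @delegate(schemas: [String!]!) repeatable on OBJECT

# A directive defined without the marker would keep today's uniqueness rule:
directive @auth(role: String!) on FIELD_DEFINITION
```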