
Would an RDF "living standard" be a good idea? #88

Open
pchampin opened this issue Aug 24, 2021 · 15 comments
Labels
standards Standardization should address this

Comments

@pchampin

A lot of good ideas about how RDF could be improved have been brought up here. However, actually implementing those changes in the RDF specifications would be a huge task, considering the sheer number of recommendations that would need updating [1]. I may be wrong, but I consider this a major reason for the inertia of RDF.

The new W3C process makes it possible for recommendations to be updated in a more agile way than what was previously possible, similar to the HTML5 "living standard".

If a working group were to publish a new version of the RDF (and friends) specifications and opt in to this feature, it would become much easier for RDF to evolve. On the other hand, it might hurt interoperability to some extent, as not all implementations may adopt the changes at the same pace.

So my question to this community is: would the pros of an RDF "living standard" outweigh the cons? Could we strike a better balance between agility and interoperability than what we have now?

[1] 8 of them for RDF itself; 13 more for SPARQL; yet another 12 for OWL...

@kasei

kasei commented Aug 24, 2021

I think there might be immediate benefits to being able to incorporate the built-up errata on the various SPARQL specs, but I worry about the interop issues of trying to do larger feature changes/additions in an ongoing fashion. There might be a subset of changes that would fit well with this approach (purely additive, well-scoped features, like additions to the SPARQL function library or the operator mapping table), but I think more complex changes/features might be better addressed by the traditional WG process.
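[Editor's note: the kind of purely additive, well-scoped change described here can be sketched with SPARQL's existing extension-function mechanism. The ex:trim function below is hypothetical, standing in for a future function-library addition; in SPARQL, an unrecognized function raises an expression error, which in a SELECT merely leaves the bound variable unbound rather than breaking the whole query, so such additions degrade relatively gracefully.]

```sparql
# Sketch only: ex:trim is a hypothetical function illustrating a
# purely additive function-library entry. Engines that don't know it
# produce an expression error, leaving ?clean unbound for that row.
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX ex:   <http://example.org/fn#>

SELECT ?label (ex:trim(?label) AS ?clean)
WHERE { ?s rdfs:label ?label }
```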

@gkellogg
Member

Other groups, such as CSS, maintain many specs, so RDF would not be alone in needing to curate many things together. I think having the authority to make such changes would be good, and there are a number of low-hanging-fruit issues that it would be good to get a handle on. Certainly, the impact of any given change on the set of documents would need to be weighed on its own merits.

Of course, a huge burden will be getting the repos set up and the editors' drafts compatible with the latest version of ReSpec.

@afs
Contributor

afs commented Aug 24, 2021

Yes, a living standard would be good. As @kasei says, the more complex the addition, the more it fits the CG/WG process.

It does not have to be all documents at once.

@mhedenus

Given the very slow progress, I support the idea of a living standard!

@rubensworks
Member

+1 on the idea of a living standard!

On the other hand, it might hurt interoperability to some extent, as not all implementations may adopt the changes at the same pace.

This could be partially solved by introducing feature detection capabilities, as HTML, CSS, and JS already do. This may only be possible for incremental features, and not for significant changes. But for the latter, the traditional CG/WG process could then be used.
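[Editor's note: for SPARQL endpoints, a crude form of feature detection already exists in the SPARQL 1.1 Service Description vocabulary, and a living standard could extend that pattern. A sketch follows; sd:Service, sd:supportedLanguage, and sd:UnionDefaultGraph are real terms from that vocabulary, while the ex:RDFStar feature IRI is invented for illustration.]

```turtle
@prefix sd: <http://www.w3.org/ns/sparql-service-description#> .
@prefix ex: <http://example.org/features#> .

# A service description returned by dereferencing the endpoint URL;
# clients can probe sd:feature before relying on an extension.
<http://example.org/sparql> a sd:Service ;
    sd:endpoint <http://example.org/sparql> ;
    sd:supportedLanguage sd:SPARQL11Query ;
    sd:feature sd:UnionDefaultGraph ,
               ex:RDFStar .   # hypothetical feature IRI
```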

@namedgraph

IMO "living standard" is an oxymoron that reeks of HTML5 and WHATWG. RDF relies on precise definitions and a high level of interoperability; what does it stand to gain from such a process?

@mhedenus

@namedgraph It might gain progress and higher acceptance outside the somewhat closed group of RDF enthusiasts. Consider the history of HTML5 and recall why WHATWG was founded: the slow progress of HTML and the focus on XHTML.

@namedgraph

@mhedenus have you looked at the HTML5 spec and compared it with HTML 4? It is an order of magnitude more complicated, with major scope creep that pulled in everything from JavaScript to the DOM. Nobody in their right mind can hope to implement it, except the corporations behind WHATWG, who also happen to develop the few remaining web browsers.

If a similar thing happened to RDF, it would be the end of it.

@mhedenus

mhedenus commented Aug 25, 2021

@namedgraph Agreed. We don't have to make the same mistakes again. I have the feeling that RDF/SPARQL is stalled, and a more open process could bring some momentum.

@namedgraph

@mhedenus I'd like to see SPARQL 1.2 happen soon, I'm just wary of the "living standard". There should be some middle way.

@mhedenus

mhedenus commented Aug 25, 2021

@namedgraph Again agreed.

The high quality of the RDF/SPARQL specs allows something that is only a theoretical idea in other areas: seamless migration from one implementation/system to another. I have done this twice, with only minimal trouble. The 100% interoperability of RDF systems is extremely precious and must not be lost.

A "middle way" could be very strict requirements for a new feature to be accepted and incorporated into the standard, combined with an open space for collecting ideas. Moving the groups here to GitHub and inviting the public to participate was a great step! Now, why not start a "living draft" as a playground and then take over what the community agrees on?

@pchampin
Author

pchampin commented Sep 3, 2021

In order to reach a possibly larger audience, I relayed this discussion to the semantic.web mailing list:
https://www.w3.org/mid/c356f244-5cba-6c14-ff74-f09d1005e721@w3.org

@HolgerKnublauch

Difficult question. I like @rubensworks' comment on feature detection above. Quite possibly this would be a requirement before RDF 2 and similar updates can happen: APIs and tools could then clearly communicate which features they support. For example, I believe that if someone tries to get RDF-star through the current W3C process, many people may not be on board and will block progress. With a better framework for dialects, people could formally opt out of such features without being able to block them.

I personally would favor, for example, a dialect of RDF that drops all unnecessary numeric XSD datatypes and keeps only integer and decimal. Graphs that contain other numeric literals would fall outside the dialect and thus be rejected at load time, or have their values translated on the fly. In the case of RDF-star, an alternative mapping to rdf:Statement exists, but RDF-star documents even have a different MIME type and file extension, so they are easier to distinguish. In a SPARQL 1.2 with RDF-star support, the spec could mark the paragraphs that only apply to a certain RDF dialect; likewise a SHACL 1.1 that supports RDF-star, for those who want it.
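[Editor's note: the alternative mapping to rdf:Statement mentioned here is the reification-based mapping described in the RDF-star Community Group report. A rough sketch, with illustrative example prefixes and an illustrative :statedBy property:]

```turtle
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix :    <http://example.org/> .

# RDF-star (Turtle-star) form:
#   << :alice :age 23 >> :statedBy :bob .

# Mapped to standard RDF via reification:
_:t a rdf:Statement ;
    rdf:subject   :alice ;
    rdf:predicate :age ;
    rdf:object    23 ;
    :statedBy     :bob .
```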

Not sure yet what I am suggesting here, just thinking out loud, but the notion of dialects may need to be formalized.

@pchampin
Author

pchampin commented Sep 8, 2021

<nitpicking>literal datatypes are not strictly part of the RDF core model, so for me, literal "normalization" would be a feature above RDF rather than a feature of RDF, more related to something like SHACL. But that's still a very interesting use case.</nitpicking>

I think the feature detection thing is an important topic if we want to move forward.

@VladimirAlexiev

VladimirAlexiev commented Oct 8, 2021

I voted 👍 though I appreciate the difficulties this may create.

  • SPARQL 1.0 really was deficient for serious applications
  • SPARQL 1.1 was a giant step forward and made the language fit for serious querying
  • SPARQL 1.2 as discussed in https://github.com/w3c/sparql-12/issues is not a single coherent thing, but a bunch of excellent ideas, loosely fitting together (and in some cases competing/incompatible)
    • It seems obvious to me that a useful majority of these ideas cannot be developed at the same time, because that involves a lot of work: not just because there are many interlinked specs, but also because many of the ideas are non-trivial and have non-trivial interactions
    • There are too many useful ideas to pass on, so if they are to be released, that will happen in stages, not all at once
    • In addition to SPARQL features, capture "best practices" sparql-dev#58 is basically a proposal to allow gathering good ideas, even if they won't make it into a SPARQL spec
