Proposed RFC: Documentation reporting on the Feature Grid #64

Closed
FiniteStateGit opened this issue Sep 29, 2022 · 12 comments
Labels
rfc-feature Request for Comments for a Feature

Comments

@FiniteStateGit
Contributor

Summary:

Documentation health in each important feature area needs to be reported in some way for the O3DE feature grid (link). This RFC covers the column extensions and terminology used by sig-docs-community for reporting on the state of documentation in a way that aligns with how SIGs report on the state of their features to the TSC, TAC, and Governing Board.

What is the motivation for this suggestion?

Currently, the TSC, TAC, and Governing Board are only informed of the engineering state of the product. They should also be aware of the level of documentation available to users for each feature. For many features, documentation is considered critical to user success, so the lack of documentation for a fully complete feature may still render it difficult or impossible for customers to use.

If this suggestion is not implemented, sig-docs-community will need to find another method of conveying information on product health to the TSC, TAC, and Governing Board.

Suggestion design description:

Feature grid extension

The following three columns are added to the feature grid: Conceptual Docs, Tutorials, Samples. These columns are defined as follows:

  • Conceptual docs: Descriptions of features and how they operate, written in a user-facing manner. Does not necessarily include engineering or low-level documentation (low-level is feature-relative; Gem development or core engine programming will always require that even the highest-level content is complex). This is about docs that explain the concepts behind the feature, and often cross-reference or relate to other documentation about the engine.
  • Tutorials: Examples, samples, and walkthroughs designed to help users with the specified feature. This can include video series available from O3DE on YouTube, projects designed with the intent for users to learn and explore a feature (some crossover with samples), or written instructions or guidance walking a user through a common or example scenario.
  • Samples: Code or existing projects, assets, and content which can assist a user in using the feature. Samples include things like code samples, example projects, demos (such as MultiplayerStarterProject), and reference assets or templates.

Removal of current docs info on Feature Grid

We suggest removing the “Docs link” column; for some features this link may be misleading, or may even lead users into a section of the documentation that doesn’t appear relevant (even if it is). This is a consequence of the current information architecture of the o3de.org website, and the inability to set up a full subgrid for navigation underneath a single feature. The column can be re-added with an appropriate RFC that accounts for these issues.

New documentation column scores

Since the definitions used by sig-docs-community to grade content differ from the definitions used to describe code health, we require the following scores to be added for our columns only. Our completeness grades need to function more like a “quality” grade, since the work of documentation is never done. A hypothetical sketch of these columns and scores follows the list below.

  • N/A: Documentation is not required and does not need to report on any information for this specific feature.
  • None: Documentation is required, but none is available. This includes topics where documentation contains only deprecated or removed features and was not updated.
  • Minimum: Documentation exists, but only in the minimum amount determined necessary to make a user “successful” with the feature. Content may be of low quality, but is guaranteed to meet the minimum quality bar that sig-docs-community sets for accepting contributions.
  • Partial: A feature is approximately 50% or more documented at the minimum quality bar.
  • Good: A feature is approximately 75% or more documented, and the quality is above the minimum bar.
  • Excellent: A feature is 90% or more documented, and the quality is well above the minimum bar.
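
To make the proposed columns and scale concrete, the following is a minimal, purely illustrative sketch of how a single feature row could be modeled. The names (`DocScore`, `FeatureDocs`) and the Python representation are assumptions made for this example only, not the schema of the actual feature grid form, and the example values are invented.

```python
# Hypothetical sketch only: the real feature grid form lives in the o3de/community
# repository and its actual schema may differ. All names and values here are
# illustrative, not part of the proposal.
from dataclasses import dataclass
from enum import Enum


class DocScore(Enum):
    NOT_APPLICABLE = "N/A"   # Documentation not required for this feature
    NONE = "None"            # Required, but nothing usable is available
    MINIMUM = "Minimum"      # Minimum amount needed for user success
    PARTIAL = "Partial"      # ~50%+ documented at the minimum quality bar
    GOOD = "Good"            # ~75%+ documented, above the minimum bar
    EXCELLENT = "Excellent"  # 90%+ documented, well above the minimum bar


@dataclass
class FeatureDocs:
    """Documentation columns a SIG would report alongside a feature row."""
    feature: str
    conceptual_docs: DocScore
    tutorials: DocScore
    samples: DocScore


# Example of what a SIG might record for one feature (values are invented):
example = FeatureDocs(
    feature="Multiplayer",
    conceptual_docs=DocScore.GOOD,
    tutorials=DocScore.PARTIAL,
    samples=DocScore.MINIMUM,
)
```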

Explicitly not in this RFC

  • API reference coverage levels: API reference generation is an ongoing topic in the SIG, and needs RFCs to establish a process and convention. Currently our API reference is generated manually, and an automated process will allow for some form of red-yellow-green coverage dashboard for overall API health for each SIG.
  • Script Canvas, Component, Lua, and some other reference: These are also all being investigated for automated reference generation, and the process of doing so will set up automation and coverage dashboard reporting in the same manner as API reference.
  • Subdividing docs categories or supplying links to content: This requires an entirely different grid of information well beyond the current feature grid, and is more of a site navigation feature. As described in the section on removing the “docs link” column, a lack of a subgrid prevents this from being particularly useful at this time.

What are the advantages of the suggestion?

  • Allows engineering teams to make informed decisions about feature upgrade and improvement paths by looking at the grid; a downgrade in a docs score for a feature means that the change would be significantly user-impacting, even if the engineering impact is lower.
  • Guarantees that sig-docs-community and all engineering SIGs are aligned regarding what they report on in terms of a feature, and what the state of that feature is.

What are the disadvantages of the suggestion?

  • Because feature grids are submitted per-SIG, SIGs will need to evaluate the state of the feature documentation prior to submitting them to the feature grid repository. SIGs would need to assume the responsibility for this evaluation or coordinate with members of sig-docs-community. Alternatively, sig-docs-community would need to amend each SIG’s feature grid after it has been submitted to the feature grid repository.
  • The proposed changes do not guarantee that the entirety of O3DE documentation is evaluated. Fundamental, peripheral, or cross-cutting topics may not be listed among a SIG’s features.
  • The proposed scoring system defines gradations in documentation quality, but may be difficult to implement uniformly across features because of its inherent subjectivity. For some features, defining a minimal set of use cases may be difficult.

How will this work within the O3DE project?

  • As a first step, the feature state form would be updated with the columns and scoring selection outlined by this RFC.
  • Currently the feature grid is updated and released for each major release. We propose that SIGs evaluate documentation for their features after code freeze occurs and after any relevant feature documentation has been submitted to the o3de.org repository for inclusion in a release.
  • We propose that individual SIGs take primary responsibility for reviewing and updating the documentation sections of their respective feature grids. We reason that SIGs will be the best judge of the sufficiency of a feature’s coverage in documentation, which is the predominant component of documentation scoring.
  • Sig-docs-community will provide support, consultation, and feedback when requested.

Are there any alternatives to this suggestion?

  • Consideration was given to a solution wherein sig-docs-community submits its own feature grid independent of other SIGs. Existing column and scoring options were not adequate to detail the state of documentation. While this approach would allow the freedom to list alternative and additional features beyond those listed by other SIGs, it conflicted with the columns (and their interpretation) of the feature grids already submitted by the SIGs. Users would need to navigate two different, disjointed rows per feature to review the state of engineering and documentation efforts. Refer to prior discussion by sig-docs-community: Proposed SIG-Docs-Community Feature Grid #40.
  • Keep the present feature grid design, which cannot adequately capture the state of feature documentation with regard to the different types of documentation the product requires.

What is the strategy for adoption?

  • The README for the feature state form (https://github.com/o3de/community/tree/main/features#readme) is updated with descriptions of the new columns and the criteria for scoring them. Criteria for scoring are as described in “New documentation column scores”.
  • The proposed changes are implemented on the feature state form (https://o3de.github.io/community/features/form.html).
  • After completely listing each feature included in a release, each SIG, independently or in conjunction with sig-docs-community, evaluates and scores relevant documentation for those features. SIG feature grids are updated and uploaded, as necessary, to the feature grid repository.
@FiniteStateGit FiniteStateGit added the rfc-feature Request for Comments for a Feature label Sep 29, 2022
@willihay
Contributor

I've read and support this proposal. Getting sig involvement in the evaluation of documentation relevant to their feature areas seems like a smart idea.

A couple questions on the definitions of the columns:

  1. For tutorials you include "samples" in the description:

    Tutorials: Examples, samples, and walkthroughs designed to help users with the specified feature.

    "Samples" means something very specific, and has its own column. I'd remove that from the description.

  2. Do you include as part of the tutorials category the how-to topics that are often embedded in the feature documentation? I am in agreement with this, because they both involve practical steps, but it's not 100% clear to me if this is the intention, and since this category is named "tutorials", the how-to topics might be overlooked when sigs evaluate the docs. You might want to explicitly state that tutorials can include any procedural topics in any of the various guides, not just the Tutorials ("learning guide") section. You could call this category "Procedural docs".

@chanmosq
Contributor

I've read and support this RFC proposal.

It's really important for docs to have a feature grid that can accurately portray the docs' health status across O3DE's various domains - I think this is a great idea. Additionally, it brings clarity to the overall state of O3DE Docs and can help identify any gaps.

Upon acceptance, work for this RFC should be supplemented with defining the levels of docs "quality". That would help sigs make a more informed judgement of the state of their feature docs.

@chanmosq
Contributor

Requesting feedback from @o3de/sig-release, and suggest bringing this up to TSC for awareness. Since this depends on other sigs to report on the docs health status of their own features, and is involved with the release process, we would like your review on this.

The last day for feedback is Friday, Oct. 28.

@amzn-rhhong

Do we have a score/feedback system on each doc page? Something like "Is this article helpful?" or "On a scale of 1-5, what do you think of this article?"

I'm thinking that pulling data like that and using it to generate the health matrix for docs would really reflect how customers think.

Could be as simple as this: [image]

@vincent6767
Contributor

I've read and support this proposal. I suggest asking individual SIGs for feedback, as this adds another responsibility and coordination effort with the SIG Docs community.

@chanmosq
Contributor

chanmosq commented Nov 1, 2022

@amzn-rhhong Having some sort of customer assessment of the docs' quality is a good idea and something the sig should investigate. Perhaps that's a conversation for outside of this RFC, because the feature grid is intended for sigs to report on the status of docs. At this time, the "quality" we report on will be based on an internal assessment of "Do we have some learning content (docs, videos, or samples) such that a user can use the feature?"

@chanmosq
Contributor

chanmosq commented Nov 1, 2022

In response to @vincent6767, we will reach out to other sigs and extend the date for processing this RFC.

@FiniteStateGit
Contributor Author

1. For tutorials you include "samples" in the description:
   > Tutorials: Examples, samples, and walkthroughs designed to help users with the specified feature.
   
   "Samples" means something very specific, and has its own column. I'd remove that from the description.

Agreed, "samples" in the definition of tutorials could be substituted with how-to's, topics that demonstrate a procedural workflow.

2. Do you include as part of the tutorials category the how-to topics that are often embedded in the feature documentation? I am in agreement with this, because they both involve practical steps, but it's not 100% clear to me if this is the intention, and since this category is named "tutorials", the how-to topics might be overlooked when sigs evaluate the docs. You might want to explicitly state that tutorials can include any procedural topics in any of the various guides, not just the [Tutorials](https://www.o3de.org/docs/learning-guide/) ("learning guide") section. You could call this category "Procedural docs".

It is intended that both the how-to topics normally found embedded in the user guide and the dedicated tutorials found in the tutorials section of the site are counted toward a feature's documentation.

@lmbr-pip

lmbr-pip commented Nov 1, 2022

I do agree with the sentiment, though I would like to understand how doc users find and make use of this information. Is there linkage to it from o3de.org? It's not really covered how folks looking for documentation are expected to work with this information. Is it just for potential docs contributors?

As for the proposal, my main problem is that the ratings still seem highly subjective and require a lot of cognitive effort from the SIG Chair(s) to set up.

Secondly, did you consider having a state-of-docs section, i.e. "Missing - Not Planned", "Planned", "In Active Development", "Delivered", or something similar to convey the state of a feature in relation to docs?

I'm especially interested in how we relate docs and features to the "dev" documentation. I may mark a feature as having rich/wonderful/complete docs, but they may only be in the dev docs branch and thus invisible to most users.

@sptramer
Contributor

sptramer commented Nov 8, 2022

> Do we have a score/feedback system on each doc page? Something like "Is this article helpful?" or "On a scale of 1-5, what do you think of this article?"
>
> I'm thinking that pulling data like that and using it to generate the health matrix for docs would really reflect how customers think.
>
> Could be as simple as this: [image]

👍/👎 rankings are outside of the discussion of the feature grid at this time. That would require a separate RFC; engaging in this kind of user study requires cooperation from legal departments.

@sptramer
Contributor

sptramer commented Nov 8, 2022

@lmbr-pip full comment

Hi pip, I accidentally edited your comment instead of writing my own. A misclick led to me overwriting some of your comments, but I have left the ones I kept there and moved my edits below.


> I do agree with the sentiment, though I would like to understand how doc users find and make use of this information. Is there linkage to it from o3de.org? It's not really covered how folks looking for documentation are expected to work with this information. Is it just for potential docs contributors?

It would be part of the feature grid release notes (https://www.o3de.org/docs/release-notes/22-10-0/feature-state/). As for how they "make use" of this information: it's unclear. There are many ways this can go wrong with under- or over-evaluation (especially RED/GREEN), even if we were taking a data-driven approach to determining "quality" or "usefulness". We may need a more objective measurement system like NPP (negative sentiment evaluation).

> As for the proposal, my main problem is that the ratings still seem highly subjective and require a lot of cognitive effort from the SIG Chair(s) to set up.

Subjective evaluations would be performed by docs + the sig, in whatever way they choose. We understand that we will be consistently YELLOW or RED under this reporting rubric. It's about the "sufficiency" of documentation for the product: Can a user onboard? Can they understand it? Is there full reference available?

Engineering teams who take dedicated time to focus on docs would be able to establish a baseline of quality ("is it better / worse than last time? Did we add stuff? Are things still missing? Do users report issues?"). We may need to hold this RFC until there's a way to perform better evaluations.

> Secondly, did you consider having a state-of-docs section, i.e. "Missing - Not Planned", "Planned", "In Active Development", "Delivered", or something similar to convey the state of a feature in relation to docs?

This would be the intent of a roadmap, not a feature grid. Feature grids are supposed to be for snapshots in time.

> I'm especially interested in how we relate docs and features to the "dev" documentation. I may mark a feature as having rich/wonderful/complete docs, but they may only be in the dev docs branch and thus invisible to most users.

Do you report on the dev feature in the feature grid? If so, docs for the feature are also reported in the feature grid, on the same line item.

@chanmosq
Contributor

chanmosq commented Nov 8, 2022

At the sig-docs meeting on 11/08/2022, we unanimously and provisionally accepted this RFC, with the understanding that we need to establish a system of metrics for evaluating quality and address the other concerns commented on this issue.
