Extend existing doc plugin? #144
Hi @slorber Thank you for the feedback. I forgot to submit after releasing 1.0.0; I will do it this weekend ;-) I did think about having the build at runtime, but I faced some issues, and it was not important for our internal project.
Thanks for the PR. Sure, all this is just a suggestion :) Having CLI + runtime makes sense.
I restarted the work on plugin extension, without success, following the docs for plugin lifecycle and the following discussion on the Docusaurus repo:
Here is a simplified version of the implementation (full code here and in PR #991):

```js
import * as plugin from "@docusaurus/plugin-content-docs";

const pluginContentDocs = plugin.default ? plugin.default : plugin;

export default async function pluginGraphQLMarkdown(context, options) {
  const docsPluginInstance = await pluginContentDocs(context, options);

  const loadContent = async () => {
    // resolve configuration
    const config = await buildConfig(options);
    // generate MDX files from the schema
    await generateDocFromSchema(config);
    // hand over to @docusaurus/plugin-content-docs
    return await docsPluginInstance.loadContent();
  };

  return {
    ...docsPluginInstance,
    name: "docusaurus-graphql-doc-generator",
    loadContent,
  };
}
```

The MDX files are correctly generated by
This led me to assume that
Hmmm, I don't know. What are the files it generates, and what are their content and paths? Maybe try to create a repro by hardcoding these files locally instead of generating them, so that I can inspect it?
@slorber - Apologies for the late feedback. I managed to create a simple repo reproducing the issue: https://github.com/edno/graphql-markdown-docusaurus-plugin-debug |
@slorber - I think I found a possible root cause. If one wants to "extend"
A possible solution would be to make that path overridable. If this solution makes sense to you, then I will investigate how to implement it and push a PR to Docusaurus.
Hey, to be honest I have a hard time understanding your issue 🤪
Which data, what is the temp folder path exactly, and what does it mean to "break"?
Can you give me a concrete example, what is this temp folder path exactly for the docs plugin? VS the one you use for your inherited plugin? And how is it a problem? Where is the code that is supposed to create this behavior? Because I don't see what you mean here. Afaik we only create
Yes, and if you want to emit JSON bundle files in the same path, you can hardcode this as well in your inherited plugin, no?

```js
const pluginDataDirRoot = path.join(
  generatedFilesDir,
  'docusaurus-plugin-content-docs',
);
const dataDir = path.join(pluginDataDirRoot, pluginId);
```

How would that

All this is not super fresh in my mind, sorry 😅 Maybe we should first figure out what the blockers are exactly, on a super simple use case (like just generating one doc file), before trying to make this plugin work entirely.
@slorber - Maybe the repo https://github.com/edno/graphql-markdown-docusaurus-plugin-debug will be more explanatory than my previous comment. |
Closing the issue as there's no straightforward solution without changing Docusaurus plugin handling (lifecycle hooks).
Hi and thanks for working on this interesting plugin.
Don't hesitate to submit it to our community docs so users can more easily find it.
One suggestion I could make is, instead of generating the md docs with the CLI, maybe you could extend the docs plugin so that it creates the md files just before reading them?
You may find this interesting to try:
facebook/docusaurus#4138 (comment)
Not sure it would be more convenient than a cli though, but can be worth giving it a try
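The "generate just before reading" pattern can be sketched independently of Docusaurus. The names below are illustrative, and the stub stands in for a real docs plugin instance.

```javascript
// Minimal sketch of the suggested pattern: run a generation step, then
// delegate to the wrapped plugin's loadContent. Names are illustrative,
// and the stub below stands in for a real docs plugin instance.
function extendLoadContent(baseInstance, generate) {
  return {
    ...baseInstance,
    loadContent: async () => {
      await generate(); // e.g. write the md files into the docs folder
      return baseInstance.loadContent(); // then let the docs plugin read them
    },
  };
}

// Demo with stubs recording the call order:
const calls = [];
const base = {
  name: "docs-stub",
  loadContent: async () => {
    calls.push("load");
    return "docs-content";
  },
};
const extended = extendLoadContent(base, async () => calls.push("generate"));

const demo = extended.loadContent().then((result) => {
  console.log(calls, result); // generation runs first, then loading
  return result;
});
```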