Annotate H5 clades through a node data JSON file instead of modifying metadata #25
Context

In conversation about #22, @trvrb noted:
> Following up from a question by Pauline Trinh about this issue, we would probably copy the pattern used in this script from the seasonal flu workflow to generate a node data JSON file with the clade labels.

@jameshadfield noted that:
> Yup, node-data JSONs are more powerful in that they allow annotating internal nodes and labelling branches. Despite being called […], I'd still recommend starting with a PR that keeps the current functionality but makes the Snakemake workflow simpler to reason about, and then adding internal nodes / branch labels in subsequent work, as desired.

Description

We should modify `scripts/add-clades.py` to create a node data JSON file as output and update the workflow to make the resulting output an input to the `export` rule instead of a step that modifies the metadata.
Comments

@jameshadfield can we hold off on this for now? We have Nextclade working, but the historic, all-clades assignments are not absolutely stellar, so I want to retain annotations with LABEL for now. Happy to expand on this more, but I would prefer to keep things as they are for now!
Sure thing. I'm not proposing / planning to do the work; I was just trying to caution others about changing our current functionality to add internal nodes / branch labels for LABEL annotations without touching base with you first. P.S. The original aim of this issue wouldn't change at all how you actually run the workflow. P.P.S. Please reach out if you want help incorporating the Nextclade outputs into the phylogenetic workflows!