
Nvidia Flare: ML/DL models Limitations and Potentials #1726

Closed · Answered by ZiyueXu77
Datumologist asked this question in Q&A


@YuanTingHsieh gives a detailed answer; I'll add one more point on "compatibility":
XGBoost and random forest are two of the most widely used methods in the "decision trees / gradient boosting" family. As for Naive Bayes, it can be formulated in a distributed/federated way (and implemented with NVFlare) by collecting the statistics. So all of these can be implemented with NVFlare.
As YuanTing mentioned, the real question is whether, and how, these models can be made compatible with federated learning. There is no "compatibility" issue on the NVFlare side as long as we can formulate a method under a federated setting and have sufficient API support from the package we use (e.g., scikit-learn).
We will have a blo…
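To make the "collecting the stats" idea concrete, here is a minimal sketch in plain NumPy (this is an illustration of the federated formulation, not the NVFlare API): for Gaussian Naive Bayes, each client computes sufficient statistics locally (class counts, per-class feature sums, and sums of squares), and the server aggregates them. The aggregated model is mathematically identical to fitting on the pooled data, without any raw data leaving the clients.

```python
import numpy as np

def local_stats(X, y, n_classes):
    # Client-side: sufficient statistics for Gaussian Naive Bayes.
    # Only these aggregates (not raw samples) are sent to the server.
    counts = np.zeros(n_classes)
    sums = np.zeros((n_classes, X.shape[1]))
    sq_sums = np.zeros((n_classes, X.shape[1]))
    for c in range(n_classes):
        Xc = X[y == c]
        counts[c] = len(Xc)
        sums[c] = Xc.sum(axis=0)
        sq_sums[c] = (Xc ** 2).sum(axis=0)
    return counts, sums, sq_sums

def aggregate(stats_list):
    # Server-side: element-wise sum of each client's statistics,
    # then derive the global priors, means, and variances.
    counts = sum(s[0] for s in stats_list)
    sums = sum(s[1] for s in stats_list)
    sq_sums = sum(s[2] for s in stats_list)
    priors = counts / counts.sum()
    means = sums / counts[:, None]
    variances = sq_sums / counts[:, None] - means ** 2
    return priors, means, variances
```

The same pattern (local sufficient statistics, server-side reduction) is what makes these "collect the stats" methods federate cleanly; the limiting factor is whether the library exposes enough API surface to set the aggregated parameters back into a model.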

Answer selected by YuanTingHsieh