
Fix default -> auto prior parameter in documentation for lda-related models #2156

Merged (9 commits) on Aug 13, 2018

Conversation

Laubeee (Contributor) commented Aug 13, 2018

Change "default" to "auto" since there is no handling on the value "default".

Laubeee mentioned this pull request Aug 13, 2018
menshikh-iv (Contributor) left a comment:

Thank you @Laubeee, please make the same update for the others and I'll merge the current PR

@@ -369,7 +369,7 @@ def __init__(self, corpus=None, num_topics=100, id2word=None,
Alternatively default prior selecting strategies can be employed by supplying a string:

* 'asymmetric': Uses a fixed normalized asymmetric prior of `1.0 / topicno`.
-    * 'default': Learns an asymmetric prior from the corpus.
+    * 'auto': Learns an asymmetric prior from the corpus.
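For context, the `'asymmetric'` option described in the docstring above can be sketched in plain Python. This is a rough illustration only, assuming a prior where topic number `k` (1-indexed) gets weight proportional to `1.0 / topicno`, normalized to sum to 1; the helper name is hypothetical and gensim's actual formula may differ in detail:

```python
def asymmetric_prior(num_topics):
    # Hypothetical sketch of a fixed normalized asymmetric prior:
    # topic k (1-indexed) gets raw weight 1.0 / k, then all weights
    # are normalized so they sum to 1. Earlier topics get more mass.
    raw = [1.0 / k for k in range(1, num_topics + 1)]
    total = sum(raw)
    return [w / total for w in raw]

prior = asymmetric_prior(4)
print(prior)  # decreasing weights that sum to 1
```

By contrast, `'auto'` does not use a fixed formula at all: the prior is learned from the corpus during training, which is why the docstring fix in this PR matters.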
@menshikh-iv menshikh-iv changed the title update documentation to match actual code Fix default -> auto prior parameter in documentation for lda-related models Aug 13, 2018
Laubeee (Contributor, Author) commented Aug 13, 2018

Oh alright, I didn't catch those. Were these all the places?

@@ -61,7 +61,7 @@ def __init__(self, num_topics=100, id2word=None, chunksize=2000, passes=1, updat
Alternatively default prior selecting strategies can be employed by supplying a string:

* 'asymmetric': Uses a fixed normalized assymetric prior of `1.0 / topicno`.
-    * 'default': Learns an assymetric prior from the corpus.
+    * 'auto': Learns an assymetric prior from the corpus.
Owner left a review comment:

assymetric => asymmetric (here and elsewhere)

menshikh-iv (Contributor) commented:

Congratz with the first contribution @Laubeee 👍

@menshikh-iv menshikh-iv merged commit f9beeaa into piskvorky:develop Aug 13, 2018
@Laubeee Laubeee deleted the patch-1 branch August 13, 2018 15:08