MNT Removes unused optim_batch_size from UMAP's docstring (rapidsai#4732)

Closes rapidsai#4725

rapidsai#3848 removed the usage of `optim_batch_size` from the code. This PR removes the parameter from the docstring and from `UMAPParams`.

Authors:
  - Thomas J. Fan (https://github.com/thomasjpfan)

Approvers:
  - Dante Gama Dessavre (https://github.com/dantegd)

URL: rapidsai#4732
thomasjpfan authored May 13, 2022
1 parent e671fc8 commit 98fb4f4
Showing 1 changed file with 0 additions and 6 deletions.
6 changes: 0 additions & 6 deletions python/cuml/manifold/umap.pyx
@@ -100,7 +100,6 @@ cdef extern from "cuml/manifold/umapparams.h" namespace "ML":
     float target_weight,
     uint64_t random_state,
     bool deterministic,
-    int optim_batch_size,
     GraphBasedDimRedCallback * callback


@@ -270,11 +269,6 @@ class UMAP(Base,
         consistency of trained embeddings, allowing for reproducible results
         to 3 digits of precision, but will do so at the expense of potentially
         slower training and increased memory usage.
-    optim_batch_size: int (optional, default 100000 / n_components)
-        Used to maintain the consistency of embeddings for large datasets.
-        The optimization step will be processed with at most optim_batch_size
-        edges at once preventing inconsistencies. A lower batch size will yield
-        more consistently repeatable embeddings at the cost of speed.
     callback: An instance of GraphBasedDimRedCallback class
         Used to intercept the internal state of embeddings while they are being
         trained. Example of callback usage:
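For context on what the removed parameter described: the deleted docstring said the optimization step processed at most `optim_batch_size` edges at once so that embeddings stay repeatable. The following is a minimal, purely illustrative pure-Python sketch of that batching idea; `batched_edge_updates` is a hypothetical helper, not cuML's actual CUDA implementation, and the scalar "positions" stand in for real embedding coordinates.

```python
import numpy as np

def batched_edge_updates(edges, n_points, batch_size):
    """Illustrative sketch: apply per-edge updates in fixed-size batches.

    Processing at most `batch_size` edges per step, and applying each
    batch's accumulated updates together, fixes the accumulation order
    so repeated runs produce identical results.
    """
    positions = np.zeros(n_points)
    for start in range(0, len(edges), batch_size):
        batch = edges[start:start + batch_size]
        # Accumulate this batch's updates separately, then apply at once.
        updates = np.zeros(n_points)
        for src, dst, weight in batch:
            updates[src] += weight
            updates[dst] -= weight
        positions += updates
    return positions
```

A smaller `batch_size` constrains the update order more tightly (more repeatable) at the cost of less parallel work per step, which matches the trade-off the removed docstring described.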
