It would be nice to follow a consistent naming convention for parameters and be as consistent as possible with sklearn. For instance:
* In supervised versions of weakly supervised algorithms, `num_constraints` should be renamed to `n_constraints`, and `num_chunks` to `n_chunks`
* In LMNN, the parameter `k` could be renamed to `n_neighbors`, like in sklearn's `KNeighborsClassifier`
* Both `tol` and `convergence_threshold` are currently used to refer to the optimization tolerance; we should always use `tol`, which is quite standard (cf. `scipy.optimize`)
* Rename `num_constraints` to `n_constraints`
* Rename `num_chunks` to `n_chunks`
* Rename LMNN's `k` parameter to `n_neighbors`
* Replace all `convergence_threshold` with `tol`
* Fix tests
* Fix more tests affected by the variable renames
* Add warnings for `n_constraints`
* Add all warnings regarding `n_constraints`
* Add deprecation warnings for `n_chunks`
* Add deprecation warning for `n_neighbors`
* Add `convergence_threshold` warnings
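The deprecation-warning commits above presumably follow the usual pattern: keep the old parameter in the signature, warn when it is set, and forward its value to the new name. A minimal sketch, assuming a simplified estimator (`SupervisedEstimator` is a hypothetical class, not the actual metric-learn API):

```python
import warnings

class SupervisedEstimator:
    """Sketch of the old-parameter shim: ``num_constraints`` is kept as a
    deprecated alias of ``n_constraints``."""

    def __init__(self, n_constraints=None, num_constraints='deprecated'):
        # The 'deprecated' sentinel distinguishes "not passed" from any
        # real value the user might supply (including None).
        if num_constraints != 'deprecated':
            warnings.warn(
                '"num_constraints" has been renamed to "n_constraints" '
                'and will be removed in a future release.',
                DeprecationWarning)
            n_constraints = num_constraints
        self.n_constraints = n_constraints
```

With this shim, `SupervisedEstimator(num_constraints=200)` still works but emits a `DeprecationWarning`, while `SupervisedEstimator(n_constraints=200)` is silent.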