
For a 0.11.0 release #500

Merged 14 commits into master on Apr 24, 2020

Conversation

ablaom
Member

@ablaom ablaom commented Apr 24, 2020

Make compatibility updates to MLJBase and MLJModels to effect the following changes to MLJ (see the linked release notes for links to the relevant issues/PRs):

  • (new model) Add LightGBM models LightGBMClassifier and
    LightGBMRegressor

  • (new model) Add new built-in model, ContinuousEncoder, for
    transforming all features of a table to Continuous scitype,
    dropping any features that cannot be so transformed

  • (new model) Add ParallelKMeans model, KMeans, loaded with
    @load KMeans pkg=ParallelKMeans

  • (mildly breaking enhancement) Arrange for the CV
    resampling strategy to spread fold "remainders" evenly among folds in
    train_test_pairs(::CV, ...) (a small change only noticeable on
    small datasets)

  • (breaking) Restyle report and fitted_params for exported
    learning networks (e.g., pipelines) to include a dictionary of reports or
    fitted_params, keyed on the machines in the underlying learning
    network. New doc-strings detail the new behaviour.

  • (enhancement) Allow calling of transform on machines with Static models without
    first calling fit!

  • Allow machine constructor to work on supervised models that take nothing for
    the input features X (for models that simply fit a
    sampler/distribution to the target data y) (Unsupervised learning interfaces - is transformer too narrow? #51)
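The model-loading and machine-related changes above can be sketched as follows. This is a minimal, illustrative sketch assuming MLJ 0.11.0 with the compatible MLJBase/MLJModels releases installed; the `Averager` type is a hypothetical Static transformer invented here for illustration, not part of MLJ:

```julia
using MLJ

# One of the newly registered models (loading incantation as given above):
@load KMeans pkg=ParallelKMeans

# ContinuousEncoder coerces all features of a table to Continuous
# scitype, dropping any features that cannot be so transformed:
X = (height = [1.85, 1.67, 1.50],
     grade  = coerce(["A", "B", "A"], OrderedFactor))
mach = fit!(machine(ContinuousEncoder(), X))
Xcont = transform(mach, X)  # all surviving columns are Continuous

# A machine wrapping a Static model can now be called with transform
# without first calling fit! (Averager is hypothetical):
mutable struct Averager <: Static end
MLJ.transform(::Averager, _, x1, x2) = 0.5 .* (x1 .+ x2)
transform(machine(Averager()), [1.0, 2.0], [3.0, 4.0])  # no fit! needed
```

The last line illustrates the enhancement flagged "(enhancement)" above; previously, `fit!` was required even though a Static model learns nothing from data.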

Also:

  • (documentation) In the "Adding New Models for General Use"
    section of the manual, add detail on how to wrap unsupervised
    models, as well as models that fit a sampler/distribution to data

  • (documentation) Expand the "Transformers" sections of the
    manual, including more material on static transformers and
    transformers that implement predict (Improve documentation around static transformers #393)

@ablaom
Member Author

ablaom commented Apr 24, 2020

Doc generation will fail until MLJModels 0.9.9 is merged; waiting on JuliaAI/MLJModels.jl#239.

@ablaom
Member Author

ablaom commented Apr 24, 2020

Mmm. Actually, doc generation only depends on the master branch having the update (now done; release pending).

@ablaom ablaom merged commit 3936fd2 into master Apr 24, 2020