This repository has been archived by the owner on Jul 9, 2021. It is now read-only.

Les #23

Closed
wants to merge 5 commits
27 changes: 27 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,27 @@
name: CI

on: [push, pull_request]

jobs:
test:
name: Python ${{ matrix.python-version }}
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
python-version: ["3.6", "3.7", "3.8"]

steps:
- name: Checkout source
uses: actions/checkout@v2

- name: Build
run: make build
env:
PYTHON_VERSION: ${{ matrix.python-version }}

- name: Unit test
run: make unit-test

- name: System test
run: make system-test
22 changes: 0 additions & 22 deletions .travis.yml

This file was deleted.

15 changes: 10 additions & 5 deletions README.md
@@ -1,7 +1,12 @@
Dask-LightGBM
=============
Dask-LightGBM - DEPRECATED
==========================

[![Build Status](https://travis-ci.org/dask/dask-lightgbm.svg?branch=master)](https://travis-ci.org/dask/dask-lightgbm)
THIS REPOSITORY IS DEPRECATED
-----------------------------

This repository is deprecated and no longer maintained. The code was migrated into the LightGBM package: https://github.com/microsoft/LightGBM.

[![Build Status](https://github.com/dask/dask-lightgbm/workflows/CI/badge.svg)](https://github.com/dask/dask-lightgbm/actions?query=workflow%3ACI)

Distributed training with LightGBM and Dask.distributed

@@ -14,7 +19,7 @@
Load your data into a distributed data structure, which can be either a Dask.Array or a Dask.DataFrame.
Connect to a Dask cluster using Dask.distributed.Client.
Let dask-lightgbm train a model or make predictions for you.
See system tests for a sample code:
<https://github.com/dask/dask-lightgbm/blob/master/system_tests/test_fit_predict.py>
<https://github.com/dask/dask-lightgbm/blob/main/system_tests/test_fit_predict.py>

How this works
--------------
@@ -23,4 +28,4 @@
The library ensures that both the features and the label for each sample are located on the same worker.
It also lets each worker know the addresses and available ports of all other workers.
The distributed training is performed by LightGBM library itself using sockets.
See more details on distributed training in LightGBM here:
<https://github.com/microsoft/LightGBM/blob/master/docs/Parallel-Learning-Guide.rst>
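To make the worker-coordination step concrete: socket-based distributed training in LightGBM is driven by a handful of network parameters, which a wrapper like dask-lightgbm must assemble from the cluster's worker addresses. The parameter names below (`machines`, `local_listen_port`, `num_machines`, `tree_learner`) are real LightGBM parameters; the addresses are made-up placeholders, and the exact wiring inside dask-lightgbm may differ:

```python
# Illustrative only: the kind of network configuration a Dask wrapper
# builds so each LightGBM worker can find its peers over sockets.
workers = ["10.0.0.1:12400", "10.0.0.2:12400"]  # placeholder addresses

network_params = {
    "machines": ",".join(workers),   # all workers as comma-separated ip:port pairs
    "local_listen_port": 12400,      # port this worker listens on
    "num_machines": len(workers),    # total number of participating workers
    "tree_learner": "data",          # data-parallel distributed learning
}

print(network_params["machines"])
```

Each worker receives the same `machines` list plus its own listen port, which is what "lets each worker know the addresses and available ports of all other workers" amounts to in practice.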
<https://github.com/microsoft/LightGBM/blob/main/docs/Parallel-Learning-Guide.rst>