[DOCS]: small update to "scaling-up" (#2577)
universalmind303 authored Jul 30, 2024
1 parent 006babd commit ddabd34
Showing 8 changed files with 74 additions and 29 deletions.
2 changes: 1 addition & 1 deletion docs/source/conf.py
Original file line number Diff line number Diff line change
@@ -83,7 +83,7 @@
"learn/install": "../install.html",
"learn/user_guides/dataframes": "intro-dataframes.html",
"learn/user_guides/types_and_ops": "intro-dataframes.html",
-    "learn/user_guides/remote_cluster_execution": "scaling-up.html",
+    "learn/user_guides/remote_cluster_execution": "distributed-computing.html",
"learn/quickstart": "learn/10-min.html",
"learn/10-min": "../10-min.html",
}
2 changes: 1 addition & 1 deletion docs/source/migration_guides/coming_from_dask.rst
@@ -118,7 +118,7 @@ Dask supports the same data types as pandas. Daft is built to support many more
Distributed Computing and Remote Clusters
-----------------------------------------

-Both Dask and Daft support distributed computing on remote clusters. In Dask, you create a Dask cluster either locally or remotely and perform computations in parallel there. Currently, Daft supports distributed cluster computing :doc:`with Ray <../user_guide/poweruser/scaling-up>`. Support for running Daft computations on Dask clusters is on the roadmap.
+Both Dask and Daft support distributed computing on remote clusters. In Dask, you create a Dask cluster either locally or remotely and perform computations in parallel there. Currently, Daft supports distributed cluster computing :doc:`with Ray <../user_guide/poweruser/distributed-computing>`. Support for running Daft computations on Dask clusters is on the roadmap.

Cloud support for both Dask and Daft is the same.

2 changes: 1 addition & 1 deletion docs/source/user_guide/poweruser.rst
@@ -5,4 +5,4 @@ The Daft Poweruser

poweruser/memory
poweruser/partitioning
-   poweruser/scaling-up
+   poweruser/distributed-computing
67 changes: 67 additions & 0 deletions docs/source/user_guide/poweruser/distributed-computing.rst
@@ -0,0 +1,67 @@
Distributed Computing
=====================

By default, Daft runs using your local machine's resources, so your operations are limited by the CPUs, memory, and GPUs available on that single development machine.

However, Daft integrates tightly with `Ray <https://www.ray.io>`_, a framework for distributing computations across a cluster of machines. Here is a snippet showing how to connect Daft to a Ray cluster:

.. code:: python

    import daft

    daft.context.set_runner_ray()

By default, if no address is specified, Daft will spin up a Ray cluster locally on your machine. If you are running Daft on a powerful machine (such as an AWS P3 machine equipped with multiple GPUs), this is already very useful because Daft can parallelize its execution across all of your CPUs and GPUs. However, if you already have your own Ray cluster running remotely, you can connect Daft to it by supplying an address:

.. code:: python

    daft.context.set_runner_ray(address="ray://url-to-mycluster")

For more information about the ``address`` keyword argument, please see the `Ray documentation on initialization <https://docs.ray.io/en/latest/ray-core/api/doc/ray.init.html>`_.


If you want to start a single-node Ray cluster on your local machine, you can do the following:

.. code:: shell

    > pip install ray[default]
    > ray start --head --port=6379

This should output something like:

.. code:: shell

    Usage stats collection is enabled. To disable this, add `--disable-usage-stats` to the command that starts the cluster, or run the following command: `ray disable-usage-stats` before starting the cluster. See https://docs.ray.io/en/master/cluster/usage-stats.html for more details.
    Local node IP: 127.0.0.1
    --------------------
    Ray runtime started.
    --------------------
    ...

You can take the IP address and port and pass it to Daft:

.. code:: python

    >>> import daft
    >>> daft.context.set_runner_ray("127.0.0.1:6379")
    DaftContext(_daft_execution_config=<daft.daft.PyDaftExecutionConfig object at 0x100fbd1f0>, _daft_planning_config=<daft.daft.PyDaftPlanningConfig object at 0x100fbd270>, _runner_config=_RayRunnerConfig(address='127.0.0.1:6379', max_task_backlog=None), _disallow_set_runner=True, _runner=None)
    >>> df = daft.from_pydict({
    ...     'text': ['hello', 'world']
    ... })
    2024-07-29 15:49:26,610 INFO worker.py:1567 -- Connecting to existing Ray cluster at address: 127.0.0.1:6379...
    2024-07-29 15:49:26,622 INFO worker.py:1752 -- Connected to Ray cluster.
    >>> print(df)
    ╭───────╮
    │ text  │
    │ ---   │
    │ Utf8  │
    ╞═══════╡
    │ hello │
    ├╌╌╌╌╌╌╌┤
    │ world │
    ╰───────╯

    (Showing first 2 of 2 rows)
22 changes: 0 additions & 22 deletions docs/source/user_guide/poweruser/scaling-up.rst

This file was deleted.

@@ -67,7 +67,7 @@
"\n",
"*EDIT (June 2023): Our hosted version of the full dataset is temporarily unavailable. Please enjoy the demo with the sample dataset for now.*\n",
"\n",
-    "**Note:** This demo runs best on a cluster with many GPUs available. Information on how to connect Daft to a cluster is available [here](https://www.getdaft.io/projects/docs/en/stable/learn/user_guides/scaling-up.html). \n",
+    "**Note:** This demo runs best on a cluster with many GPUs available. Information on how to connect Daft to a cluster is available [here](https://www.getdaft.io/projects/docs/en/stable/learn/user_guides/poweruser/distributed-computing.html). \n",
"\n",
"If running on a single node, you can use the provided subsample of the data, which is 75MB in size. If you like, you can also truncate either dataset to a desired number of rows using `df.limit`."
]
@@ -468,7 +468,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-    "version": "3.9.6"
+    "version": "3.11.4"
}
},
"nbformat": 4,
2 changes: 1 addition & 1 deletion tutorials/intro.ipynb
@@ -156,7 +156,7 @@
"\n",
"![image.png](attachment:image.png)\n",
"\n",
-    "See ([Daft Documentation: Distributed Computing](https://www.getdaft.io/projects/docs/en/latest/learn/user_guides/scaling-up.html))"
+    "See ([Daft Documentation: Distributed Computing](https://www.getdaft.io/projects/docs/en/latest/user_guide/poweruser/distributed-computing.html))"
]
},
{
2 changes: 1 addition & 1 deletion tutorials/talks_and_demos/linkedin-03-05-2024.ipynb
@@ -221,7 +221,7 @@
"\n",
"![image.png](attachment:image.png)\n",
"\n",
-    "See ([Daft Documentation: Distributed Computing](https://www.getdaft.io/projects/docs/en/latest/learn/user_guides/scaling-up.html))"
+    "See ([Daft Documentation: Distributed Computing](https://www.getdaft.io/projects/docs/en/stable/learn/user_guides/poweruser/distributed-computing.html))"
]
},
{
