Jules Damji, Anyscale, Ray Team
An introduction to Ray (https://www.ray.io/), a system for scaling your Python and machine learning applications from a laptop to a cluster. We'll start with a hands-on exploration of the core Ray API for distributed workloads, covering basic Ray Core patterns for scaling ML workloads:
* Remote Python functions as tasks
* Remote objects as futures
* Remote Python classes as stateful actors
* Multi-model training with Ray Core API patterns
If you are a data scientist, ML engineer, or Python developer, the key takeaways are:
* Understand what Ray 2.0 is and why to use it
* Learn about Ray Core Python APIs and convert Python functions and classes into distributed stateless and stateful tasks
* Use the Ray Dashboard to inspect workloads and observe metrics
* Learn how to use Ray Core for multi-model training
If you want to follow along in the workshop, please set up your laptop ahead of time using the
instructions at this link:
https://github.com/dmatrix/devAIWorld23