Training gradient-boosted decision trees with LightGBM in a distributed Dask cluster
If you’d like to learn more about working with Dask before incorporating it into your workflow, we have a reference that can help.
Related examples:
- Use the XGBoost library to train a model in a distributed Dask cluster
- Run a scikit-learn pipeline with Dask parallelization
- Accelerate a scikit-learn grid search with Dask parallelization
- Train a Random Forest on a GPU using RAPIDS, with a working end-to-end script
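For the grid-search item above, a minimal sketch of the usual pattern is to route scikit-learn's joblib parallelism through Dask workers. This assumes scikit-learn, joblib, and `dask.distributed` are installed; the in-process client, model, and parameter grid are illustrative choices, not the linked article's exact code:

```python
import joblib
from dask.distributed import Client  # importing distributed registers the "dask" joblib backend
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# In-process Dask client for illustration; point this at a real scheduler
# (e.g. Client("tcp://<scheduler>:8786")) to fan the fits out across machines.
client = Client(processes=False)

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
search = GridSearchCV(
    LogisticRegression(max_iter=200),
    param_grid={"C": [0.1, 1.0, 10.0]},
    cv=3,
    n_jobs=-1,  # joblib decides how many candidate fits run concurrently
)

# Each cross-validation fit becomes a task executed on the Dask workers.
with joblib.parallel_backend("dask"):
    search.fit(X, y)

client.close()
```

Because only the backend changes, the same `GridSearchCV` object works unmodified on a laptop or against a multi-node cluster.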
Need help, or have more questions? Contact us:
- On Intercom, using the icon at the bottom right corner of the screen
We'll be happy to help you and answer your questions!